.. _tut-celery:

========================
First steps with Celery
========================

.. contents::
    :local:

.. _celerytut-broker:

Choosing a Broker
=================

Celery requires a solution to send and receive messages; usually this
comes in the form of a separate service called a *message broker*.

There are several choices available, including:

* :ref:`broker-rabbitmq`

    `RabbitMQ`_ is feature-complete, stable, durable and easy to install.

* :ref:`broker-redis`

    `Redis`_ is also feature-complete, but is more susceptible to data loss in
    the event of abrupt termination or power failures.

* :ref:`broker-sqlalchemy`
* :ref:`broker-django`

    Using a database as a message queue is not recommended, but can be sufficient
    for very small installations.  Celery can use the SQLAlchemy and Django ORM.

* and more.

In addition to the above, there are several other transport implementations
to choose from, including :ref:`broker-couchdb`, :ref:`broker-beanstalk`,
:ref:`broker-mongodb`, and SQS.  There is a `Transport Comparison`_
in the Kombu documentation.

.. _`RabbitMQ`: http://www.rabbitmq.com/
.. _`Redis`: http://redis.io/
.. _`Transport Comparison`: http://kombu.rtfd.org/transport-comparison

.. _celerytut-conf:

Application
===========

The first thing you need is a Celery instance; this is called the Celery
application, or just "app" for short.  Since this instance is used as
the entry-point for everything you want to do in Celery, like creating tasks and
managing workers, it must be possible for other modules to import it.

Some people create a dedicated module for it, but in this tutorial we will
keep everything in the same module.

Let's create the file :file:`tasks.py`:

.. code-block:: python

    from celery import Celery

    celery = Celery("tasks", broker="amqp://guest:guest@localhost:5672")

    @celery.task
    def add(x, y):
        return x + y

    if __name__ == "__main__":
        celery.start()

The first argument to :class:`Celery` is the name of the current module;
this is needed so that names can be automatically generated.  The second
argument is the ``broker`` keyword argument, which specifies the URL of the
message broker we want to use.
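
The broker URL follows the usual ``transport://user:password@host:port/vhost``
pattern, so a different broker is selected simply by changing the URL.  As a
small sketch (the hostnames and database number below are only examples)::

    # RabbitMQ on the default port and virtual host:
    celery = Celery("tasks", broker="amqp://guest:guest@localhost:5672")

    # Redis, using database number 0:
    celery = Celery("tasks", broker="redis://localhost:6379/0")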

We defined a single task, called ``add``, which returns the sum of two numbers.

.. _celerytut-running-celeryd:

Running the celery worker server
================================

We can now run the worker by executing our program with the ``worker``
argument::

    $ python tasks.py worker --loglevel=INFO

In production you will probably want to run the worker in the
background as a daemon.  To do this you need to use the tools provided
by your platform, or something like `supervisord`_ (see :ref:`daemonizing`
for more information).
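
As a rough sketch of what a `supervisord`_ program entry could look like (the
paths below are placeholders; :ref:`daemonizing` covers the details):

.. code-block:: ini

    [program:celery-worker]
    ; Run the worker defined in tasks.py and restart it if it dies.
    command=python /path/to/tasks.py worker --loglevel=INFO
    directory=/path/to
    autostart=true
    autorestart=true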

For a complete listing of the command line options available, do::

    $ python tasks.py worker --help

There are also several other commands available, and similarly you can get a list
of these::

    $ python tasks.py --help

.. _`supervisord`: http://supervisord.org

.. _celerytut-executing-task:

Executing the task
==================

Whenever we want to execute our task, we use the
:meth:`~celery.task.base.Task.delay` method of the task class.

This is a handy shortcut to the :meth:`~celery.task.base.Task.apply_async`
method, which gives greater control of the task execution (see
:ref:`guide-executing`).

    >>> from tasks import add
    >>> add.delay(4, 4)

The task should now be executed by the worker you started earlier,
and you can verify that by looking at the worker's console output.
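
If you need more control over how the task is executed, the same call can be
made with ``apply_async`` instead.  As a small sketch (the ``countdown`` option
delays execution; see :ref:`guide-executing` for the full set of options):

    >>> add.apply_async((4, 4), countdown=10)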

Applying a task returns an :class:`~celery.result.AsyncResult` instance,
which can be used to check the state of the task, wait for the task to finish,
or get its return value (or if the task failed, the exception and traceback).

But results aren't enabled by default.  To enable them you have to configure
Celery to use a result backend, which is detailed in the next section.

.. _celerytut-keeping-results:

Keeping Results
---------------

If you want to keep track of the tasks' states, Celery needs to store or send
the states somewhere.  There are several
built-in backends to choose from: SQLAlchemy/Django ORM, Memcached, Redis,
AMQP, MongoDB and Tokyo Tyrant -- or you can define your own.

For this example we will use the `amqp` result backend, which sends states
as messages.  The backend is configured via the :setting:`CELERY_RESULT_BACKEND`
setting or using the ``backend`` argument to :class:`Celery`.  In addition,
individual result backends may have additional settings
you can configure::

    from celery.backends.amqp import AMQPBackend

    celery = Celery(backend=AMQPBackend(expires=300))
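
If you don't need backend-specific options like ``expires``, the backend can
also be selected by name when creating the app.  A minimal sketch, reusing the
``tasks.py`` example from earlier:

.. code-block:: python

    from celery import Celery

    # "amqp" selects the AMQP result backend with its default settings.
    celery = Celery("tasks",
                    broker="amqp://guest:guest@localhost:5672",
                    backend="amqp")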

To read more about result backends please see :ref:`task-result-backends`.

Now with the result backend configured, let's execute the task again.
This time we'll hold on to the :class:`~celery.result.AsyncResult`::

    >>> result = add.delay(4, 4)

Here are some examples of what you can do when you have results::

    >>> result.ready() # returns True if the task has finished processing.
    False

    >>> result.result # task is not ready, so no return value yet.
    None

    >>> result.get()   # Waits until the task is done and returns the retval.
    8

    >>> result.result # direct access to result, doesn't re-raise errors.
    8

    >>> result.successful() # returns True if the task didn't end in failure.
    True

If the task raises an exception, the return value of `result.successful()`
will be :const:`False`, and `result.result` will contain the exception instance
raised by the task.
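
For example, a failing call (hypothetical here -- passing a string makes
``add`` raise a :exc:`TypeError` inside the worker) could be inspected like
this once the worker has processed it::

    >>> result = add.delay(4, "oops")   # the task will fail with a TypeError
    >>> result.successful()
    False
    >>> exc = result.result             # the exception instance raised by the task
    >>> tb = result.traceback           # the traceback, as a string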

.. _celerytut-configuration:

Configuration
-------------

Celery is very flexible and comes with many configuration options that
can be set on your app directly, or by using dedicated configuration files.

For example you can set the default value for the worker's
``--concurrency`` argument, which is used to decide the number of pool worker
processes.  The name of this setting is :setting:`CELERYD_CONCURRENCY`:

.. code-block:: python

    celery.conf.CELERYD_CONCURRENCY = 10

If you are configuring many settings then one practice is to have a separate module
containing the configuration.  You can tell your Celery instance to use
this module, historically called ``celeryconfig.py``, with the
:meth:`config_from_object` method:

.. code-block:: python

    celery.config_from_object("celeryconfig")
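
A minimal ``celeryconfig.py`` could look something like the sketch below; the
values are only illustrative, and the full list of settings is described in
:ref:`configuration`:

.. code-block:: python

    ## celeryconfig.py
    # Plain module-level names; Celery reads each of them as a setting.
    BROKER_URL = "amqp://guest:guest@localhost:5672"
    CELERY_RESULT_BACKEND = "amqp"
    CELERYD_CONCURRENCY = 10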

For a complete reference of configuration options, see :ref:`configuration`.

Where to go from here
=====================

After this you should read the :ref:`guide`.  Specifically
:ref:`guide-tasks` and :ref:`guide-executing`.