.. _tut-celery:
.. _first-steps:

========================
First steps with Celery
========================

.. contents::
    :local:

.. _celerytut-broker:

Choosing a Broker
=================

Celery requires a solution to send and receive messages; usually this
comes in the form of a separate service called a *message broker*.

There are several choices available, including:

* :ref:`broker-rabbitmq`

  `RabbitMQ`_ is feature-complete, stable, durable, and easy to install.

* :ref:`broker-redis`

  `Redis`_ is also feature-complete, but is more susceptible to data loss in
  the event of abrupt termination or power failures.

* :ref:`broker-sqlalchemy`
* :ref:`broker-django`

  Using a database as a message queue is not recommended, but can be sufficient
  for very small installations. Celery can use the SQLAlchemy and Django ORM.

* and more.

In addition to the above, there are several other transport implementations
to choose from, including :ref:`broker-couchdb`, :ref:`broker-beanstalk`,
:ref:`broker-mongodb`, and SQS. There is a `Transport Comparison`_
in the Kombu documentation.
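
Whichever broker you pick, you will point Celery at it with a URL when
creating the app (shown in the next section). The URLs below are only
examples, assuming default ports and credentials on localhost:

.. code-block:: python

    from celery import Celery

    # RabbitMQ (AMQP) running locally with the default guest account:
    celery = Celery("tasks", broker="amqp://guest:guest@localhost:5672")

    # The same app pointed at a local Redis server instead:
    # celery = Celery("tasks", broker="redis://localhost:6379/0")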

.. _`RabbitMQ`: http://www.rabbitmq.com/
.. _`Redis`: http://redis.io/
.. _`Transport Comparison`: http://kombu.rtfd.org/transport-comparison

.. _celerytut-conf:

Application
===========

The first thing you need is a Celery instance; this is called the Celery
application, or just *app* for short. Since this instance is used as
the entry-point for everything you want to do in Celery, like creating tasks
and managing workers, it must be possible for other modules to import it.

Some people create a dedicated module for it, but in this tutorial we will
keep everything in the same module.

Let's create the file :file:`tasks.py`:

.. code-block:: python

    from celery import Celery

    celery = Celery("tasks", broker="amqp://guest:guest@localhost:5672")

    @celery.task
    def add(x, y):
        return x + y

    if __name__ == "__main__":
        celery.start()

The first argument to :class:`~celery.app.Celery` is the name of the current
module; this is needed so that names can be automatically generated. The
second argument is the ``broker`` keyword argument, which specifies the URL of
the message broker we want to use.
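
For example, since the module here is named ``tasks``, tasks defined in it get
names like ``tasks.add``. You can check the generated name in a Python shell
started in the same directory::

    >>> from tasks import add
    >>> add.name
    'tasks.add'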

We defined a single task, called ``add``, which returns the sum of two numbers.

.. _celerytut-running-celeryd:

Running the celery worker server
================================

We can now run the worker by executing our program with the ``worker``
argument::

    $ python tasks.py worker --loglevel=INFO

In production you will probably want to run the worker in the
background as a daemon. To do this you need to use the tools provided
by your platform, or something like `supervisord`_ (see :ref:`daemonizing`
for more information).

For a complete listing of the command line options available, do::

    $ python tasks.py worker --help

There are also several other commands available, and similarly you can get a
list of these::

    $ python tasks.py --help

.. _`supervisord`: http://supervisord.org

.. _celerytut-executing-task:

Executing the task
==================

Whenever we want to execute our task, we use the
:meth:`~@Task.delay` method of the task.

This is a handy shortcut to the :meth:`~@Task.apply_async`
method, which gives greater control over the task execution (see
:ref:`guide-executing`)::

    >>> from tasks import add
    >>> add.delay(4, 4)

The task should now be executed by the worker you started earlier,
and you can verify that by looking at the worker's console output.
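
If you want to pass execution options along with the call,
:meth:`~@Task.apply_async` accepts them directly. For example, ``countdown``
delays execution by the given number of seconds::

    >>> add.apply_async((4, 4), countdown=10)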

Applying a task returns an :class:`~@AsyncResult` instance,
which can be used to check the state of the task, wait for the task to finish
or get its return value (or if the task failed, the exception and traceback).
But this isn't enabled by default, and you have to configure Celery to
use a result backend, which is detailed in the next section.

.. _celerytut-keeping-results:

Keeping Results
---------------

If you want to keep track of the task's state, Celery needs to store or send
the states somewhere. There are several
built-in backends to choose from: SQLAlchemy/Django ORM, Memcached, Redis,
AMQP, MongoDB, and Tokyo Tyrant -- or you can define your own.

For this example we will use the `amqp` result backend, which sends states
as messages. The backend is configured via the :setting:`CELERY_RESULT_BACKEND`
setting or the ``backend`` argument to :class:`Celery`. In addition, individual
result backends may have additional settings
you can configure::

    from celery.backends.amqp import AMQPBackend

    celery = Celery(backend=AMQPBackend(expires=300))
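
If you don't need backend-specific options such as ``expires``, the backend
can also be given by name. A minimal sketch, reusing the broker URL from our
``tasks.py``::

    celery = Celery("tasks",
                    broker="amqp://guest:guest@localhost:5672",
                    backend="amqp")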

To read more about result backends please see :ref:`task-result-backends`.

Now, with the result backend configured, let's execute the task again.
This time we'll hold on to the :class:`~@AsyncResult`::

    >>> result = add.delay(4, 4)

Here are some examples of what you can do when you have results::

    >>> result.ready()      # returns True if the task has finished processing.
    False

    >>> result.result       # task is not ready, so no return value yet.
    None

    >>> result.get()        # waits until the task is done and returns the retval.
    8

    >>> result.result       # direct access to result, doesn't re-raise errors.
    8

    >>> result.successful() # returns True if the task didn't end in failure.
    True

If the task raises an exception, the return value of :meth:`~@AsyncResult.successful`
will be :const:`False`, and ``result.result`` will contain the exception instance
raised by the task.
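
As an illustration only (``failing`` is a hypothetical task, not part of the
tutorial's ``tasks.py``), this is how a failing task behaves:

.. code-block:: python

    @celery.task
    def failing():
        raise ValueError("something went wrong")

    # Elsewhere, with a worker running:
    result = failing.delay()
    result.get(propagate=False)   # waits, then returns the exception instead of re-raising it.
    result.successful()           # False: the task ended in failure.
    result.failed()               # True
    result.result                 # the ValueError instance raised by the task.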

.. _celerytut-configuration:

Configuration
-------------

Celery, like a consumer appliance, doesn't need much to be operated.
It has an input and an output: you connect the input to a broker and, if wanted,
the output to a result backend. But if you look closely at the back
there is a lid revealing lots of sliders, dials, and buttons: this is the configuration.

The default configuration should be good enough for most uses, but there
are many things to tweak so that Celery works just the way you want it to.
Reading about the options available is a good idea to get familiar with what
can be configured; see the :ref:`configuration` reference.

The configuration can be set on the app directly (though not all settings can
be changed at runtime), or by using a dedicated configuration module.

As an example, you can set the default value for the worker's
``--concurrency`` argument, which is used to decide the number of pool worker
processes, by changing the :setting:`CELERYD_CONCURRENCY` setting:

.. code-block:: python

    celery.conf.CELERYD_CONCURRENCY = 10
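
The app's ``conf`` mapping also supports ``update`` for applying several
settings at once; the serializer value below is only an example:

.. code-block:: python

    celery.conf.update(
        CELERYD_CONCURRENCY=10,
        CELERY_TASK_SERIALIZER="json",
    )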

If you are configuring many settings, then one practice is to have a separate
module containing the configuration. You can tell your Celery instance to use
this module, historically called ``celeryconfig.py``, with the
:meth:`config_from_object` method:

.. code-block:: python

    celery.config_from_object("celeryconfig")

A module named ``celeryconfig.py`` must then be available to load from the
current directory or on the Python path. It could look like this:

:file:`celeryconfig.py`::

    CELERYD_CONCURRENCY = 10
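
A configuration module typically holds more than one setting. As a sketch (the
broker and backend values are only examples and should match your own setup)::

    BROKER_URL = "amqp://guest:guest@localhost:5672"
    CELERY_RESULT_BACKEND = "amqp"
    CELERYD_CONCURRENCY = 10
    CELERY_TASK_SERIALIZER = "json"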

To verify that your configuration module works properly, and doesn't
contain any syntax errors, you can try to import it::

    $ python -m celeryconfig

For a complete reference of configuration options, see :ref:`configuration`.

Where to go from here
=====================

After this you should read the :ref:`guide`. Specifically
:ref:`guide-tasks` and :ref:`guide-executing`.