.. _tut-celery:

========================
First steps with Celery
========================

.. contents::
    :local:

.. _celerytut-simple-tasks:

Creating a simple task
======================

In this tutorial we are creating a simple task that adds two
numbers. Tasks are defined in normal Python modules.

By convention we will call our module :file:`tasks.py`, and it looks
like this:

:file:`tasks.py`:

.. code-block:: python

    from celery.task import task

    @task
    def add(x, y):
        return x + y

Behind the scenes the `@task` decorator actually creates a class that
inherits from :class:`~celery.task.base.Task`. The best practice is to
only create custom task classes when you want to change generic behavior,
and use the decorator to define tasks.
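To get a feel for what the decorator does, here is a rough, pure-Python sketch of the pattern: a decorator that builds a ``Task`` subclass whose ``run()`` method is the decorated function. The ``Task`` class and ``task()`` decorator below are simplified stand-ins for illustration only, not Celery's actual implementation.

```python
class Task:
    """Simplified stand-in for celery.task.base.Task."""
    def run(self, *args, **kwargs):
        raise NotImplementedError


def task(func):
    # Build a Task subclass whose run() is the decorated function,
    # and return a single instance of it.
    attrs = {"run": staticmethod(func), "name": func.__name__}
    cls = type(func.__name__, (Task,), attrs)
    return cls()


@task
def add(x, y):
    return x + y


print(add.run(4, 4))          # the task body can still be called directly
print(isinstance(add, Task))  # the decorator produced a Task instance
```

This is why you rarely need to subclass ``Task`` yourself: the decorator already produces one for you.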
.. seealso::

    The full documentation on how to create tasks and task classes is in the
    :doc:`../userguide/tasks` part of the user guide.
.. _celerytut-conf:

Configuration
=============

Celery is configured by using a configuration module. By default
this module is called :file:`celeryconfig.py`.

The configuration module must either be in the current directory
or on the Python path, so that it can be imported.

You can also set a custom name for the configuration module by using
the :envvar:`CELERY_CONFIG_MODULE` environment variable.

Let's create our :file:`celeryconfig.py`.

1. Configure how we communicate with the broker (RabbitMQ in this example)::

        BROKER_HOST = "localhost"
        BROKER_PORT = 5672
        BROKER_USER = "myuser"
        BROKER_PASSWORD = "mypassword"
        BROKER_VHOST = "myvhost"

2. Define the backend used to store task metadata and return values::

        CELERY_RESULT_BACKEND = "amqp"

   The AMQP backend is non-persistent by default, and you can only
   fetch the result of a task once (as it's sent as a message).

   For a list of backends available and related options see
   :ref:`conf-result-backend`.

3. Finally we list the modules the worker should import. This includes
   the modules containing your tasks.

   We only have a single task module, :file:`tasks.py`, which we added earlier::

        CELERY_IMPORTS = ("tasks", )

That's it.

There are more options available, like how many processes you want to
use to process work in parallel (the :setting:`CELERY_CONCURRENCY` setting),
and we could use a persistent result store backend, but for now, this should
do. For all of the options available, see :ref:`configuration`.
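Putting the three steps above together, the complete :file:`celeryconfig.py` for this tutorial would look like this (the broker credentials are the example values from step 1 -- substitute your own):

```python
# celeryconfig.py -- the three configuration steps above, combined.

# 1. Broker connection (RabbitMQ example values; replace with your own).
BROKER_HOST = "localhost"
BROKER_PORT = 5672
BROKER_USER = "myuser"
BROKER_PASSWORD = "mypassword"
BROKER_VHOST = "myvhost"

# 2. Store task states and return values as AMQP messages.
CELERY_RESULT_BACKEND = "amqp"

# 3. Modules the worker imports at startup (contains our tasks).
CELERY_IMPORTS = ("tasks", )
```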
.. note::

    You can also specify modules to import using the :option:`-I` option to
    :mod:`~celery.bin.celeryd`::

        $ celeryd -l info -I tasks,handlers

    This can be a single, or a comma separated list of task modules to import
    when :program:`celeryd` starts.
.. _celerytut-running-celeryd:

Running the celery worker server
================================

To test we will run the worker server in the foreground, so we can
see what's going on in the terminal::

    $ celeryd --loglevel=INFO

In production you will probably want to run the worker in the
background as a daemon. To do this you need to use the tools provided
by your platform, or something like `supervisord`_ (see :ref:`daemonizing`
for more information).

For a complete listing of the command line options available, do::

    $ celeryd --help

.. _`supervisord`: http://supervisord.org
.. _celerytut-executing-task:

Executing the task
==================

Whenever we want to execute our task, we use the
:meth:`~celery.task.base.Task.delay` method of the task class.

This is a handy shortcut to the :meth:`~celery.task.base.Task.apply_async`
method which gives greater control of the task execution (see
:ref:`guide-executing`).

.. code-block:: python

    >>> from tasks import add
    >>> add.delay(4, 4)
    <AsyncResult: 889143a6-39a2-4e52-837b-d80d33efb22d>

At this point, the task has been sent to the message broker. The message
broker will hold on to the task until a worker server has consumed and
executed it.

Right now we have to check the worker log files to know what happened
with the task. Applying a task returns an
:class:`~celery.result.AsyncResult` instance. If you have configured a result store,
the :class:`~celery.result.AsyncResult` enables you to check the state of
the task, wait for the task to finish, get its return value
(or the exception and traceback if the task failed), and more.
Keeping Results
---------------

If you want to keep track of the tasks' states, Celery needs to store or send
the states somewhere. There are several
built-in result backends to choose from: SQLAlchemy/Django ORM, Memcached,
Redis, AMQP, MongoDB and Tokyo Tyrant -- or you can define your own.

For this example we will use the `amqp` result backend, which sends states
as messages. The backend is configured via the ``CELERY_RESULT_BACKEND``
option. In addition, individual result backends may have further settings
you can configure::

    CELERY_RESULT_BACKEND = "amqp"

    #: We want the results to expire in 5 minutes, note that this requires
    #: RabbitMQ version 2.1.1 or higher, so please comment out if you have
    #: an earlier version.
    CELERY_AMQP_TASK_RESULT_EXPIRES = 300

To read more about result backends please see :ref:`task-result-backends`.
Now with the result backend configured, let's execute the task again.
This time we'll hold on to the :class:`~celery.result.AsyncResult`::

    >>> result = add.delay(4, 4)

Here are some examples of what you can do when you have results::

    >>> result.ready() # returns True if the task has finished processing.
    False

    >>> result.result # task is not ready, so no return value yet.
    None

    >>> result.get()   # Waits until the task is done and returns the retval.
    8

    >>> result.result # direct access to result, doesn't re-raise errors.
    8

    >>> result.successful() # returns True if the task didn't end in failure.
    True

If the task raises an exception, the return value of `result.successful()`
will be :const:`False`, and `result.result` will contain the exception instance
raised by the task.
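The success/failure semantics described above can be sketched in plain Python with a hypothetical stand-in for :class:`~celery.result.AsyncResult`. The ``FakeAsyncResult`` class below runs the task body eagerly and is purely illustrative -- it is not Celery's implementation, which delivers results through the configured backend.

```python
class FakeAsyncResult:
    """Illustrative stand-in: on success, .result holds the return
    value; on failure, .result holds the raised exception."""

    def __init__(self, func, *args):
        try:
            self._result = func(*args)
            self._failed = False
        except Exception as exc:
            self._result = exc
            self._failed = True

    def successful(self):
        return not self._failed

    @property
    def result(self):
        return self._result


def failing_task(x, y):
    raise ValueError("boom")


ok = FakeAsyncResult(lambda x, y: x + y, 4, 4)
bad = FakeAsyncResult(failing_task, 4, 4)

print(ok.successful(), ok.result)                    # True 8
print(bad.successful(), type(bad.result).__name__)   # False ValueError
```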
Where to go from here
=====================

After this you should read the :ref:`guide`. Specifically
:ref:`guide-tasks` and :ref:`guide-executing`.