========================
First steps with Celery
========================

.. contents::
    :local:
Creating a simple task
======================

In this example we are creating a simple task that adds two
numbers. Tasks are defined in a normal Python module. The module can
be named whatever you like, but the convention is to call it
``tasks.py``.

Our addition task looks like this:

``tasks.py``:

.. code-block:: python

    from celery.decorators import task

    @task
    def add(x, y):
        return x + y

All Celery tasks are classes that inherit from the ``Task``
class. In this case we're using a decorator that wraps the ``add``
function in an appropriate class for us automatically. The full
documentation on how to create tasks and task classes is in the
:doc:`../userguide/tasks` part of the user guide.
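The wrapping that the decorator performs can be sketched in plain Python. This is a simplified stand-in to show the idea only, not Celery's actual implementation; the names ``SimpleTask`` and the local ``task`` function here are illustrative:

```python
class SimpleTask:
    """A minimal stand-in for Celery's ``Task`` base class (illustrative only)."""

    def run(self, *args, **kwargs):
        raise NotImplementedError


def task(fn):
    """Wrap a plain function in a SimpleTask subclass and return an
    instance of it, roughly what the real decorator does for us."""
    class FunctionTask(SimpleTask):
        def run(self, *args, **kwargs):
            return fn(*args, **kwargs)

    FunctionTask.__name__ = fn.__name__
    return FunctionTask()


@task
def add(x, y):
    return x + y


print(add.run(4, 4))                # 8
print(isinstance(add, SimpleTask))  # True
```

After decoration, ``add`` is no longer a plain function but a task instance, which is why it can carry extra methods such as ``delay`` in the real library.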
Configuration
=============

Celery is configured by using a configuration module. By default
this module is called ``celeryconfig.py``.

:Note: This configuration module must be on the Python path so it
       can be imported.

You can set a custom name for the configuration module with the
``CELERY_CONFIG_MODULE`` environment variable, but in these examples we
use the default name.

Let's create our ``celeryconfig.py``.
1. Configure how we communicate with the broker::

       BROKER_HOST = "localhost"
       BROKER_PORT = 5672
       BROKER_USER = "myuser"
       BROKER_PASSWORD = "mypassword"
       BROKER_VHOST = "myvhost"
2. In this example we don't want to store the results of the tasks, so
   we'll use the simplest backend available: the AMQP backend::

       CELERY_RESULT_BACKEND = "amqp"

   The AMQP backend is non-persistent by default, and you can only
   fetch the result of a task once (as it's sent as a message).
3. Finally, we list the modules to import, that is, all the modules
   that contain tasks. This is so Celery knows what tasks it can
   be asked to perform.

   We only have a single task module, ``tasks.py``, which we added earlier::

       import os
       import sys
       sys.path.insert(0, os.getcwd())

       CELERY_IMPORTS = ("tasks", )

That's it.

There are more options available, like how many worker processes you want
running in parallel (the ``CELERY_CONCURRENCY`` setting), and we
could use a persistent result store backend, but for now, this should
do. For all of the options available, see the
:doc:`configuration directive reference<../configuration>`.
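For example, the concurrency setting mentioned above goes into ``celeryconfig.py`` like any other setting (the value ``4`` here is just an illustrative choice, not a recommendation):

```python
# celeryconfig.py
# Number of worker processes handling tasks in parallel.
CELERY_CONCURRENCY = 4
```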
Running the celery worker server
================================

To test this we will run the worker server in the foreground, so we can
see what's going on in the terminal::

    $ celeryd --loglevel=INFO

However, in production you probably want to run the worker in the
background as a daemon. To do this you need to use the tools provided
by your platform, or something like `supervisord`_.

For a complete listing of the command-line options available, use the
help command::

    $ celeryd --help

For info on how to run Celery as a standalone daemon, see the
:doc:`daemon mode reference<../cookbook/daemonizing>`.

.. _`supervisord`: http://supervisord.org
Executing the task
==================

Whenever we want to execute our task, we can use the
:meth:`~celery.task.base.Task.delay` method of the task class.

This is a handy shortcut to the :meth:`~celery.task.base.Task.apply_async`
method, which gives greater control of the task execution. Read the
:doc:`Executing Tasks<../userguide/executing>` part of the user guide
for more information about executing tasks.

    >>> from tasks import add
    >>> add.delay(4, 4)
    <AsyncResult: 889143a6-39a2-4e52-837b-d80d33efb22d>

At this point, the task has been sent to the message broker. The message
broker will hold on to the task until a worker server has successfully
picked it up.

*Note:* If everything is just hanging when you execute ``delay``, please check
that RabbitMQ is running, and that the user/password has access to the virtual
host you configured earlier.
Right now we have to check the worker log files to know what happened
with the task. This is because we didn't keep the
:class:`~celery.result.AsyncResult` object returned by
:meth:`~celery.task.base.Task.delay`.

The :class:`~celery.result.AsyncResult` lets us find the state of the task,
wait for the task to finish, get its return value (or the exception if the
task failed), and more.

So, let's execute the task again, but this time we'll keep track of the task
by keeping the :class:`~celery.result.AsyncResult`::

    >>> result = add.delay(4, 4)
    >>> result.ready()      # returns True if the task has finished processing.
    False
    >>> result.result       # task is not ready, so no return value yet.
    None
    >>> result.get()        # waits until the task is done and returns the retval.
    8
    >>> result.result       # direct access to result, doesn't re-raise errors.
    8
    >>> result.successful() # returns True if the task didn't end in failure.
    True

If the task raises an exception, the return value of ``result.successful()``
will be ``False``, and ``result.result`` will contain the exception instance
raised by the task.
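To make that behaviour concrete, here is a tiny pure-Python stand-in that mimics the semantics just described. ``FakeAsyncResult`` is illustrative only and not part of Celery; it exists purely to show the success/failure contract of ``successful()`` and ``result``:

```python
class FakeAsyncResult:
    """Illustrative stand-in mimicking the documented AsyncResult semantics."""

    def __init__(self, value=None, error=None):
        self._value = value
        self._error = error

    def successful(self):
        # True only if the task did not raise an exception.
        return self._error is None

    @property
    def result(self):
        # Direct access: the return value, or the exception instance
        # if the task failed. Unlike get(), this never re-raises.
        return self._error if self._error is not None else self._value


ok = FakeAsyncResult(value=8)
failed = FakeAsyncResult(error=ZeroDivisionError("division by zero"))

print(ok.successful(), ok.result)              # True 8
print(failed.successful(), repr(failed.result))  # False ZeroDivisionError(...)
```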
That's all for now! After this you should probably read the
:doc:`User Guide<../userguide/index>`.