=================================
celery - Distributed Task Queue
=================================

:Version: 1.0.0-pre1
:Keywords: task queue, job queue, asynchronous, rabbitmq, amqp, redis

Introduction
============

Celery is a task queue/job queue based on distributed message passing.
It is focused on real-time operation, but supports scheduling as well.

The execution units, called tasks, are executed concurrently on one or
more worker servers, either asynchronously (in the background) or
synchronously (wait until ready).

Celery is already used in production to process millions of tasks a day.
It was first created for Django, but can now be used from pure Python as
well. It can also operate with other languages via HTTP+JSON.

This introduction is written for someone who wants to use Celery from
within a Django project. For information about using it from pure Python
see `Can I use Celery without Django?`_; for calling out to other
languages see `Executing tasks on a remote web server`_.

.. _`Can I use Celery without Django?`: http://bit.ly/WPa6n
.. _`Executing tasks on a remote web server`: http://bit.ly/CgXSc

Overview
========

This is a high level overview of the architecture.

.. image:: http://cloud.github.com/downloads/ask/celery/Celery-Overview-v4.jpg

The broker pushes tasks to the worker servers. A worker server is a
networked machine running ``celeryd``. This can be one or more machines,
depending on the workload.

The result of the task can be stored for later retrieval (called its
"tombstone").

Features
========

* Uses messaging (AMQP: RabbitMQ, ZeroMQ, Qpid) to route tasks to the
  worker servers. Experimental support for STOMP (ActiveMQ) is also
  available. For simple setups it's also possible to use Redis or an
  SQL database as the message queue.

* You can run as many worker servers as you want, and still
  be *guaranteed that the task is only executed once.*

* Tasks are executed *concurrently* using the Python 2.6
  ``multiprocessing`` module (also available as a back-port
  to older Python versions).

* Supports *periodic tasks*, which makes it a (better) replacement
  for cronjobs.

* When a task has been executed, the return value can be stored using
  either a MySQL/Oracle/PostgreSQL/SQLite database, Memcached,
  `MongoDB`_, `Redis`_ or `Tokyo Tyrant`_ back-end. For high performance
  you can also use AMQP messages to publish results.

* Supports calling tasks over HTTP to support multiple programming
  languages and systems.

* Supports several serialization schemes, like pickle, json and yaml,
  and supports registering custom encodings.

* If the task raises an exception, the exception instance is stored
  instead of the return value, and it's possible to inspect the
  traceback after the fact.

* Every task has a Universally Unique Identifier (UUID), which is the
  task id used for querying task status and return values.

* Tasks can be retried if they fail, with a configurable maximum number
  of retries.

* Tasks can be configured to run at a specific time and date in the
  future (ETA), or you can set a countdown in seconds for when the
  task should be executed.

* Supports *task-sets*, a task consisting of several sub-tasks.
  You can find out how many, or if all, of the sub-tasks have been
  executed. Excellent for progress-bar like functionality.

* However, you rarely want to wait for these results in a web
  environment. You'd rather want to use Ajax to poll the task status,
  which is available from a URL like ``celery/<task_id>/status/``. This
  view returns a JSON-serialized data structure containing the task
  status, and the return value if completed, or exception on failure
  (see the sketch after this list).

* Pool workers are supervised, so if for some reason a worker crashes
  it is automatically replaced by a new worker.

* Can be configured to send e-mails to the administrators when a task
  fails.

.. _`MongoDB`: http://www.mongodb.org/
.. _`Redis`: http://code.google.com/p/redis/
.. _`Tokyo Tyrant`: http://tokyocabinet.sourceforge.net/
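
As a rough illustration of the Ajax polling mentioned in the feature
list, here is a minimal client-side sketch in Python. The URL prefix and
the JSON field names (``task``, ``status``, ``result``) are assumptions
for illustration only; check the status view's actual output before
relying on them::

    import json      # on Python < 2.6, use the simplejson package
    import time
    import urllib2

    def wait_for_task(task_id, base_url="http://localhost:8000/celery"):
        """Poll the status view until the task has finished."""
        url = "%s/%s/status/" % (base_url, task_id)
        while True:
            reply = json.loads(urllib2.urlopen(url).read())
            task = reply["task"]                  # assumed top-level key
            if task["status"] in ("SUCCESS", "FAILURE"):
                return task.get("result")
            time.sleep(1)                         # pause between polls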

API Reference Documentation
===========================

The `API Reference`_ is hosted at Github
(http://ask.github.com/celery).

.. _`API Reference`: http://ask.github.com/celery/

Installation
============

You can install ``celery`` either via the Python Package Index (PyPI)
or from source.

To install using ``pip``::

    $ pip install celery

To install using ``easy_install``::

    $ easy_install celery

Downloading and installing from source
--------------------------------------

Download the latest version of ``celery`` from
http://pypi.python.org/pypi/celery/

You can install it by doing the following::

    $ tar xvfz celery-0.0.0.tar.gz
    $ cd celery-0.0.0
    $ python setup.py build
    # python setup.py install # as root

Using the development version
-----------------------------

You can clone the repository by doing the following::

    $ git clone git://github.com/ask/celery.git
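
If you plan to work on the code itself, a common setuptools workflow
(not something this README prescribes) is to install the checkout in
development mode, so edits take effect without reinstalling::

    $ cd celery
    $ python setup.py develop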

Usage
=====

Installing RabbitMQ
-------------------

See `Installing RabbitMQ`_ over at RabbitMQ's website. For Mac OS X
see `Installing RabbitMQ on OS X`_.

.. _`Installing RabbitMQ`: http://www.rabbitmq.com/install.html
.. _`Installing RabbitMQ on OS X`:
    http://playtype.net/past/2008/10/9/installing_rabbitmq_on_osx/

Setting up RabbitMQ
-------------------

To use celery we need to create a RabbitMQ user and a virtual host, and
allow that user access to that virtual host::

    $ rabbitmqctl add_user myuser mypassword
    $ rabbitmqctl add_vhost myvhost
    $ rabbitmqctl set_permissions -p myvhost myuser "" ".*" ".*"

See the RabbitMQ `Admin Guide`_ for more information about
`access control`_.

.. _`Admin Guide`: http://www.rabbitmq.com/admin-guide.html
.. _`access control`: http://www.rabbitmq.com/admin-guide.html#access-control

Configuring your Django project to use Celery
---------------------------------------------

You only need three simple steps to use celery with your Django project.

1. Add ``celery`` to ``INSTALLED_APPS``.

2. Create the celery database tables::

      $ python manage.py syncdb

3. Configure celery to use the AMQP user and virtual host we created
   earlier, by adding the following to your ``settings.py``::

      BROKER_HOST = "localhost"
      BROKER_PORT = 5672
      BROKER_USER = "myuser"
      BROKER_PASSWORD = "mypassword"
      BROKER_VHOST = "myvhost"

That's it.

There are more options available, like how many processes should process
work in parallel (the ``CELERY_CONCURRENCY`` setting), and the backend
used for storing task statuses. But for now, this should do. For all of
the options available, please consult the `API Reference`_.
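
For example, the concurrency and result storage could be tuned in
``settings.py`` like this. ``CELERY_CONCURRENCY`` is the setting named
above; the ``CELERY_BACKEND`` name and its ``"database"`` value are
assumptions for this release, so verify them in the `API Reference`_
first::

    # Number of worker processes running in parallel.
    CELERY_CONCURRENCY = 8

    # Where task statuses/results are stored (setting name assumed).
    CELERY_BACKEND = "database"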

**Note**: If you're using SQLite as the Django database back-end,
``celeryd`` will only be able to process one task at a time; this is
because SQLite doesn't allow concurrent writes.

Running the celery worker server
--------------------------------

To test this we'll be running the worker server in the foreground, so we
can see what's going on without consulting the logfile::

    $ python manage.py celeryd

However, in production you probably want to run the worker in the
background as a daemon. To do this you need to use the tools provided by
your platform, or something like `supervisord`_.

For example startup scripts, see ``contrib/debian/init.d`` for using
``start-stop-daemon`` on Debian/Ubuntu, or ``contrib/mac/org.celeryq.*``
for using ``launchd`` on Mac OS X.

.. _`supervisord`: http://supervisord.org/

For a complete listing of the command line arguments available, with a
short description, you can use the help command::

    $ python manage.py help celeryd

Defining and executing tasks
----------------------------

**Please note:** All of these tasks have to be stored in a real module;
they can't be defined in the python shell or ipython/bpython. This is
because the celery worker server needs access to the task function to be
able to run it. Put them in the ``tasks`` module of your Django
application. The worker server will automatically load any ``tasks.py``
file for all of the applications listed in ``settings.INSTALLED_APPS``.

Executing tasks using ``delay`` and ``apply_async`` can be done from the
python shell, but keep in mind that since arguments are pickled, you
can't use custom classes defined in the shell session.

This is a task that adds two numbers::

    from celery.decorators import task

    @task()
    def add(x, y):
        return x + y

Now if we want to execute this task, we can use the ``delay`` method of
the task class. This is a handy shortcut to the ``apply_async`` method,
which gives greater control of the task execution (see
``userguide/executing`` for more information).

>>> from myapp.tasks import add
>>> add.delay(4, 4)

At this point, the task has been sent to the message broker. The message
broker will hold on to the task until a celery worker server has
successfully picked it up.

*Note:* If everything is just hanging when you execute ``delay``, please
check that RabbitMQ is running, and that the user/password has access to
the virtual host you configured earlier.
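
``apply_async`` is also how you use the countdown/ETA feature from the
feature list. A small sketch; as always, verify the exact keyword
arguments against the `API Reference`_:

>>> from myapp.tasks import add
>>> # Ask for the task to be executed no earlier than ten seconds from now.
>>> add.apply_async(args=[4, 4], countdown=10)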

Right now we have to check the celery worker logfiles to know what
happened with the task. This is because we didn't keep the
``AsyncResult`` object returned by ``delay``.

The ``AsyncResult`` lets us find the state of the task, wait for the
task to finish, and get its return value (or exception if the task
failed).

So, let's execute the task again, but this time we'll keep track of
the task:

>>> result = add.delay(4, 4)
>>> result.ready() # returns True if the task has finished processing.
False
>>> result.result # task is not ready, so no return value yet.
None
>>> result.get()   # Waits until the task is done and returns the retval.
8
>>> result.result # direct access to result, doesn't re-raise errors.
8
>>> result.successful() # returns True if the task didn't end in failure.
True

If the task raises an exception, the return value of
``result.successful()`` will be ``False``, and ``result.result`` will
contain the exception instance raised by the task.
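
For example, a failing task might look like this in the shell (an
illustrative sketch; the exact traceback text will differ):

>>> result = add.delay(4, "eight")   # TypeError raised inside the task
>>> result.get()                     # get() re-raises the stored exception
Traceback (most recent call last):
    ...
TypeError: unsupported operand type(s) for +: 'int' and 'str'
>>> result.successful()
False
>>> result.result                    # the exception instance, not re-raised
TypeError("unsupported operand type(s) for +: 'int' and 'str'",)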

Worker auto-discovery of tasks
------------------------------

``celeryd`` has an auto-discovery feature like the Django Admin that
automatically loads any ``tasks.py`` module in the applications listed
in ``settings.INSTALLED_APPS``. This autodiscovery is used by the celery
worker to find registered tasks for your Django project.
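
In other words, given a project layout along these lines (the
application name is hypothetical), the worker finds ``myapp/tasks.py``
on startup::

    myproject/
        settings.py        # INSTALLED_APPS includes "myapp"
        myapp/
            __init__.py
            models.py
            tasks.py       # tasks defined here are auto-discovered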

Periodic Tasks
--------------

Periodic tasks are tasks that are run every ``n`` seconds.
Here's an example of a periodic task::

    from datetime import timedelta

    from celery.task import PeriodicTask
    from celery.registry import tasks

    class MyPeriodicTask(PeriodicTask):
        run_every = timedelta(seconds=30)

        def run(self, **kwargs):
            logger = self.get_logger(**kwargs)
            logger.info("Running periodic task!")

    # Make the worker aware of the task.
    tasks.register(MyPeriodicTask)
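
Depending on the exact release, ``celery.decorators`` may also provide a
``periodic_task`` decorator mirroring the ``task`` decorator used
earlier. The following is an unverified sketch of that style; confirm it
against the `API Reference`_ for your version::

    from datetime import timedelta

    from celery.decorators import periodic_task  # availability is an assumption

    @periodic_task(run_every=timedelta(seconds=30))
    def my_periodic_task(**kwargs):
        # Equivalent in spirit to MyPeriodicTask above.
        print("Running periodic task!")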

If you want to use periodic tasks you need to start the ``celerybeat``
service. You have to make sure only one instance of this server is
running at any time, or else you will end up with multiple executions
of the same task.

To start the ``celerybeat`` service::

    $ celerybeat

or, if using Django::

    $ python manage.py celerybeat

You can also start ``celerybeat`` together with ``celeryd`` by using
the ``-B`` option; this is convenient if you only have one server::

    $ celeryd -B

or, if using Django::

    $ python manage.py celeryd -B

A look inside the components
============================

.. image:: http://cloud.github.com/downloads/ask/celery/Celery1.0-inside-worker.jpg

Getting Help
============

Mailing list
------------

For discussions about the usage, development, and future of celery,
please join the `celery-users`_ mailing list.

.. _`celery-users`: http://groups.google.com/group/celery-users/

IRC
---

Come chat with us on IRC. The `#celery`_ channel is on the `Freenode`_
network.

.. _`#celery`: irc://irc.freenode.net/celery
.. _`Freenode`: http://freenode.net

Bug tracker
===========

If you have any suggestions, bug reports or annoyances please report
them to our issue tracker at http://github.com/ask/celery/issues/

Contributing
============

Development of ``celery`` happens at Github: http://github.com/ask/celery

You are highly encouraged to participate in the development of
``celery``. If you don't like Github (for some reason) you're welcome
to send regular patches.

License
=======

This software is licensed under the ``New BSD License``. See the
``LICENSE`` file in the top distribution directory for the full license
text.

.. # vim: syntax=rst expandtab tabstop=4 shiftwidth=4 shiftround