
:Version: 1.0.0-pre1
:Keywords: task queue, job queue, asynchronous, rabbitmq, amqp, redis,
  django, python, webhooks, queue, distributed

--

Celery is a task queue/job queue based on distributed message passing.
It is focused on real-time operation, but has support for scheduling as
well. The execution units, called tasks, are executed concurrently on
one or more worker servers, asynchronously (in the background) or
synchronously (wait until ready).

Celery is already used in production to process millions of tasks a day.
It was first created for Django, but is now usable from Python as well.
It can also `operate with other languages via HTTP+JSON`_.

.. _`operate with other languages via HTTP+JSON`: http://bit.ly/CgXSc

Overview
========

This is a high-level overview of the architecture.

.. image:: http://cloud.github.com/downloads/ask/celery/Celery-Overview-v4.jpg

The broker pushes tasks to the worker servers. A worker server is a
networked machine running ``celeryd``. This can be one or more machines,
depending on the workload.

The result of the task can be stored for later retrieval (called its
"tombstone").

Example
=======

You probably want to see some code by now, so I'll give you an example
task adding two numbers:

.. code-block:: python

    from celery.decorators import task

    @task
    def add(x, y):
        return x + y

You can execute the task in the background, or wait for it to finish::

    >>> result = add.delay(4, 4)
    >>> result.wait() # wait for and return the result
    8

Simple!
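
If you don't want to block while the task runs, you can also poll the
result object; a minimal sketch, assuming the usual result-polling
attributes (names may differ slightly between versions)::

    >>> result = add.delay(4, 4)
    >>> result.ready() # has the task finished? (non-blocking)
    False
    >>> result.ready() # poll again a little later
    True
    >>> result.result # the return value, now that it is ready
    8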

Features
========

* Supports using `RabbitMQ`_, `AMQP`_, `Stomp`_, `Redis`_ or a database
  as the message queue. However, `RabbitMQ`_ is the recommended solution,
  so most of the documentation refers to it.

* Using RabbitMQ, Celery is *very robust*. It should survive most
  scenarios, and your tasks will never be lost.

* Tasks are executed *concurrently* using the Python 2.6
  :mod:`multiprocessing` module (also available as a back-port to older
  Python versions).

* Supports *periodic tasks*, which makes it a (better) replacement for
  cronjobs (see the first sketch after this list).

* When a task has been executed, the return value can be stored using
  either a MySQL/Oracle/PostgreSQL/SQLite database, Memcached,
  `MongoDB`_, `Redis`_ or `Tokyo Tyrant`_ back-end. For high
  performance, you can also use AMQP messages to publish results.

* Supports calling tasks over HTTP, making it usable from other
  programming languages and systems.

* Supports several serialization schemes, like pickle, json and yaml,
  and you can register custom encodings.

* If the task raises an exception, the exception instance is stored
  instead of the return value, and the traceback can be inspected after
  the fact.

* Every task has a Universally Unique Identifier (UUID), which is the
  task id, used for querying task status and return values.

* Tasks can be retried if they fail, with a configurable maximum number
  of retries (see the retry sketch after this list).

* Tasks can be configured to run at a specific date and time in the
  future (ETA), or you can set a countdown in seconds for when the task
  should be executed (see the countdown/ETA sketch after this list).

* Supports *task-sets*, where a task consists of several sub-tasks. You
  can find out how many of the sub-tasks have been executed, or whether
  all of them have. Excellent for progress-bar like functionality.

* However, you rarely want to wait for these results in a web
  environment. Instead, you can use Ajax to poll the task status, which
  is available from a URL like ``celery/<task_id>/status/``. This view
  returns a JSON-serialized data structure containing the task status,
  the return value if completed, or the exception on failure (see the
  polling sketch after this list).

* Pool workers are supervised, so if a worker crashes for some reason,
  it is automatically replaced by a new worker.

* Can be configured to send e-mails to the administrators when a task
  fails.
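
A minimal sketch of a periodic task, assuming the class-based
``PeriodicTask`` API with a ``run_every`` interval (the task name and
body are hypothetical):

.. code-block:: python

    from datetime import timedelta

    from celery.task import PeriodicTask


    class RefreshFeeds(PeriodicTask):
        """Hypothetical periodic task, dispatched by the worker's
        scheduler roughly every 30 seconds."""
        run_every = timedelta(seconds=30)

        def run(self, **kwargs):
            logger = self.get_logger(**kwargs)
            logger.info("Refreshing feeds")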
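
Retries are declared on the task itself; a minimal sketch, assuming the
decorator accepts a ``max_retries`` option (``fetch_page`` and the use
of ``urllib2`` are illustrative, not part of Celery):

.. code-block:: python

    import urllib2

    from celery.decorators import task


    @task(max_retries=3)
    def fetch_page(url, **kwargs):
        """Hypothetical task: download a page, retrying on network
        errors up to max_retries times."""
        try:
            return urllib2.urlopen(url).read()
        except urllib2.URLError, exc:
            # Re-publishes the task with the same arguments; kwargs
            # carries the task metadata (task id etc.) the retry needs.
            fetch_page.retry(args=[url], kwargs=kwargs, exc=exc)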
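
Scheduling a single run for later goes through ``apply_async``; a
minimal sketch, reusing the ``add`` task from the example above (the
``tasks`` module name is hypothetical):

.. code-block:: python

    from datetime import datetime, timedelta

    from tasks import add

    # Execute add(4, 4) no sooner than 60 seconds from now:
    result = add.apply_async(args=[4, 4], countdown=60)

    # ...or at an explicit date and time (ETA):
    tomorrow = datetime.now() + timedelta(days=1)
    result = add.apply_async(args=[4, 4], eta=tomorrow)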
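
On the server side you can also poll by task id directly; a minimal
sketch using ``AsyncResult`` (again assuming a hypothetical ``tasks``
module holding the example task):

.. code-block:: python

    from celery.result import AsyncResult

    from tasks import add

    task_id = add.delay(4, 4).task_id

    # Later, e.g. in the view answering the Ajax status poll:
    result = AsyncResult(task_id)
    if result.ready():
        print "Return value:", result.result
    else:
        print "Current status:", result.status  # e.g. PENDING or RETRY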

.. _`RabbitMQ`: http://www.rabbitmq.com/
.. _`AMQP`: http://www.amqp.org/
.. _`Stomp`: http://stomp.codehaus.org/
.. _`MongoDB`: http://www.mongodb.org/
.. _`Redis`: http://code.google.com/p/redis/
.. _`Tokyo Tyrant`: http://tokyocabinet.sourceforge.net/

Documentation
=============

The `latest documentation`_, with user guides, tutorials and an API
reference, is hosted at GitHub.

.. _`latest documentation`: http://ask.github.com/celery/

Installation
============

You can install ``celery`` either via the Python Package Index (PyPI)
or from source.

To install using ``pip``::

    $ pip install celery

To install using ``easy_install``::

    $ easy_install celery

Downloading and installing from source
--------------------------------------

Download the latest version of ``celery`` from
http://pypi.python.org/pypi/celery/

You can install it by doing the following::

    $ tar xvfz celery-0.0.0.tar.gz
    $ cd celery-0.0.0
    $ python setup.py build
    # python setup.py install # as root

Using the development version
-----------------------------

You can clone the repository by doing the following::

    $ git clone git://github.com/ask/celery.git