
=================================
 celery - Distributed Task Queue
=================================

:Version: 0.7.0

Introduction
============

Celery is a distributed task queue.

It was first created for Django, but is now usable from pure Python as well.
It can also operate with other languages via HTTP+JSON.

This introduction is written for someone who wants to use Celery from within
a Django project. For information about using it from pure Python see
`Can I use Celery without Django?`_; for calling out to other languages see
`Executing tasks on a remote web server`_.

.. _`Can I use Celery without Django?`: http://bit.ly/WPa6n
.. _`Executing tasks on a remote web server`: http://bit.ly/CgXSc

It is used for executing tasks *asynchronously*, routed to one or more
worker servers, running concurrently using multiprocessing.

It is designed to solve certain problems related to running websites
demanding high availability and performance.

It is perfect for filling caches, posting updates to Twitter, and mass
downloading of data like syndication feeds or scraped web pages. Use-cases
are plentiful. Implementing these features asynchronously using ``celery``
is easy and fun, and the performance improvements can make it more than
worthwhile.

Overview
========

This is a high level overview of the architecture.

.. image:: http://cloud.github.com/downloads/ask/celery/Celery-Overview-v4.jpg

The broker is an AMQP server pushing tasks to the worker servers.
A worker server is a networked machine running ``celeryd``. This can be one
or more machines, depending on the workload. See `A look inside the worker`_
to see how the worker server works.

The result of the task can be stored for later retrieval (called its
"tombstone").

Features
========

* Uses AMQP messaging (RabbitMQ, ZeroMQ, Qpid) to route tasks to the
  worker servers. Experimental support for STOMP (ActiveMQ) is also
  available.

* You can run as many worker servers as you want, and still
  be *guaranteed that the task is only executed once.*

* Tasks are executed *concurrently* using the Python 2.6
  ``multiprocessing`` module (also available as a back-port
  to older Python versions).

* Supports *periodic tasks*, which makes it a (better) replacement
  for cronjobs.

* When a task has been executed, the return value can be stored using
  either a MySQL/Oracle/PostgreSQL/SQLite database, Memcached,
  `MongoDB`_ or `Tokyo Tyrant`_ back-end. For high performance you can
  also use AMQP messages to publish results.

* If the task raises an exception, the exception instance is stored
  instead of the return value.

* All tasks have a Universally Unique Identifier (UUID), which is the
  task id used for querying task status and return values.

* Tasks can be retried if they fail, with a configurable maximum number
  of retries.

* Tasks can be configured to run at a specific time and date in the
  future (ETA), or you can set a countdown in seconds for when the
  task should be executed.

* Supports *task-sets*: a task consisting of several sub-tasks.
  You can find out how many of the sub-tasks have been executed, or
  whether all of them have. Excellent for progress-bar like functionality.

* Has a ``map``-like function that uses tasks, called ``dmap``
  (see the sketch after this list).

* However, you rarely want to wait for these results in a web environment.
  You'd rather use Ajax to poll the task status, which is
  available from a URL like ``celery/<task_id>/status/``. This view
  returns a JSON-serialized data structure containing the task status,
  and the return value if completed, or the exception on failure.

* The worker can collect statistics, such as how many tasks have been
  executed by type, and the time it took to process them. Very useful
  for monitoring and profiling.

* Pool workers are supervised, so if for some reason a worker crashes
  it is automatically replaced by a new worker.

* Can be configured to send e-mails to the administrators when a task
  fails.

.. _`MongoDB`: http://www.mongodb.org/
.. _`Tokyo Tyrant`: http://tokyocabinet.sourceforge.net/
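
For example, ``dmap`` applies a function to a sequence of argument lists,
one task per item, and collects the results. A minimal sketch, assuming the
``celery.task`` import path and illustrative output:

>>> import operator
>>> from celery.task import dmap

>>> # One task per pair of arguments; the list of return values
>>> # is gathered once all sub-tasks have finished.
>>> dmap(operator.add, [[2, 2], [4, 4], [8, 8]])
[4, 8, 16]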

API Reference Documentation
===========================

The `API Reference`_ is hosted at Github
(http://ask.github.com/celery).

.. _`API Reference`: http://ask.github.com/celery/

Installation
============

You can install ``celery`` either via the Python Package Index (PyPI)
or from source.

To install using ``pip``::

    $ pip install celery

To install using ``easy_install``::

    $ easy_install celery

Downloading and installing from source
--------------------------------------

Download the latest version of ``celery`` from
http://pypi.python.org/pypi/celery/

You can install it by doing the following::

    $ tar xvfz celery-0.0.0.tar.gz
    $ cd celery-0.0.0
    $ python setup.py build
    # python setup.py install # as root

Using the development version
-----------------------------

You can clone the repository by doing the following::

    $ git clone git://github.com/ask/celery.git

Usage
=====

Installing RabbitMQ
-------------------

See `Installing RabbitMQ`_ over at RabbitMQ's website. For Mac OS X
see `Installing RabbitMQ on OS X`_.

.. _`Installing RabbitMQ`: http://www.rabbitmq.com/install.html
.. _`Installing RabbitMQ on OS X`:
    http://playtype.net/past/2008/10/9/installing_rabbitmq_on_osx/

Setting up RabbitMQ
-------------------

To use celery we need to create a RabbitMQ user and a virtual host, and
allow that user access to that virtual host::

    $ rabbitmqctl add_user myuser mypassword
    $ rabbitmqctl add_vhost myvhost

From RabbitMQ version 1.6.0 and onward you have to use the new ACL features
to allow access::

    $ rabbitmqctl set_permissions -p myvhost myuser "" ".*" ".*"

See the RabbitMQ `Admin Guide`_ for more information about `access control`_.

.. _`Admin Guide`: http://www.rabbitmq.com/admin-guide.html
.. _`access control`: http://www.rabbitmq.com/admin-guide.html#access-control

If you are still using version 1.5.0 or below, please use ``map_user_vhost``::

    $ rabbitmqctl map_user_vhost myuser myvhost

Configuring your Django project to use Celery
---------------------------------------------

You only need three simple steps to use celery with your Django project.

1. Add ``celery`` to ``INSTALLED_APPS``.

2. Create the celery database tables::

       $ python manage.py syncdb

3. Configure celery to use the AMQP user and virtual host we created
   before, by adding the following to your ``settings.py``::

       AMQP_SERVER = "localhost"
       AMQP_PORT = 5672
       AMQP_USER = "myuser"
       AMQP_PASSWORD = "mypassword"
       AMQP_VHOST = "myvhost"

That's it.

There are more options available, like how many processes you want to use to
process work in parallel (the ``CELERY_CONCURRENCY`` setting), and the
backend used for storing task statuses. But for now, this should do. For all
of the options available, please consult the `API Reference`_. A hedged
sketch follows the note below.

**Note**: If you're using SQLite as the Django database back-end,
``celeryd`` will only be able to process one task at a time; this is
because SQLite doesn't allow concurrent writes.
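
As a sketch only: ``CELERY_CONCURRENCY`` is named above, but
``CELERY_BACKEND`` and its ``"database"`` value are assumptions about this
version, so consult the `API Reference`_ for the authoritative option names::

    # settings.py (sketch; names other than CELERY_CONCURRENCY are
    # assumptions -- check the API Reference for your version)
    CELERY_CONCURRENCY = 4       # number of worker processes
    CELERY_BACKEND = "database"  # keep task results in the Django database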

Running the celery worker server
--------------------------------

To test this we'll be running the worker server in the foreground, so we can
see what's going on without consulting the logfile::

    $ python manage.py celeryd

However, in production you probably want to run the worker in the
background as a daemon::

    $ python manage.py celeryd --detach

For a complete listing of the command line arguments available, with a short
description, you can use the help command::

    $ python manage.py help celeryd

Defining and executing tasks
----------------------------

**Please note:** All of these tasks have to be stored in a real module; they
can't be defined in the Python shell or ipython/bpython. This is because the
celery worker server needs access to the task function to be able to run it.
So while it looks like we use the Python shell to define the tasks in these
examples, you can't do it this way. Put them in the ``tasks`` module of your
Django application. The worker server will automatically load any
``tasks.py`` file for all of the applications listed in
``settings.INSTALLED_APPS``.

Executing tasks using ``delay`` and ``apply_async`` can be done from the
Python shell, but keep in mind that since arguments are pickled, you can't
use custom classes defined in the shell session.

While you can use regular functions, the recommended way is to define
a task class. This way you can cleanly upgrade the task to use the more
advanced features of celery later.

This is a task that basically does nothing but take some arguments,
and return a value:

>>> from celery.task import Task
>>> from celery.registry import tasks
>>> class MyTask(Task):
...     def run(self, some_arg, **kwargs):
...         logger = self.get_logger(**kwargs)
...         logger.info("Did something: %s" % some_arg)
...         return 42
>>> tasks.register(MyTask)

As you can see, the worker is sending some keyword arguments to this task;
these are the default keyword arguments. A task can choose not to take
these, or to list only the ones it wants (the worker will do the right
thing); see the sketch after this list.

The current default keyword arguments are:

* logfile

    The log file currently in use, which can be passed on to
    ``self.get_logger`` to gain access to the worker's log file via a
    ``logger.Logging`` instance.

* loglevel

    The loglevel currently in use.

* task_id

    The unique id of the executing task.

* task_name

    Name of the executing task.

* task_retries

    How many times the current task has been retried
    (an integer starting at ``0``).
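
For example, a task that only cares about the logging arguments can declare
just those. A minimal sketch; the ``AddTask`` name and its arguments are
made up for illustration:

>>> from celery.task import Task
>>> from celery.registry import tasks

>>> class AddTask(Task):
...     # Only the default kwargs declared here (loglevel, logfile)
...     # will be passed in by the worker.
...     def run(self, x, y, loglevel=None, logfile=None):
...         logger = self.get_logger(loglevel=loglevel, logfile=logfile)
...         logger.info("Adding %s + %s" % (x, y))
...         return x + y
>>> tasks.register(AddTask)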

Now, to execute the ``MyTask`` task defined earlier, we can use the ``delay``
method of the task class (this is a handy shortcut to the ``apply_async``
method, which gives you greater control of the task execution; a hedged
``apply_async`` sketch appears at the end of this section).

>>> from myapp.tasks import MyTask
>>> MyTask.delay(some_arg="foo")

At this point, the task has been sent to the message broker. The message
broker will hold on to the task until a celery worker server has successfully
picked it up.

*Note*: If everything is just hanging when you execute ``delay``, please
check that RabbitMQ is running, and that the user/password has access to the
virtual host you configured earlier.

Right now we have to check the celery worker logfiles to know what happened
with the task. This is because we didn't keep the ``AsyncResult`` object
returned by ``delay``.

The ``AsyncResult`` lets us find the state of the task, wait for the task to
finish, and get its return value (or exception if the task failed).

So, let's execute the task again, but this time we'll keep track of the task:

>>> result = MyTask.delay(some_arg="foo bar baz")
>>> result.ready() # returns True if the task has finished processing.
False
>>> result.result # task is not ready, so no return value yet.
None
>>> result.get()  # Waits until the task is done and returns the retval.
42
>>> result.result
42
>>> result.successful() # returns True if the task didn't end in failure.
True

If the task raises an exception, ``result.successful()`` will be ``False``,
and ``result.result`` will contain the exception instance raised.
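
And the ``apply_async`` sketch promised above: ``delay(*args, **kwargs)`` is
shorthand for ``apply_async`` with just the task arguments, while
``apply_async`` also accepts execution options such as the countdown
mentioned in the features list. Treat the exact keyword names below as
assumptions for this version:

>>> from myapp.tasks import MyTask

>>> # Run the task ten seconds from now; countdown is in seconds
>>> # (the features list also mentions an ETA date/time option).
>>> result = MyTask.apply_async(kwargs={"some_arg": "foo"}, countdown=10)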

Auto-discovery of tasks
-----------------------

``celery`` has an auto-discovery feature like the Django Admin that
automatically loads any ``tasks.py`` module in the applications listed
in ``settings.INSTALLED_APPS``. This autodiscovery is used by the celery
worker to find registered tasks for your Django project.

Periodic Tasks
--------------

Periodic tasks are tasks that are run every ``n`` seconds.
Here's an example of a periodic task:

>>> from celery.task import PeriodicTask
>>> from celery.registry import tasks
>>> from datetime import timedelta
>>> class MyPeriodicTask(PeriodicTask):
...     run_every = timedelta(seconds=30)
...
...     def run(self, **kwargs):
...         logger = self.get_logger(**kwargs)
...         logger.info("Running periodic task!")
...
>>> tasks.register(MyPeriodicTask)

**Note:** Periodic tasks do not support arguments, as this doesn't
really make sense.

A look inside the worker
========================

.. image:: http://cloud.github.com/downloads/ask/celery/InsideTheWorker-v2.jpg

Getting Help
============

Mailing list
------------

For discussions about the usage, development, and future of celery,
please join the `celery-users`_ mailing list.

.. _`celery-users`: http://groups.google.com/group/celery-users/

IRC
---

Come chat with us on IRC. The `#celery`_ channel is located at the
`Freenode`_ network.

.. _`#celery`: irc://irc.freenode.net/celery
.. _`Freenode`: http://freenode.net

Bug tracker
===========

If you have any suggestions, bug reports or annoyances please report them
to our issue tracker at http://github.com/ask/celery/issues/

Contributing
============

Development of ``celery`` happens at Github: http://github.com/ask/celery

You are highly encouraged to participate in the development
of ``celery``. If you don't like Github (for some reason) you're welcome
to send regular patches.

License
=======

This software is licensed under the ``New BSD License``. See the ``LICENSE``
file in the top distribution directory for the full license text.

.. # vim: syntax=rst expandtab tabstop=4 shiftwidth=4 shiftround