============================================
celery - Distributed Task Queue for Django.
============================================

:Version: 0.3.0

Introduction
============

``celery`` is a distributed task queue framework for Django.

It is used for executing tasks *asynchronously*, routed to one or more
worker servers, running concurrently using multiprocessing.

It is designed to solve certain problems related to running websites
demanding high-availability and performance.

It is perfect for filling caches, posting updates to twitter, mass
downloading data like syndication feeds or web scraping. Use-cases are
plentiful. Implementing these features asynchronously using ``celery`` is
easy and fun, and the performance improvements can make it more than
worthwhile.

Features
========

    * Uses AMQP messaging (via, e.g., RabbitMQ) to route tasks to the
      worker servers.

    * You can run as many worker servers as you want, and still
      be *guaranteed that the task is only executed once.*

    * Tasks are executed *concurrently* using the Python 2.6
      ``multiprocessing`` module (also available as a back-port
      to older Python versions).

    * Supports *periodic tasks*, which makes it a (better) replacement
      for cronjobs.

    * When a task has been executed, the return value is stored using either
      a MySQL/Oracle/PostgreSQL/SQLite database, memcached,
      or Tokyo Tyrant back-end.

    * If the task raises an exception, the exception instance is stored
      instead of the return value.

    * Every task has a Universally Unique Identifier (UUID), which is the
      task id used for querying task status and return values.

    * Supports *task-sets*, a task consisting of several sub-tasks.
      You can find out how many of the sub-tasks have been executed, or
      whether all of them have. Excellent for progress-bar like
      functionality.

    * Has a ``map``-like function that uses tasks, called ``dmap``
      (see the sketch after this list).

    * However, you rarely want to wait for these results in a web
      environment. You'd rather want to use Ajax to poll the task status,
      which is available from a URL like ``celery/<task_id>/status/``.
      This view returns a JSON-serialized data structure containing the
      task status, and the return value if completed, or the exception on
      failure.
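As a quick taste of the ``dmap`` helper mentioned above, here is a
minimal sketch. It assumes ``dmap`` is importable from ``celery.task``
and that it blocks until every sub-task has returned, so (as the last
bullet notes) you rarely want to call it like this from inside a web
request; consult the API Reference for the exact interface:

    >>> from celery.task import dmap
    >>> import operator
    >>> dmap(operator.add, [[2, 2], [4, 4], [8, 8]])
    [4, 8, 16]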
API Reference Documentation
===========================

The `API Reference`_ is hosted at Github
(http://ask.github.com/celery).

.. _`API Reference`: http://ask.github.com/celery/

Installation
============

You can install ``celery`` either via the Python Package Index (PyPI)
or from source.

To install using ``pip``::

    $ pip install celery

To install using ``easy_install``::

    $ easy_install celery

If you have downloaded a source tarball you can install it
by doing the following::

    $ python setup.py build
    # python setup.py install # as root

Usage
=====

Installing RabbitMQ
-------------------

See `Installing RabbitMQ`_ over at RabbitMQ's website. For Mac OS X
see `Installing RabbitMQ on OS X`_.

.. _`Installing RabbitMQ`: http://www.rabbitmq.com/install.html
.. _`Installing RabbitMQ on OS X`:
    http://playtype.net/past/2008/10/9/installing_rabbitmq_on_osx/

Setting up RabbitMQ
-------------------

To use celery we need to create a RabbitMQ user and a virtual host, and
allow that user access to that virtual host::

    $ rabbitmqctl add_user myuser mypassword
    $ rabbitmqctl add_vhost myvhost
    $ rabbitmqctl map_user_vhost myuser myvhost

Configuring your Django project to use Celery
---------------------------------------------

You only need three simple steps to use celery with your Django project.

    1. Add ``celery`` to ``INSTALLED_APPS``.

    2. Create the celery database tables::

            $ python manage.py syncdb

    3. Configure celery to use the AMQP user and virtual host we created
       before, by adding the following to your ``settings.py``::

            AMQP_HOST = "localhost"
            AMQP_PORT = 5672
            AMQP_USER = "myuser"
            AMQP_PASSWORD = "mypassword"
            AMQP_VHOST = "myvhost"

That's it.

There are more options available, like how many worker processes to run
in parallel (the ``CELERY_CONCURRENCY`` setting) and which back-end to
use for storing task statuses. But for now, this should do. For all of
the options available, please consult the `API Reference`_.

**Note**: If you're using SQLite as the Django database back-end,
``celeryd`` will only be able to process one task at a time, because
SQLite doesn't allow concurrent writes.
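For example, to let the worker process eight tasks in parallel, you
could add the ``CELERY_CONCURRENCY`` setting mentioned above to your
``settings.py``. The value ``8`` here is only an illustration; pick a
number that suits your hardware::

    # settings.py -- illustrative value, not a recommendation
    CELERY_CONCURRENCY = 8  # number of worker processes running in parallel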
Running the celery worker daemon
--------------------------------

To test this we'll be running the worker daemon in the foreground, so we
can see what's going on without consulting the logfile::

    $ python manage.py celeryd

However, in production you'll probably want to run the worker in the
background as a daemon instead::

    $ python manage.py celeryd --daemon

For help on command line arguments to the worker daemon, you can execute
the help command::

    $ python manage.py help celeryd

Defining and executing tasks
----------------------------

**Please note:** All of these tasks have to be stored in a real module;
they can't be defined in the python shell or ipython/bpython. This is
because the celery worker server needs access to the task function to be
able to run it. So while it looks like we use the python shell to define
the tasks in these examples, you can't do it this way. Put them in the
``tasks`` module of your Django application. The worker daemon will
automatically load any ``tasks.py`` file for all of the applications
listed in ``settings.INSTALLED_APPS``.

Executing tasks using ``delay`` and ``apply_async`` can be done from the
python shell, but keep in mind that since arguments are pickled, you
can't use custom classes defined in the shell session.

While you can use regular functions, the recommended way is to define a
task class. This way you can cleanly upgrade the task to use the more
advanced features of celery later.

This is a task that basically does nothing but take some arguments and
return a value:

    >>> from celery.task import tasks, Task
    >>> class MyTask(Task):
    ...     name = "myapp.mytask"
    ...     def run(self, some_arg, **kwargs):
    ...         logger = self.get_logger(**kwargs)
    ...         logger.info("Did something: %s" % some_arg)
    ...         return 42
    >>> tasks.register(MyTask)

Now if we want to execute this task, we can use the ``delay`` method of
the task class (this is a handy shortcut to the ``apply_async`` method,
which gives you greater control of the task execution).

    >>> from myapp.tasks import MyTask
    >>> MyTask.delay(some_arg="foo")

At this point, the task has been sent to the message broker. The message
broker will hold on to the task until a celery worker server has
successfully picked it up.

Right now we have to check the celery worker logfiles to know what
happened with the task. This is because we didn't keep the
``AsyncResult`` object returned by ``delay``.

The ``AsyncResult`` lets us find the state of the task, wait for the
task to finish, and get its return value (or the exception if the task
failed).

So, let's execute the task again, but this time we'll keep track of the
task:

    >>> result = MyTask.delay(some_arg="foo bar baz")
    >>> result.ready() # returns True if the task has finished processing.
    False
    >>> result.result # task is not ready, so no return value yet.
    None
    >>> result.get()   # Waits until the task is done and returns the retval.
    42
    >>> result.result
    42
    >>> result.success() # returns True if the task didn't end in failure.
    True

If the task raises an exception, ``result.success()`` will be ``False``,
and ``result.result`` will contain the exception instance raised.
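To tie this together in a web context, here is a hypothetical Django
view (not part of celery) that starts ``MyTask`` and returns the task id
to the client, which can then use Ajax to poll the
``celery/<task_id>/status/`` URL mentioned in the Features section. It
assumes the ``AsyncResult`` returned by ``delay`` exposes the id as
``task_id``::

    # myapp/views.py -- illustrative sketch only
    from django.http import HttpResponse

    from myapp.tasks import MyTask

    def start_mytask(request):
        # Send the task to the message broker; this returns immediately.
        result = MyTask.delay(some_arg="foo")
        # Hand the task id back so client-side Ajax can poll
        # the celery/<task_id>/status/ view for the outcome.
        return HttpResponse(result.task_id, mimetype="text/plain")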
Auto-discovery of tasks
-----------------------

``celery`` has an auto-discovery feature like the Django Admin, which
automatically loads any ``tasks.py`` module in the applications listed
in ``settings.INSTALLED_APPS``. This auto-discovery is used by the
celery worker to find registered tasks for your Django project.

Periodic Tasks
--------------

Periodic tasks are tasks that are run every ``n`` seconds. Here's an
example of a periodic task:

    >>> from celery.task import tasks, PeriodicTask
    >>> from datetime import timedelta
    >>> class MyPeriodicTask(PeriodicTask):
    ...     name = "foo.my-periodic-task"
    ...     run_every = timedelta(seconds=30)
    ...
    ...     def run(self, **kwargs):
    ...         logger = self.get_logger(**kwargs)
    ...         logger.info("Running periodic task!")
    ...
    >>> tasks.register(MyPeriodicTask)

**Note:** Periodic tasks do not support arguments, as this doesn't
really make sense.

License
=======

This software is licensed under the ``New BSD License``. See the
``LICENSE`` file in the top distribution directory for the full license
text.

.. # vim: syntax=rst expandtab tabstop=4 shiftwidth=4 shiftround