.. _tut-celery:

========================
 First steps with Celery
========================

.. contents::
    :local:

.. _celerytut-broker:

Choosing your Broker
====================

Before you can use Celery you need to choose, install and run a broker.
The broker is the service responsible for receiving and delivering task
messages.

There are several choices available, including:

* :ref:`broker-rabbitmq`

`RabbitMQ`_ is feature-complete, safe and durable. If not losing tasks
is important to you, then this is your best option.

* :ref:`broker-redis`

`Redis`_ is also feature-complete, but power failures or abrupt termination
may result in data loss.

* :ref:`broker-sqlalchemy`
* :ref:`broker-django`

Using a database as a message queue is not recommended, but can be sufficient
for very small installations.  Celery can use the SQLAlchemy and Django ORMs.

* and more.

In addition to the above, there are several other transport implementations
to choose from, including :ref:`broker-couchdb`, :ref:`broker-beanstalk`,
:ref:`broker-mongodb`, and SQS.  There is a `Transport Comparison`_
in the Kombu documentation.

.. _`RabbitMQ`: http://www.rabbitmq.com/
.. _`Redis`: http://redis.io/
.. _`Transport Comparison`: http://kombu.rtfd.org/transport-comparison

.. _celerytut-simple-tasks:

Creating a simple task
======================

In this tutorial we will create a simple task that adds two
numbers.  Tasks are defined in normal Python modules.

By convention we will call our module :file:`tasks.py`, and it looks
like this:

:file:`tasks.py`:

.. code-block:: python

    from celery.task import task

    @task
    def add(x, y):
        return x + y

Behind the scenes the `@task` decorator actually creates a class that
inherits from :class:`~celery.task.base.Task`.  The best practice is to
only create custom task classes when you want to change generic behavior,
and use the decorator to define tasks.

.. seealso::

    The full documentation on how to create tasks and task classes is in the
    :doc:`../userguide/tasks` part of the user guide.

.. _celerytut-conf:

Configuration
=============

Celery is configured by using a configuration module.  By default
this module is called :file:`celeryconfig.py`.

The configuration module must either be in the current directory
or on the Python path, so that it can be imported.

You can also set a custom name for the configuration module by using
the :envvar:`CELERY_CONFIG_MODULE` environment variable.

Let's create our :file:`celeryconfig.py` step by step (the complete
module is shown after the list).

1. Configure how we communicate with the broker (RabbitMQ in this example)::

        BROKER_URL = "amqp://guest:guest@localhost:5672//"

2. Define the backend used to store task metadata and return values::

        CELERY_RESULT_BACKEND = "amqp"

   The AMQP backend is non-persistent by default, and you can only
   fetch the result of a task once (as it's sent as a message).

   For a list of the backends available and their related options see
   :ref:`conf-result-backend`.

3. Finally, we list the modules the worker should import.  This includes
   the modules containing your tasks.

   We only have a single task module, :file:`tasks.py`, which we added
   earlier::

        CELERY_IMPORTS = ("tasks", )
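Putting these settings together, the whole :file:`celeryconfig.py` for this
tutorial could look like the sketch below.  It simply combines the values
from the steps above; adjust the broker URL to match your own setup::

    # celeryconfig.py

    # How to reach the broker (RabbitMQ running locally with the
    # default guest account).
    BROKER_URL = "amqp://guest:guest@localhost:5672//"

    # Send task states and return values as AMQP messages.
    CELERY_RESULT_BACKEND = "amqp"

    # Modules the worker should import; this is where our tasks live.
    CELERY_IMPORTS = ("tasks", )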
That's it.

There are more options available, like how many processes you want to
use to process work in parallel (the :setting:`CELERY_CONCURRENCY` setting),
and we could use a persistent result store backend, but for now, this should
do.  For all of the options available, see :ref:`configuration`.

.. note::

    You can also specify modules to import using the :option:`-I` option to
    :mod:`~celery.bin.celeryd`::

        $ celeryd -l info -I tasks,handlers

    This can be a single task module, or a comma-separated list of task
    modules to import when :program:`celeryd` starts.

.. _celerytut-running-celeryd:

Running the celery worker server
================================

To test, we will run the worker server in the foreground, so we can
see what's going on in the terminal::

    $ celeryd --loglevel=INFO

In production you will probably want to run the worker in the
background as a daemon.  To do this you need to use the tools provided
by your platform, or something like `supervisord`_ (see :ref:`daemonizing`
for more information).

For a complete listing of the command line options available, do::

    $ celeryd --help

.. _`supervisord`: http://supervisord.org

.. _celerytut-executing-task:

Executing the task
==================

Whenever we want to execute our task, we use the
:meth:`~celery.task.base.Task.delay` method of the task class.

This is a handy shortcut to the :meth:`~celery.task.base.Task.apply_async`
method, which gives greater control of the task execution (see
:ref:`guide-executing`).

    >>> from tasks import add
    >>> add.delay(4, 4)
    <AsyncResult: 889143a6-39a2-4e52-837b-d80d33efb22d>

At this point, the task has been sent to the message broker.  The message
broker will hold on to the task until a worker server has consumed and
executed it.
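If you want to pass execution options along with the call, you can use
:meth:`~celery.task.base.Task.apply_async` directly.  A minimal sketch
follows; the ``countdown`` option shown here is just one example of the
options available (see :ref:`guide-executing` for the full set)::

    >>> from tasks import add

    >>> # add.delay(4, 4) is equivalent to:
    >>> result = add.apply_async(args=[4, 4])

    >>> # execution options go in as keyword arguments, e.g. run the
    >>> # task at the earliest 10 seconds from now:
    >>> result = add.apply_async(args=[4, 4], countdown=10)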
Right now we have to check the worker log files to know what happened
with the task.  Applying a task returns an
:class:`~celery.result.AsyncResult`.  If you have configured a result store,
the :class:`~celery.result.AsyncResult` enables you to check the state of
the task, wait for the task to finish, get its return value
or exception/traceback if the task failed, and more.

Keeping Results
---------------

If you want to keep track of the states of your tasks, Celery needs to store
or send the states somewhere.  There are several
built-in backends to choose from: SQLAlchemy/Django ORM, Memcached, Redis,
AMQP, MongoDB and Tokyo Tyrant -- or you can define your own.

For this example we will use the `amqp` result backend, which sends states
as messages.  The backend is configured via the ``CELERY_RESULT_BACKEND``
option.  In addition, individual result backends may have extra settings
you can configure::

    CELERY_RESULT_BACKEND = "amqp"

    #: We want the results to expire in 5 minutes, note that this requires
    #: RabbitMQ version 2.1.1 or higher, so please comment out if you have
    #: an earlier version.
    CELERY_TASK_RESULT_EXPIRES = 300

To read more about result backends please see :ref:`task-result-backends`.

Now with the result backend configured, let's execute the task again.
This time we'll hold on to the :class:`~celery.result.AsyncResult`::

    >>> result = add.delay(4, 4)

Here are some examples of what you can do when you have results::

    >>> result.ready() # returns True if the task has finished processing.
    False

    >>> result.result # task is not ready, so no return value yet.
    None

    >>> result.get()   # Waits until the task is done and returns the retval.
    8

    >>> result.result # direct access to result, doesn't re-raise errors.
    8

    >>> result.successful() # returns True if the task didn't end in failure.
    True

If the task raises an exception, the return value of `result.successful()`
will be :const:`False`, and `result.result` will contain the exception instance
raised by the task.
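If you are curious what that looks like, you can provoke a failure by calling
``add`` with a string argument.  This is only an illustrative sketch: the
exact exception shown, and whether ``get()`` accepts a ``propagate`` argument,
depend on your Celery version::

    >>> result = add.delay(4, "8")     # "8" is a string, so x + y raises TypeError
    >>> result.get(propagate=False)    # wait, but return the exception instead of re-raising it
    TypeError(...)

    >>> result.successful()
    False

    >>> result.result                  # the exception instance raised by the task
    TypeError(...)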
Where to go from here
=====================

After this you should read the :ref:`guide`.  Specifically
:ref:`guide-tasks` and :ref:`guide-executing`.