
=========================
 First steps with Django
=========================

Configuring your Django project to use Celery
==============================================

You only need three simple steps to use celery with your Django project.

1. Add ``celery`` to ``INSTALLED_APPS``.

2. Create the celery database tables::

       $ python manage.py syncdb

3. Configure celery to use the AMQP user and virtual host we created
   before, by adding the following to your ``settings.py``::

       BROKER_HOST = "localhost"
       BROKER_PORT = 5672
       BROKER_USER = "myuser"
       BROKER_PASSWORD = "mypassword"
       BROKER_VHOST = "myvhost"

That's it.

There are more options available, like how many worker processes you want
running in parallel (the ``CELERY_CONCURRENCY`` setting), and the backend
used for storing task statuses. But for now, this should do. For all of the
options available, please see the :doc:`configuration directive
reference <../configuration>`.
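
For example, a minimal sketch that sets the number of worker processes in
``settings.py`` (the value ``8`` is only an illustration; pick whatever suits
your machine)::

    CELERY_CONCURRENCY = 8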

**Note**: If you're using SQLite as the Django database back-end,
``celeryd`` will only be able to process one task at a time; this is
because SQLite doesn't allow concurrent writes.

Running the celery worker server
================================

To test this we'll be running the worker server in the foreground, so we can
see what's going on without consulting the logfile::

    $ python manage.py celeryd

However, in production you probably want to run the worker in the
background as a daemon. To do this you need to use the tools provided by
your platform, or something like `supervisord`_ (a minimal example follows
below).

For example startup scripts, see ``contrib/debian/init.d`` for using
``start-stop-daemon`` on Debian/Ubuntu, or ``contrib/mac/org.celeryq.*``
for using ``launchd`` on Mac OS X.

.. _`supervisord`: http://supervisord.org/
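
If you go with `supervisord`_, a minimal program section might look like the
following sketch (the project path is a placeholder, and the options are just
a sensible starting point)::

    [program:celeryd]
    ; Placeholder path -- point this at your own Django project.
    directory = /path/to/mysite
    command = python manage.py celeryd
    autostart = true
    autorestart = true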

For a complete listing of the command line arguments available, with a short
description, you can use the help command::

    $ python manage.py help celeryd

Defining and executing tasks
============================

**Please note**: All of these tasks have to be stored in a real module; they
can't be defined in the python shell or ipython/bpython. This is because the
celery worker server needs access to the task function to be able to run it.

Put them in the ``tasks`` module of your Django application. The worker
server will automatically load any ``tasks.py`` file for all of the
applications listed in ``settings.INSTALLED_APPS``.
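
For instance, given a hypothetical app named ``myapp``, the worker imports
``myapp.tasks`` automatically as long as ``myapp`` is listed in
``INSTALLED_APPS``::

    mysite/
        settings.py
        myapp/
            __init__.py
            models.py
            tasks.py    # loaded automatically by celeryd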

Executing tasks using ``delay`` and ``apply_async`` can be done from the
python shell, but keep in mind that since arguments are pickled, you can't
use custom classes defined in the shell session.
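
A quick sketch of why shell-defined classes fail: pickle stores such classes
by reference, so the payload only names ``__main__.Point`` (a hypothetical
class), and the worker process has no ``__main__.Point`` to import:

.. code-block:: python

    import pickle

    class Point(object):    # hypothetical class defined in the shell session
        pass

    # The payload references the class as "__main__.Point". This shell can
    # resolve that name, but the worker process can't, so task arguments
    # containing a Point would fail to unpickle on the worker side.
    payload = pickle.dumps(Point())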

This is a task that adds two numbers:

.. code-block:: python

    from celery.decorators import task

    @task()
    def add(x, y):
        return x + y

Now if we want to execute this task, we can use the ``delay`` method of the
task class. This is a handy shortcut to the ``apply_async`` method, which
gives greater control of the task execution. See
:doc:`Executing Tasks <../userguide/executing>` for more information.

>>> from myapp.tasks import add
>>> add.delay(4, 4)
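
``apply_async`` takes the positional and keyword arguments explicitly, plus
execution options. A small sketch, assuming your version supports the
``countdown`` option (which delays execution by that many seconds):

>>> from myapp.tasks import add
>>> add.apply_async(args=[4, 4], countdown=10)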

At this point, the task has been sent to the message broker. The message
broker will hold on to the task until a celery worker server has successfully
picked it up.

**Note**: If everything is just hanging when you execute ``delay``, please
check that RabbitMQ is running and that the user/password has access to the
virtual host you configured earlier.
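
Both are easy to verify from the command line, for example (assuming
``rabbitmqctl`` is on your path, and using the virtual host configured
earlier)::

    $ rabbitmqctl status
    $ rabbitmqctl list_permissions -p myvhost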

Right now we have to check the celery worker logfiles to know what happened
to the task. This is because we didn't keep the ``AsyncResult`` object
returned by ``delay``.

The ``AsyncResult`` lets us find the state of the task, wait for the task to
finish, and get its return value (or the exception if the task failed).

So, let's execute the task again, but this time we'll keep track of the task:

>>> result = add.delay(4, 4)
>>> result.ready() # returns True if the task has finished processing.
False
>>> result.result # task is not ready, so no return value yet.
None
>>> result.get()   # Waits until the task is done and returns the retval.
8
>>> result.result # direct access to result, doesn't re-raise errors.
8
>>> result.successful() # returns True if the task didn't end in failure.
True

If the task raises an exception, the return value of ``result.successful()``
will be ``False``, and ``result.result`` will contain the exception instance
raised by the task.
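
For illustration, a sketch of a failing run (the exact exception message and
repr depend on your Python version):

>>> result = add.delay(4, "hello")  # adding an int and a string fails
>>> result.get()                    # get() re-raises the exception here
Traceback (most recent call last):
    ...
TypeError: unsupported operand type(s) for +: 'int' and 'str'
>>> result.successful()
False
>>> result.result                   # the exception instance, not re-raised
TypeError("unsupported operand type(s) for +: 'int' and 'str'",)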