=========================
 First steps with Python
=========================

Creating a simple task
======================

We put tasks in a dedicated ``tasks.py`` module. Your tasks can live in
any module, but keeping them together is a good convention.

Our task is simple: it just adds two numbers.

``tasks.py``:

.. code-block:: python

    from celery.decorators import task

    @task
    def add(x, y):
        return x + y

Tasks in Celery are actually classes inheriting from the ``Task`` class.
A task is automatically registered in a registry when it is created, but
for this to happen in the worker, you need to give the worker a list of
modules to import.
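
For illustration, the decorator above is roughly equivalent to defining a
``Task`` subclass by hand. A minimal sketch, assuming the ``celery.task.base``
module path used by this version of Celery:

.. code-block:: python

    from celery.task.base import Task

    class AddTask(Task):
        # The worker calls run() with the arguments given to delay()/apply_async().
        def run(self, x, y):
            return x + y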

Configuration
=============

Celery needs a configuration module, usually called ``celeryconfig.py``.
This module must be importable and located on the Python path.

You can set a custom name for the configuration module with the
``CELERY_CONFIG_MODULE`` environment variable. In these examples we use the
default name.
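
For example (a sketch, assuming the variable is read from the environment;
``myconfig`` is a hypothetical module name)::

    $ CELERY_CONFIG_MODULE="myconfig" celeryd --loglevel=INFO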

Let's create our ``celeryconfig.py``.

1. Start by making sure Python is able to import modules from the current
   directory::

        import os
        import sys
        sys.path.insert(0, os.getcwd())

2. Configure the broker::

        BROKER_HOST = "localhost"
        BROKER_PORT = 5672
        BROKER_USER = "myuser"
        BROKER_PASSWORD = "mypassword"
        BROKER_VHOST = "myvhost"

3. We don't want to store the results, so we'll just use the simplest
   backend available, the AMQP backend::

        CELERY_BACKEND = "amqp"

4. Finally, we list the modules the worker should import. We only have a
   single module, the ``tasks.py`` module we added earlier::

        CELERY_IMPORTS = ("tasks", )
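
Putting the pieces together, the complete ``celeryconfig.py`` looks like
this:

.. code-block:: python

    import os
    import sys
    sys.path.insert(0, os.getcwd())

    BROKER_HOST = "localhost"
    BROKER_PORT = 5672
    BROKER_USER = "myuser"
    BROKER_PASSWORD = "mypassword"
    BROKER_VHOST = "myvhost"

    CELERY_BACKEND = "amqp"

    CELERY_IMPORTS = ("tasks", )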

That's it.

There are more options available, like how many worker processes you want to
run in parallel (the ``CELERY_CONCURRENCY`` setting), and we could use a
persistent result store backend, but for now, this should do. For all of the
options available, please see the :doc:`configuration directive
reference<../configuration>`.
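
For example, to run eight worker processes in parallel, you could add this to
``celeryconfig.py`` (a sketch; pick a number suited to your machine)::

    CELERY_CONCURRENCY = 8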

Running the celery worker server
================================

To test this we'll be running the worker server in the foreground, so we can
see what's going on without consulting the logfile::

    $ celeryd --loglevel=INFO

However, in production you probably want to run the worker in the
background as a daemon. To do this you need to use the tools provided by
your platform, or something like `supervisord`_.

For example startup scripts, see ``contrib/debian/init.d`` for using
``start-stop-daemon`` on Debian/Ubuntu, or ``contrib/mac/org.celeryq.*``
for using ``launchd`` on Mac OS X.

.. _`supervisord`: http://supervisord.org/

For a complete listing of the command line arguments available, with a short
description, you can use the help command::

    $ celeryd --help

Executing the task
==================

Now if we want to execute our task, we can use the ``delay`` method of the
task class. This is a handy shortcut to the ``apply_async`` method, which
gives greater control over the task's execution.
See :doc:`Executing Tasks<../userguide/executing>` for more information.

    >>> from tasks import add
    >>> add.delay(4, 4)
    <AsyncResult: 889143a6-39a2-4e52-837b-d80d33efb22d>
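
``delay(4, 4)`` above is equivalent to ``apply_async(args=[4, 4])``; the
longer form also accepts execution options. A minimal sketch (the
``countdown`` option is an assumption about the version you are running):

    >>> add.apply_async(args=[4, 4])                # same as add.delay(4, 4)
    >>> add.apply_async(args=[4, 4], countdown=10)  # execute at least 10 seconds from now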

At this point, the task has been sent to the message broker. The message
broker will hold on to the task until a celery worker server has successfully
picked it up.

*Note:* If everything just hangs when you execute ``delay``, check that
RabbitMQ is running, and that the user/password has access to the virtual
host you configured earlier.

Right now we have to check the celery worker logfiles to know what happened
to the task. This is because we didn't keep the ``AsyncResult`` object
returned by ``delay``.

The ``AsyncResult`` lets us find the state of the task, wait for it to
finish, and get its return value (or the exception if the task failed).

So, let's execute the task again, but this time keep track of it:

    >>> result = add.delay(4, 4)
    >>> result.ready()      # returns True if the task has finished processing.
    False
    >>> result.result       # task is not ready, so no return value yet.
    None
    >>> result.get()        # waits until the task is done and returns the return value.
    8
    >>> result.result       # direct access to the result; doesn't re-raise errors.
    8
    >>> result.successful() # returns True if the task didn't end in failure.
    True

If the task raises an exception, ``result.successful()`` will return
``False``, and ``result.result`` will contain the exception instance raised
by the task.
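
A sketch of what that looks like, using a deliberately bad call (adding a
string and an integer raises ``TypeError`` inside the task):

    >>> result = add.delay("a", 1)
    >>> result.get()          # get() re-raises the task's exception by default
    Traceback (most recent call last):
        ...
    TypeError: cannot concatenate 'str' and 'int' objects
    >>> result.successful()
    False
    >>> result.result         # the exception instance, without re-raising
    TypeError("cannot concatenate 'str' and 'int' objects",)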

That's all for now! After this you should probably read the :doc:`User
Guide<../userguide/index>`.