.. _guide-executing:

=================
 Executing Tasks
=================

.. contents::
    :local:

.. _executing-basics:

Basics
======
Executing tasks is done with :meth:`~celery.task.base.Task.apply_async`,
and its shortcut: :meth:`~celery.task.base.Task.delay`.

``delay`` is simple and convenient, as it looks like calling a regular
function:

.. code-block:: python

    Task.delay(arg1, arg2, kwarg1="x", kwarg2="y")

The same thing using ``apply_async`` is written like this:

.. code-block:: python

    Task.apply_async(args=[arg1, arg2], kwargs={"kwarg1": "x", "kwarg2": "y"})
While ``delay`` is convenient, it doesn't give you as much control as using
``apply_async``. With ``apply_async`` you can override the execution options
available as attributes on the ``Task`` class: ``routing_key``, ``exchange``,
``immediate``, ``mandatory``, ``priority``, and ``serializer``. In addition
you can set a countdown/eta, or provide a custom broker connection.

Let's go over these in more detail. The following examples use this simple
task, which adds together two numbers:

.. code-block:: python

    @task
    def add(x, y):
        return x + y

.. note::

    You can also execute a task by name using
    :func:`~celery.execute.send_task`, if you don't have access to the
    task class::

        >>> from celery.execute import send_task
        >>> result = send_task("tasks.add", [2, 2])
        >>> result.get()
        4
.. _executing-eta:

ETA and countdown
=================

The ETA (estimated time of arrival) lets you set a specific date and time that
is the earliest time at which your task will execute. ``countdown`` is
a shortcut to set this by seconds into the future.

.. code-block:: python

    >>> result = add.apply_async(args=[10, 10], countdown=3)
    >>> result.get()    # this takes at least 3 seconds to return
    20

Note that your task is guaranteed to be executed at some time *after* the
specified date and time has passed, but not necessarily at that exact time.

While ``countdown`` is an integer, ``eta`` must be a
:class:`~datetime.datetime` object, specifying an exact date and time in the
future. This is good if you already have a :class:`~datetime.datetime` object
and need to modify it with a :class:`~datetime.timedelta`, or when using time
in seconds is not very readable.

.. code-block:: python

    from datetime import datetime, timedelta

    def add_tomorrow(username):
        """Add this tomorrow."""
        tomorrow = datetime.now() + timedelta(days=1)
        add.apply_async(args=[10, 10], eta=tomorrow)
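Under the hood, ``countdown`` is simply a convenience for computing an ``eta``
relative to the current time. A minimal sketch (using a fixed "now" so the
example is deterministic):

.. code-block:: python

    from datetime import datetime, timedelta

    # countdown=3 is equivalent to eta = now + 3 seconds.
    now = datetime(2010, 6, 1, 12, 0, 0)
    eta = now + timedelta(seconds=3)
    print(eta)  # 2010-06-01 12:00:03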
.. _executing-serializers:

Serializers
===========

Data passed between celery and workers has to be serialized to be
transferred. The default serializer is :mod:`pickle`, but you can change this
for each task. There is built-in support for using :mod:`pickle`, ``JSON``,
``YAML`` and ``msgpack``, and you can also add your own custom serializers by
registering them in the Carrot serializer registry.

The default serializer (pickle) supports Python objects, like ``datetime`` and
any custom datatypes you define yourself. But since pickle has poor support
outside of the Python language, you need to choose another serializer if you
need to communicate with other languages. In that case, ``JSON`` is a very
popular choice.

The serialization method is sent with the message, so the worker knows how to
deserialize any task. Of course, if you use a custom serializer, this must
also be registered in the worker.

When sending a task the serialization method is taken from the following
places in order: the ``serializer`` argument to ``apply_async``, the
task's ``serializer`` attribute, and finally the global default
:setting:`CELERY_TASK_SERIALIZER` configuration directive.

.. code-block:: python

    >>> add.apply_async(args=[10, 10], serializer="json")
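The precedence order can be sketched as a simple first-match lookup. The
function and names below are hypothetical, for illustration only (this is not
celery's actual implementation):

.. code-block:: python

    GLOBAL_DEFAULT = "pickle"  # stands in for CELERY_TASK_SERIALIZER

    def choose_serializer(call_argument=None, task_attribute=None):
        """Pick the first serializer set, in precedence order."""
        for candidate in (call_argument, task_attribute, GLOBAL_DEFAULT):
            if candidate is not None:
                return candidate

    print(choose_serializer(call_argument="json"))   # "json" wins
    print(choose_serializer(task_attribute="yaml"))  # falls back to the attribute
    print(choose_serializer())                       # falls back to the default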
.. _executing-connections:

Connections and connection timeouts
===================================

Currently there is no support for broker connection pools in celery,
so this is something you need to be aware of when sending more than
one task at a time, as ``apply_async``/``delay`` establishes and
closes a connection every time.

If you need to send more than one task at the same time, it's a good idea to
establish the connection yourself and pass it to ``apply_async``:

.. code-block:: python

    numbers = [(2, 2), (4, 4), (8, 8), (16, 16)]
    results = []
    publisher = add.get_publisher()
    try:
        for args in numbers:
            res = add.apply_async(args=args, publisher=publisher)
            results.append(res)
    finally:
        publisher.close()
        publisher.connection.close()

    print([res.get() for res in results])

The connection timeout is the number of seconds to wait before we give up on
establishing the connection. You can set this with the ``connect_timeout``
argument to ``apply_async``:

.. code-block:: python

    add.apply_async([10, 10], connect_timeout=3)

Or if you handle the connection manually:

.. code-block:: python

    publisher = add.get_publisher(connect_timeout=3)
.. _executing-routing:

Routing options
===============

Celery uses the AMQP routing mechanisms to route tasks to different workers.
You can route tasks using the following entities: exchange, queue and routing
key.

Messages (tasks) are sent to exchanges, and a queue binds to an exchange with
a routing key. Let's look at an example:

Our application has a lot of tasks: some process video, others process
images, and some gather collective intelligence about users. Some of these
have higher priority than others, so we want to make sure the high priority
tasks get sent to powerful machines, while low priority tasks are sent to
dedicated machines that can handle these at their own pace.

For the sake of example we have only one exchange called ``tasks``.
There are different types of exchanges that match the routing key in
different ways. The exchange types are:

* direct

    Matches the routing key exactly.

* topic

    In the topic exchange the routing key is made up of words separated by
    dots (``.``). Words can be matched by the wild cards ``*`` and ``#``,
    where ``*`` matches one exact word, and ``#`` matches zero or more words.

    For example, ``*.stock.#`` matches the routing keys ``usd.stock`` and
    ``euro.stock.db`` but not ``stock.nasdaq``.

(There are also other exchange types, but these are not used by celery.)
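To make the wild card rules concrete, here is a self-contained sketch that
translates a topic binding key into a regular expression. This is an
illustration only, not how the broker implements matching; note that ``#``
can also match zero words, which is why ``*.stock.#`` matches ``usd.stock``:

.. code-block:: python

    import re

    def topic_to_regex(binding_key):
        """Translate an AMQP topic binding key into a regular expression.

        "*" matches exactly one word; "#" matches zero or more words.
        """
        words = []
        for word in binding_key.split("."):
            if word == "*":
                words.append(r"[^.]+")
            elif word == "#":
                words.append("#")  # placeholder, expanded below
            else:
                words.append(re.escape(word))
        pattern = r"\.".join(words)
        # "#" may match zero words, so the dot next to it must be optional.
        pattern = pattern.replace(r"\.#", r"(?:\.[^.]+)*")
        pattern = pattern.replace(r"#\.", r"(?:[^.]+\.)*")
        pattern = pattern.replace("#", r".*")  # a lone "#" matches anything
        return re.compile("^" + pattern + "$")

    matcher = topic_to_regex("*.stock.#")
    print(bool(matcher.match("usd.stock")))      # True
    print(bool(matcher.match("euro.stock.db")))  # True
    print(bool(matcher.match("stock.nasdaq")))   # False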
So, we create three queues, ``video``, ``image`` and ``lowpri`` that bind to
our ``tasks`` exchange. For the queues we use the following binding keys::

    video: video.#
    image: image.#
    lowpri: misc.#

Now we can send our tasks to different worker machines, by making the workers
listen to different queues:

.. code-block:: python

    >>> add.apply_async(args=[filename],
    ...                 routing_key="video.compress")

    >>> add.apply_async(args=[filename, 360],
    ...                 routing_key="image.rotate")

    >>> add.apply_async(args=[filename, selection],
    ...                 routing_key="image.crop")

    >>> add.apply_async(routing_key="misc.recommend")

Later, if the crop task is consuming a lot of resources,
we can bind some new workers to handle just the ``"image.crop"`` task,
by creating a new queue that binds to ``"image.crop"``.

.. seealso::

    To find out more about routing, please see :ref:`guide-routing`.
.. _executing-amq-opts:

AMQP options
============

.. warning::

    The ``mandatory`` and ``immediate`` flags are not supported by
    :mod:`amqplib` at this point.

* mandatory

    This sets the delivery to be mandatory. An exception will be raised
    if there are no running workers able to take on the task.

* immediate

    Request immediate delivery. Will raise an exception
    if the task cannot be routed to a worker immediately.

* priority

    A number between ``0`` and ``9``, where ``0`` is the highest priority.

    .. note::

        RabbitMQ does not yet support AMQP priorities.