.. _guide-canvas:

============================
 Canvas: Building Workflows
============================

.. contents::
    :local:

.. _canvas-subtasks:

Subtasks
========

.. versionadded:: 2.0

The :class:`~celery.subtask` type is used to wrap the arguments and
execution options for a single task invocation:

.. code-block:: python

    from celery import subtask

    subtask(task_name_or_cls, args, kwargs, options)

For convenience every task also has a shortcut to create subtasks:

.. code-block:: python

    task.subtask(args, kwargs, options)

:class:`~celery.subtask` is actually a :class:`dict` subclass, which means
it can be serialized with JSON or other encodings that don't support
complex Python objects.

It can also be regarded as a type, as the following usage works::

    >>> s = subtask("tasks.add", args=(2, 2), kwargs={})

    >>> subtask(dict(s))  # coerce dict into subtask

This makes it excellent as a means to pass callbacks around to tasks.
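
For instance, a subtask built from the ``tasks.add`` task above survives a
round trip through the standard :mod:`json` module (a minimal sketch; the
exact keys stored in the dict are an implementation detail):

.. code-block:: python

    import json

    from celery import subtask

    # A subtask is just a dict describing the invocation, so it can be
    # encoded, passed along as a plain value, and coerced back later.
    s = subtask("tasks.add", args=(2, 2), kwargs={})

    payload = json.dumps(s)                  # plain JSON string
    restored = subtask(json.loads(payload))

    restored.delay()                         # applies tasks.add(2, 2)
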

.. _canvas-callbacks:

Callbacks
---------

Callbacks can be added to any task using the ``link`` argument
to ``apply_async``:

.. code-block:: python

    add.apply_async((2, 2), link=other_task.subtask())

The callback will only be applied if the task exited successfully,
and it will be applied with the return value of the parent task as argument.

The best thing is that any arguments passed when the subtask is applied
will be prepended to the arguments specified by the subtask itself!

If you have the subtask::

    >>> add.subtask(args=(10, ))

then ``subtask.delay(result)`` becomes::

    >>> add.apply_async(args=(result, 10))

    ...

Now let's call our ``add`` task with a callback using partial
arguments::

    >>> add.apply_async((2, 2), link=add.subtask((8, )))

As expected this will first launch one task calculating :math:`2 + 2`, then
another task calculating :math:`4 + 8`.
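
To make the callback mechanics concrete, here is a minimal sketch of the
``add`` task assumed by these examples (it matches the definition used in
the Chords section below), with the worker's handling of the linked subtask
spelled out as comments:

.. code-block:: python

    @celery.task()
    def add(x, y):
        return x + y

    # add.apply_async((2, 2), link=add.subtask((8, ))) roughly amounts to:
    #
    #   result = add(2, 2)                   # the parent task returns 4
    #   add.subtask((8, )).delay(result)     # the worker applies add(4, 8)
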

.. _canvas-group:

Groups
======

The :class:`~celery.group` enables easy invocation of several tasks at once,
and is then able to join the results in the same order as the tasks were
invoked.

``group`` takes a list of :class:`~celery.subtask` objects::

    >>> from celery import group
    >>> from tasks import add

    >>> job = group([
    ...     add.subtask((2, 2)),
    ...     add.subtask((4, 4)),
    ...     add.subtask((8, 8)),
    ...     add.subtask((16, 16)),
    ...     add.subtask((32, 32)),
    ... ])

    >>> result = job.apply_async()

    >>> result.ready()  # have all subtasks completed?
    True
    >>> result.successful()  # were all subtasks successful?
    True
    >>> result.join()
    [4, 8, 16, 32, 64]

The first argument can alternatively be an iterator, like::

    >>> group(add.subtask((i, i)) for i in range(100))

.. _canvas-group-results:

Group Results
-------------

When a :class:`~celery.group` is applied it returns a
:class:`~celery.result.GroupResult` object.

:class:`~celery.result.GroupResult` takes a list of
:class:`~celery.result.AsyncResult` instances and operates on them as if it
were a single task.

It supports the following operations (a short usage sketch follows the list):

* :meth:`~celery.result.GroupResult.successful`

    Returns :const:`True` if all of the subtasks finished
    successfully (i.e. none of them raised an exception).

* :meth:`~celery.result.GroupResult.failed`

    Returns :const:`True` if any of the subtasks failed.

* :meth:`~celery.result.GroupResult.waiting`

    Returns :const:`True` if any of the subtasks
    are not ready yet.

* :meth:`~celery.result.GroupResult.ready`

    Returns :const:`True` if all of the subtasks
    are ready.

* :meth:`~celery.result.GroupResult.completed_count`

    Returns the number of completed subtasks.

* :meth:`~celery.result.GroupResult.revoke`

    Revokes all of the subtasks.

* :meth:`~celery.result.GroupResult.iterate`

    Iterates over the return values of the subtasks
    as they finish, one by one.

* :meth:`~celery.result.GroupResult.join`

    Gathers the results of all of the subtasks
    and returns them as a list, ordered the same way the subtasks were called.
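
A rough usage sketch, assuming the ``add`` task from above and a running
worker (the commented result values are illustrative):

.. code-block:: python

    from celery import group
    from tasks import add

    result = group(add.subtask((i, i)) for i in range(10)).apply_async()

    # Poll-style inspection while the subtasks are still running.
    result.completed_count()        # number of finished subtasks so far
    result.waiting()                # True while any subtask is pending

    # Consume return values in the order the subtasks finish.
    for value in result.iterate():
        print(value)

    # Or block until everything is done; results come back in call order.
    result.join()                   # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
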

.. _chords:

Chords
======

.. versionadded:: 2.3

A chord is a task that only executes after all of the tasks in a taskset
have finished executing.

Let's calculate the sum of the expression
:math:`1 + 1 + 2 + 2 + 3 + 3 \ldots n + n` for the first hundred numbers.

First we need two tasks, :func:`add` and :func:`tsum` (:func:`sum` is
already a standard function):

.. code-block:: python

    @celery.task()
    def add(x, y):
        return x + y


    @celery.task()
    def tsum(numbers):
        return sum(numbers)

Now we can use a chord to calculate each addition step in parallel, and then
get the sum of the resulting numbers::

    >>> from celery import chord
    >>> from tasks import add, tsum

    >>> chord(add.subtask((i, i))
    ...       for i in xrange(100))(tsum.subtask()).get()
    9900

This is obviously a very contrived example; the overhead of messaging and
synchronization makes this a lot slower than its Python counterpart::

    sum(i + i for i in xrange(100))

The synchronization step is costly, so you should avoid using chords as much
as possible. Still, the chord is a powerful primitive to have in your toolbox
as synchronization is a required step for many parallel algorithms.

Let's break the chord expression down::

    >>> callback = tsum.subtask()
    >>> header = [add.subtask((i, i)) for i in xrange(100)]
    >>> result = chord(header)(callback)
    >>> result.get()
    9900

Remember, the callback can only be executed after all of the tasks in the
header have returned. Each step in the header is executed as a task, in
parallel, possibly on different nodes. The callback is then applied with
the return value of each task in the header. The task id returned by
:meth:`chord` is the id of the callback, so you can wait for it to complete
and get the final return value (but remember to :ref:`never have a task wait
for other tasks <task-synchronous-subtasks>`).
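
Continuing the example, the returned ``result`` is an ``AsyncResult`` for the
callback task itself (a small sketch):

.. code-block:: python

    result = chord(header)(callback)

    result.id      # the task id of the tsum callback, not of any header task
    result.get()   # final return value, available once the callback has run
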

.. _chord-important-notes:

Important Notes
---------------

By default the synchronization step is implemented by having a recurring task
poll the completion of the taskset every second, applying the subtask when
ready.

Example implementation:

.. code-block:: python

    @celery.task()
    def unlock_chord(taskset, callback, interval=1, max_retries=None):
        if taskset.ready():
            # All header tasks are done: apply the callback with their results.
            return subtask(callback).delay(taskset.join())
        # Not ready yet: check again in `interval` seconds.
        raise unlock_chord.retry(countdown=interval, max_retries=max_retries)

This is used by all result backends except Redis and Memcached: they instead
increment a counter after each task in the header, then apply the callback
when the counter exceeds the number of tasks in the set.

*Note:* chords do not properly work with Redis before version 2.2; you will
need to upgrade to at least version 2.2 to use them.

The Redis and Memcached approach is a much better solution, but not easily
implemented in other backends (suggestions welcome!).
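
To illustrate the idea, here is a rough sketch of the counter approach (not
Celery's actual backend code; the ``redis`` client calls, key names, and the
``on_header_task_done`` hook are assumptions for illustration):

.. code-block:: python

    import redis

    from celery import subtask

    client = redis.Redis()

    def on_header_task_done(chord_id, result, total, callback):
        """Illustrative only: record one finished header task and fire the
        callback once every task in the header has reported in."""
        client.rpush("chord-results-%s" % chord_id, result)
        finished = client.incr("chord-counter-%s" % chord_id)
        if finished >= total:
            # In real code the stored results would need deserializing.
            results = client.lrange("chord-results-%s" % chord_id, 0, -1)
            subtask(callback).delay(results)
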

.. note::

    If you are using chords with the Redis result backend and also overriding
    the :meth:`Task.after_return` method, you need to make sure to call the
    super method or else the chord callback will not be applied.

    .. code-block:: python

        from celery import Task

        class MyTask(Task):

            def after_return(self, *args, **kwargs):
                do_something()
                # Calling the base implementation is what triggers the
                # chord callback, so do not leave it out.
                super(MyTask, self).after_return(*args, **kwargs)