
.. _whatsnew-2.6:

==========================
 What's new in Celery 2.6
==========================

Celery aims to be a flexible and reliable, best-of-breed solution
to process vast amounts of messages in a distributed fashion, while
providing operations with the tools to maintain such a system.

Celery has a large and diverse community of users and contributors;
you should come join us :ref:`on IRC <irc-channel>`
or :ref:`our mailing-list <mailing-list>`.

To read more about Celery you should visit our `website`_.

While this version is backward compatible with previous versions,
it is important that you read the following section.

If you use Celery in combination with Django you must also
read the `django-celery changelog`_ and upgrade to `django-celery 2.6`_.

This version is officially supported on CPython 2.5, 2.6, 2.7, 3.2 and 3.3,
as well as PyPy and Jython.

.. _`website`: http://celeryproject.org/
.. _`django-celery changelog`: http://bit.ly/djcelery-26-changelog
.. _`django-celery 2.6`: http://pypi.python.org/pypi/django-celery/

.. contents::
    :local:
.. _v260-important:

Important Notes
===============

Now depends on :mod:`billiard`.
-------------------------------

Billiard is a fork of the multiprocessing module containing
the no-execv patch by sbt (http://bugs.python.org/issue8713),
and also contains the pool improvements previously located in Celery.

This fork was necessary as changes to the C extension code were required
for the no-execv patch to work.

- Issue #625
- Issue #627
- Issue #640
- `django-celery #122 <http://github.com/ask/django-celery/issues/122>`_
- `django-celery #124 <http://github.com/ask/django-celery/issues/124>`_
`group`/`chord`/`chain` are now subtasks
----------------------------------------

- The source code for these, including subtask, has been moved
  to a new module, celery.canvas.

- group is no longer an alias to TaskSet, but new altogether,
  since it was very difficult to migrate the TaskSet class to become
  a subtask.

- A new shortcut has been added to tasks::

    >>> task.s(arg1, arg2, kw=1)

  as a shortcut to::

    >>> task.subtask((arg1, arg2), {"kw": 1})

- Tasks can be chained by using the ``|`` operator::

    >>> (add.s(2, 2) | pow.s(2)).apply_async()

- Subtasks can be "evaluated" using the ``~`` operator::

    >>> ~add.s(2, 2)
    4

    >>> ~(add.s(2, 2) | pow.s(2))

  is the same as::

    >>> chain(add.s(2, 2), pow.s(2)).apply_async().get()
- A new subtask_type key has been added to the subtask dicts.

  This can be the string "chord", "group", "chain", "chunks",
  "xmap", or "xstarmap".

- maybe_subtask now uses subtask_type to reconstruct
  the object, to be used when using non-pickle serializers
  (see the sketch after this list).

- The logic for these operations has been moved to dedicated
  tasks celery.chord, celery.chain and celery.group.

- subtask no longer inherits from AttributeDict.

  It's now a pure dict subclass with properties for attribute
  access to the relevant keys.

- The reprs now output how the sequence would look imperatively::

    >>> from celery import chord

    >>> (chord([add.s(i, i) for i in xrange(10)], xsum.s())
          | pow.s(2))

    tasks.xsum([tasks.add(0, 0),
                tasks.add(1, 1),
                tasks.add(2, 2),
                tasks.add(3, 3),
                tasks.add(4, 4),
                tasks.add(5, 5),
                tasks.add(6, 6),
                tasks.add(7, 7),
                tasks.add(8, 8),
                tasks.add(9, 9)]) | tasks.pow(2)
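A minimal sketch of what the new subtask_type key enables, assuming the
``add`` and ``pow`` example tasks used above and assuming ``maybe_subtask``
can be imported from the new celery.canvas module; treat the details as
illustrative only:

.. code-block:: python

    import json

    from celery.canvas import maybe_subtask

    sig = add.s(2, 2) | pow.s(2)            # a chain signature
    as_dict = json.loads(json.dumps(sig))   # what a non-pickle serializer would see
    restored = maybe_subtask(as_dict)       # rebuilt using the "subtask_type" key
    restored.apply_async()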
* New :setting:`CELERYD_WORKER_LOST_WAIT` to control the timeout in
  seconds before :exc:`billiard.WorkerLostError` is raised
  when a worker cannot be signalled (Issue #595).

  Contributed by Brendon Crawford.

* App instance factory methods have been converted to be cached
  descriptors that create a new subclass on access.

  This means that e.g. ``celery.Worker`` is an actual class
  and will work as expected when::

    class Worker(celery.Worker):
        ...
Logging Improvements
--------------------

Logging support now conforms better with best practices.

- Classes used by the worker no longer use app.get_default_logger, but use
  `celery.utils.log.get_logger`, which simply gets the logger without setting
  the level, and adds a NullHandler.

- Loggers are no longer passed around; instead every module using logging
  defines a module-global logger that is used throughout (see the sketch
  after this list).

- All loggers inherit from a common logger called "celery".

- Before, task.get_logger would set up a new logger for every task,
  and even set the loglevel.  This is no longer the case.

    - Instead all task loggers now inherit from a common "celery.task" logger
      that is set up when programs call `setup_logging_subsystem`.

    - Instead of using LoggerAdapter to augment the formatter with
      the task_id and task_name fields, the task base logger now uses
      a special formatter adding these values at runtime from the
      currently executing task.

- Redirected output from stdout/stderr is now logged to a "celery.redirected"
  logger.

- In addition, a few warnings.warn calls have been replaced with logger.warn.

- Now avoids the 'no handlers for logger multiprocessing' warning.
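A minimal sketch of the convention described above; the module-level function
is illustrative, only ``get_logger`` and ``setup_logging_subsystem`` are taken
from this list:

.. code-block:: python

    from celery.utils.log import get_logger

    logger = get_logger(__name__)  # module-global logger, no level set


    def process(item):
        # propagates to the shared "celery" logger once a program has
        # called setup_logging_subsystem() to configure handlers/levels
        logger.info("processing %r", item)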
Unorganized
-----------

* Task registry is no longer a global.

* celery.task.Task is no longer bound to an app by default,
  so configuration of the task is lazy.

* The @task decorator is now lazy when used with custom apps.

  If ``accept_magic_kwargs`` is enabled (hereafter called "compat mode"), the
  task decorator executes inline like before; however, for custom apps the
  @task decorator now returns a special PromiseProxy object that is only
  evaluated on access.

  All promises will be evaluated when `app.finalize` is called, or implicitly
  when the task registry is first used (see the sketch below).
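  A minimal sketch, assuming a custom app; the broker URL and task body are
  illustrative only:

  .. code-block:: python

      from celery import Celery

      celery = Celery(broker="amqp://")

      @celery.task
      def add(x, y):
          return x + y

      # ``add`` is still a PromiseProxy at this point; the real task class
      # is created when the app is finalized, or implicitly on first use.
      celery.finalize()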
* chain: Chain tasks together using callbacks under the hood.

  .. code-block:: python

      from celery import chain

      # (2 + 2) * 8 / 2
      res = chain(add.subtask((2, 2)),
                  mul.subtask((8, )),
                  div.subtask((2,))).apply_async()
      res.get() == 16

      res.parent.get() == 32

      res.parent.parent.get() == 4
* The Celery instance can now be created with a broker URL

  .. code-block:: python

      celery = Celery(broker="redis://")

* Result backends can now be set using a URL

  Currently only supported by Redis.  Example use::

      CELERY_RESULT_BACKEND = "redis://localhost/1"

* Heartbeat frequency now every 5s, and frequency sent with event

  The heartbeat frequency is now available in the worker event messages,
  so that clients can decide when to consider workers offline based on
  this value.
* Module celery.actors has been removed, and will be part of cl instead.

* Introduces new ``celery`` command, which is an entrypoint for all other
  commands.

  The main program for this command can be run by calling ``celery.start()``.
* Tasks can now have callbacks and errbacks, and dependencies are recorded

  - The task message format has been updated with two new extension keys

    Both keys can be empty/undefined or a list of subtasks.

    - ``callbacks``

      Applied if the task exits successfully, with the result
      of the task as an argument.

    - ``errbacks``

      Applied if an error occurred while executing the task,
      with the uuid of the task as an argument.  Since it may not be possible
      to serialize the exception instance, it passes the uuid of the task
      instead.  The uuid can then be used to retrieve the exception and
      traceback of the task from the result backend.

  - ``link`` and ``link_error`` keyword arguments have been added
    to ``apply_async``.

    The value passed can be either a subtask or a list of
    subtasks:

    .. code-block:: python

        add.apply_async((2, 2), link=mul.subtask())
        add.apply_async((2, 2), link=[mul.subtask(), echo.subtask()])

    Example error callback:

    .. code-block:: python

        @task
        def error_handler(uuid):
            result = AsyncResult(uuid)
            exc = result.get(propagate=False)
            print("Task %r raised exception: %r\n%r" % (
                  uuid, exc, result.traceback))

        >>> add.apply_async((2, 2), link_error=error_handler.s())
  - We now track what subtasks a task sends, and some result backends
    support retrieving this information.

    - task.request.children

      Contains the result instances of the subtasks
      the currently executing task has applied.

    - AsyncResult.children

      Returns the task's dependencies, as a list of
      ``AsyncResult``/``ResultSet`` instances.

    - AsyncResult.iterdeps

      Recursively iterates over the task's dependencies,
      yielding `(parent, node)` tuples (see the sketch below).

      Raises IncompleteStream if any of the dependencies
      have not returned yet.

    - AsyncResult.graph

      A ``DependencyGraph`` of the task's dependencies.

      This can also be used to convert to dot format:

      .. code-block:: python

          with open("graph.dot", "w") as fh:
              result.graph.to_dot(fh)

      which can then be used to produce an image::

          $ dot -Tpng graph.dot -o graph.png
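    A small sketch of walking these dependencies; ``result`` is assumed to be
    the ``AsyncResult`` of a task that applied subtasks, and
    ``IncompleteStream`` is assumed to be importable from celery.exceptions:

    .. code-block:: python

        from celery.exceptions import IncompleteStream

        try:
            for parent, node in result.iterdeps():
                print(parent, node.state)
        except IncompleteStream:
            print("some subtasks have not returned yet")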
* Bugreport now available as a command and broadcast command

  - Get it from a Python repl::

        >>> import celery
        >>> print(celery.bugreport())

  - Use celeryctl::

        $ celeryctl report

  - Get it from remote workers::

        $ celeryctl inspect report
* Module ``celery.log`` moved to :mod:`celery.app.log`.

* Module ``celery.task.control`` moved to :mod:`celery.app.control`.

* Adds :meth:`AsyncResult.get_leaf`

  Waits and returns the result of the leaf subtask.
  That is the last node found when traversing the graph,
  but this means that the graph can be 1-dimensional only (in effect
  a list).
* Adds ``subtask.link(subtask)`` + ``subtask.link_error(subtask)``

  Shortcut to ``s.options.setdefault("link", []).append(subtask)``

* Adds ``subtask.flatten_links()``

  Returns a flattened list of all dependencies (recursively)
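  A short sketch of both shortcuts, reusing the ``add``, ``mul`` and
  ``error_handler`` example tasks from above:

  .. code-block:: python

      s = add.s(2, 2)
      s.link(mul.s(4))                 # appends to s.options["link"]
      s.link_error(error_handler.s())  # same idea for the "link_error" option

      # the subtask plus all of its callbacks, recursively
      print(s.flatten_links())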
* ``AsyncResult.task_id`` renamed to ``AsyncResult.id``

* ``TaskSetResult.taskset_id`` renamed to ``.id``

* ``xmap(task, sequence)`` and ``xstarmap(task, sequence)``

  Returns a list of the results of applying the task to every item
  in the sequence.

  Example::

      >>> from celery import xstarmap

      >>> xstarmap(add, zip(range(10), range(10))).apply_async().get()
      [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
* ``chunks(task, sequence, chunksize)``

* ``group.skew()``
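  A hedged sketch of both primitives; the import location and exact argument
  types are assumptions based on the signatures above, and the numbers are
  illustrative:

  .. code-block:: python

      from celery import chunks, group

      # split 100 add() calls into 10 messages of 10 calls each
      chunks(add.s(), zip(range(100), range(100)), 10).apply_async()

      # skew(): spread each task's countdown over the 1-10 second range
      g = group(add.s(i, i) for i in xrange(10))
      g.skew(start=1, stop=10).apply_async()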
* 99% Coverage

* :setting:`CELERY_QUEUES` can now be a list/tuple of :class:`~kombu.Queue`
  instances.

  Internally :attr:`@amqp.queues` is now a mapping of name/Queue instances,
  instead of converting on the fly.
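  A minimal configuration sketch (queue, exchange and routing-key names are
  illustrative):

  .. code-block:: python

      from kombu import Exchange, Queue

      CELERY_QUEUES = (
          Queue("default", Exchange("default"), routing_key="default"),
          Queue("images", Exchange("media"), routing_key="media.image"),
      )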
* Can now specify connection for :class:`@control.inspect`.

  .. code-block:: python

      i = celery.control.inspect(connection=BrokerConnection("redis://"))
      i.active_queues()

* Module :mod:`celery.app.task` is now a module instead of a package.

  The setup.py install script will try to remove the old package, but if
  that doesn't work for some reason you have to remove it manually.  You
  can do so by executing the command::

      $ rm -r $(dirname $(python -c 'import celery;print(celery.__file__)'))/app/task/
* :setting:`CELERY_FORCE_EXECV` is now enabled by default.

  If the old behavior is wanted, set the setting to False or pass the new
  :option:`--no-execv` option to :program:`celeryd`.

* Deprecated module ``celery.conf`` has been removed.

* The :setting:`CELERY_TIMEZONE` setting now always requires the :mod:`pytz`
  library to be installed (except when the timezone is set to `UTC`).

* The Tokyo Tyrant backend has been removed and is no longer supported.

* Now uses :func:`~kombu.common.maybe_declare` to cache queue declarations.

* There is no longer a global default for the
  :setting:`CELERYBEAT_MAX_LOOP_INTERVAL` setting; it is instead
  set by individual schedulers.
Internals
---------

* Compat modules are now generated dynamically upon use.

  These modules are ``celery.messaging``, ``celery.log``,
  ``celery.decorators`` and ``celery.registry``.

* :mod:`celery.utils` refactored into multiple modules:

  - :mod:`celery.utils.text`
  - :mod:`celery.utils.imports`
  - :mod:`celery.utils.functional`

* Now using :mod:`kombu.utils.encoding` instead of
  :mod:`celery.utils.encoding`.

* Renamed module ``celery.routes`` -> :mod:`celery.app.routes`.

* Renamed package ``celery.db`` -> :mod:`celery.backends.database`.

* Renamed module ``celery.abstract`` -> :mod:`celery.worker.abstract`.

* Command-line docs are now parsed from the module docstrings.

* Test suite directory has been reorganized.

* :program:`setup.py` now reads docs from the :file:`requirements/` directory.
.. _v260-deprecations:

Deprecations
============

.. _v260-news:

News
====

In Other News
-------------

- Now depends on Kombu 2.1.4

Fixes
=====