
==============
Change history
==============

0.3.7 [2009-06-16 11:41 P.M CET]
-----------------------------------------------

* **IMPORTANT** Now uses AMQP's ``basic.consume`` instead of
  ``basic.get``. This means we're no longer polling the broker for
  new messages.

* **IMPORTANT** Default concurrency limit is now set to the number of CPUs
  available on the system.

* **IMPORTANT** ``tasks.register``: Renamed ``task_name`` argument to
  ``name``, so

        >>> tasks.register(func, task_name="mytask")

  has to be replaced with:

        >>> tasks.register(func, name="mytask")

* The daemon now correctly runs if the pidlock is stale.

* Now compatible with carrot 0.4.5.

* Default AMQP connection timeout is now 4 seconds.

* ``AsyncResult.ready()`` was always returning ``True``.

* Only use the README as ``long_description`` if the file exists, so
  easy_install doesn't break.

* ``celery.view``: JSON responses now properly set their mime-type.

* ``apply_async`` now has a ``connection`` keyword argument, so you
  can re-use the same AMQP connection if you want to execute
  more than one task (see the sketch at the end of this list).

* Handle failures in the task_status view so it won't throw 500s.

* Fixed the typo ``AMQP_SERVER`` in the documentation to ``AMQP_HOST``.

* Worker exception e-mails sent to admins now work properly.

* No longer depends on ``django``, so installing ``celery`` won't affect
  the preferred Django version installed.

* Now works with PostgreSQL (psycopg2) again by registering the
  ``PickledObject`` field.

* ``celeryd``: Added ``--detach`` option as an alias to ``--daemon``, and
  it's the term used in the documentation from now on.

* Make sure the pool and the periodic task worker thread are terminated
  properly at exit (so ``Ctrl-C`` works again).

* Now depends on ``python-daemon``.

* Removed dependency on ``simplejson``.

* Cache Backend: Re-establishes the connection for every task process
  if the Django cache backend is memcached/libmemcached.

* Tyrant Backend: Now re-establishes the connection for every task
  executed.
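
For illustration, here is a minimal sketch of re-using one AMQP connection
for several ``apply_async`` calls, as mentioned in the list above. The task
module and the ``DjangoAMQPConnection`` helper are assumptions and may
differ in this release::

    from carrot.connection import DjangoAMQPConnection  # assumed helper
    from celery.task import apply_async

    from myapp.tasks import refresh_feed                # hypothetical task

    connection = DjangoAMQPConnection()
    try:
        # Both tasks are published over the same AMQP connection.
        apply_async(refresh_feed, args=["http://example.com/a.rss"],
                    connection=connection)
        apply_async(refresh_feed, args=["http://example.com/b.rss"],
                    connection=connection)
    finally:
        connection.close()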

0.3.3 [2009-06-08 01:07 P.M CET]
-----------------------------------------------

* The ``PeriodicWorkController`` now sleeps for 1 second between checking
  for periodic tasks to execute.

0.3.2 [2009-06-08 01:07 P.M CET]
-----------------------------------------------

* celeryd: Added option ``--discard``: Discard (delete!) all waiting
  messages in the queue.

* celeryd: The ``--wakeup-after`` option was not handled as a float.

0.3.1 [2009-06-08 01:07 P.M CET]
-----------------------------------------------

* The ``PeriodicTask`` worker is now running in its own thread instead
  of blocking the ``TaskController`` loop.

* Default ``QUEUE_WAKEUP_AFTER`` has been lowered to ``0.1`` (was ``0.3``).

0.3.0 [2009-06-08 12:41 P.M CET]
-----------------------------------------------

**NOTE** This is a development version; for the stable release, please
see versions 0.2.x.

**VERY IMPORTANT:** Pickle is now the encoder used for serializing task
arguments, so be sure to flush your task queue before you upgrade.

* **IMPORTANT** ``TaskSet.run()`` now returns a ``celery.result.TaskSetResult``
  instance, which lets you inspect the status and return values of a
  taskset as if it were a single entity (see the sketch at the end of
  this list).

* **IMPORTANT** Celery now depends on carrot >= 0.4.1.

* The celery daemon now sends task errors to the registered admin e-mails.
  To turn off this feature, set ``SEND_CELERY_TASK_ERROR_EMAILS`` to
  ``False`` in your ``settings.py``. Thanks to Grégoire Cachet.

* You can now run the celery daemon by using ``manage.py``::

        $ python manage.py celeryd

  Thanks to Grégoire Cachet.

* Added support for message priorities, topic exchanges, and custom routing
  keys for tasks. This means we have introduced
  ``celery.task.apply_async``, a new way of executing tasks.

  You can use ``celery.task.delay`` and ``celery.Task.delay`` like usual, but
  if you want greater control over the message sent, you want
  ``celery.task.apply_async`` and ``celery.Task.apply_async``.

  This also means the AMQP configuration has changed. Some settings have
  been renamed, while others are new::

        CELERY_AMQP_EXCHANGE
        CELERY_AMQP_PUBLISHER_ROUTING_KEY
        CELERY_AMQP_CONSUMER_ROUTING_KEY
        CELERY_AMQP_CONSUMER_QUEUE
        CELERY_AMQP_EXCHANGE_TYPE

  See the entry `Can I send some tasks to only some servers?`_ in the
  `FAQ`_ for more information.

.. _`Can I send some tasks to only some servers?`:
    http://bit.ly/celery_AMQP_routing
.. _`FAQ`: http://ask.github.com/celery/faq.html

* Task errors are now logged using loglevel ``ERROR`` instead of ``INFO``,
  and backtraces are dumped. Thanks to Grégoire Cachet.

* Make every new worker process re-establish its Django DB connection,
  solving the "MySQL connection died?" exceptions.
  Thanks to Vitaly Babiy and Jirka Vejrazka.

* **IMPORTANT** Now using pickle to encode task arguments. This means you
  can now pass complex Python objects to tasks as arguments.

* Removed dependency on ``yadayada``.

* Added a FAQ, see ``docs/faq.rst``.

* Now converts any unicode keys in task ``kwargs`` to regular strings.
  Thanks to Vitaly Babiy.

* Renamed ``TaskDaemon`` to ``WorkController``.

* ``celery.datastructures.TaskProcessQueue`` is now renamed to
  ``celery.pool.TaskPool``.

* The pool algorithm has been refactored for greater performance and
  stability.
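
For illustration, here is a minimal sketch of inspecting a taskset as a
single entity. Only ``TaskSet.run()`` returning a
``celery.result.TaskSetResult`` is taken from the entry above; the import
path, the constructor arguments, and the result-method names are
assumptions and may differ in this release::

    from celery.task import TaskSet           # assumed import path

    from myapp.tasks import AddTask           # hypothetical registered task

    # One (args, kwargs) pair per subtask (assumed constructor form).
    taskset = TaskSet(AddTask, args=[((2, 2), {}), ((4, 4), {}), ((8, 8), {})])

    result = taskset.run()                    # celery.result.TaskSetResult
    result.completed_count()                  # assumed: finished subtask count
    result.successful()                       # assumed: True if all succeeded
    result.join()                             # assumed: e.g. [4, 8, 16]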

0.2.0 [2009-05-20 05:14 P.M CET]
------------------------------------------------

* Final release of 0.2.0

* Compatible with carrot version 0.4.0.

* Fixes some syntax errors related to fetching results
  from the database backend.

0.2.0-pre3 [2009-05-20 05:14 P.M CET]
----------------------------------------------------

* *Internal release*. Improved handling of unpickled exceptions,
  ``get_result()`` now tries to recreate something looking like the
  original exception.

0.2.0-pre2 [2009-05-20 01:56 P.M CET]
----------------------------------------------------

* Now handles unpickleable exceptions (like the dynamically generated
  subclasses of ``django.core.exceptions.MultipleObjectsReturned``).

0.2.0-pre1 [2009-05-20 12:33 P.M CET]
----------------------------------------------------

* It's getting quite stable, with a lot of new features, so bump
  version to 0.2. This is a pre-release.

* ``celery.task.mark_as_read()`` and ``celery.task.mark_as_failure()`` have
  been removed. Use ``celery.backends.default_backend.mark_as_read()``
  and ``celery.backends.default_backend.mark_as_failure()`` instead.

0.1.15 [2009-05-19 04:13 P.M CET]
------------------------------------------------

* The celery daemon was leaking AMQP connections; this should be fixed.
  If you have any problems with too many open files (like ``emfile``
  errors in ``rabbit.log``), please contact us!

0.1.14 [2009-05-19 01:08 P.M CET]
------------------------------------------------

* Fixed a syntax error in the ``TaskSet`` class. (No such variable
  ``TimeOutError``).

0.1.13 [2009-05-19 12:36 P.M CET]
------------------------------------------------

* Forgot to add ``yadayada`` to install requirements.

* Now deletes all expired task results, not just those marked as done.

* Able to load the Tokyo Tyrant backend class without django
  configuration; tyrant settings can be specified directly in the class
  constructor.

* Improved API documentation.

* Now using the Sphinx documentation system; you can build
  the HTML documentation by doing::

        $ cd docs
        $ make html

  and the result will be in ``docs/.build/html``.

0.1.12 [2009-05-18 04:38 P.M CET]
------------------------------------------------

* ``delay_task()`` etc. now returns a ``celery.task.AsyncResult`` object,
  which lets you check the result and any failure that might have
  happened. It kind of works like the ``multiprocessing.AsyncResult``
  class returned by ``multiprocessing.Pool.map_async``.

* Added ``dmap()`` and ``dmap_async()``. These work like the
  ``multiprocessing.Pool`` versions, except the tasks are
  distributed to the celery server. Example::

        >>> from celery.task import dmap
        >>> import operator
        >>> dmap(operator.add, [[2, 2], [4, 4], [8, 8]])
        [4, 8, 16]

        >>> from celery.task import dmap_async
        >>> import operator
        >>> import time
        >>> result = dmap_async(operator.add, [[2, 2], [4, 4], [8, 8]])
        >>> result.ready()
        False
        >>> time.sleep(1)
        >>> result.ready()
        True
        >>> result.result
        [4, 8, 16]

* Refactored the task metadata cache and database backends, and added
  a new backend for Tokyo Tyrant. You can set the backend in your django
  settings file, e.g.::

        CELERY_BACKEND = "database"  # Uses the database
        CELERY_BACKEND = "cache"     # Uses the django cache framework
        CELERY_BACKEND = "tyrant"    # Uses Tokyo Tyrant

        TT_HOST = "localhost"        # Hostname for the Tokyo Tyrant server.
        TT_PORT = 6657               # Port of the Tokyo Tyrant server.

0.1.11 [2009-05-12 02:08 P.M CET]
-------------------------------------------------

* The logging system was leaking file descriptors, resulting in
  servers stopping with the ``EMFILE`` (too many open files) error. (fixed)

0.1.10 [2009-05-11 12:46 P.M CET]
-------------------------------------------------

* Tasks now support both positional arguments and keyword arguments.

* Requires carrot 0.3.8.

* The daemon now tries to reconnect if the connection is lost.

0.1.8 [2009-05-07 12:27 P.M CET]
------------------------------------------------

* Better test coverage.

* More documentation.

* celeryd doesn't emit the ``Queue is empty`` message if
  ``settings.CELERYD_EMPTY_MSG_EMIT_EVERY`` is 0.

0.1.7 [2009-04-30 1:50 P.M CET]
-----------------------------------------------

* Added some unit tests.

* Can now use the database for task metadata (like whether a task has
  been executed or not). Set ``settings.CELERY_TASK_META``.

* Can now run ``python setup.py test`` to run the unit tests from
  within the ``testproj`` project.

* Can set the AMQP exchange/routing key/queue using
  ``settings.CELERY_AMQP_EXCHANGE``, ``settings.CELERY_AMQP_ROUTING_KEY``,
  and ``settings.CELERY_AMQP_CONSUMER_QUEUE``.
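
For illustration, a hypothetical ``settings.py`` snippet using the setting
names above; the values are placeholders, not defaults::

    # settings.py (values are examples only)
    CELERY_AMQP_EXCHANGE = "celery"        # exchange tasks are published to
    CELERY_AMQP_ROUTING_KEY = "celery"     # routing key used when publishing
    CELERY_AMQP_CONSUMER_QUEUE = "celery"  # queue the celery daemon consumes from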

0.1.6 [2009-04-28 2:13 P.M CET]
-----------------------------------------------

* Introducing ``TaskSet``. A set of subtasks is executed, and you can
  find out how many, or if all of them, are done (excellent for progress
  bars and such).

* Now catches all exceptions when running ``Task.__call__``, so the
  daemon doesn't die. This doesn't happen for pure functions yet, only
  ``Task`` classes.

* ``autodiscover()`` now works with zipped eggs.

* celeryd: Now adds the current working directory to ``sys.path`` for
  convenience.

* The ``run_every`` attribute of ``PeriodicTask`` classes can now be a
  ``datetime.timedelta()`` object (see the sketch at the end of this list).

* celeryd: You can now set the ``DJANGO_PROJECT_DIR`` variable
  for ``celeryd`` and it will add that to ``sys.path`` for easy launching.

* Can now check whether a task has been executed or not via HTTP.

  You can do this by including the celery ``urls.py`` in your project::

        >>> url(r'^celery/', include("celery.urls"))

  then visiting the following URL::

        http://mysite/celery/$task_id/done/

  This will return a JSON dictionary, e.g.::

        {"task": {"id": $task_id, "executed": true}}

* ``delay_task`` now returns a string id, not a ``uuid.UUID`` instance.

* Now has ``PeriodicTasks``, to have ``cron``-like functionality.

* The project changed name from ``crunchy`` to ``celery``. The details of
  the name change request are in ``docs/name_change_request.txt``.
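
For illustration, here is a minimal sketch of a periodic task using a
``datetime.timedelta`` for ``run_every``, as mentioned in the list above.
The import path, the ``name`` attribute, and the task itself are
assumptions and may differ in this release::

    from datetime import timedelta

    from celery.task import PeriodicTask     # assumed import path

    class RefreshCacheTask(PeriodicTask):     # hypothetical periodic task
        name = "refresh-cache"                # assumed: explicit task name
        run_every = timedelta(minutes=5)      # a timedelta instead of seconds

        def run(self, **kwargs):
            # The periodic work goes here.
            return True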

0.1.0 [2009-04-24 11:28 A.M CET]
------------------------------------------------

* Initial release