@@ -2,181 +2,187 @@
Change history
==============

+0.4.1 [2009-07-02 01:42 P.M CET]
+--------------------------------
+
+* Fixed a bug with parsing the message options (``mandatory``,
+  ``routing_key``, ``priority``, ``immediate``)
+
0.4.0 [2009-07-01 07:29 P.M CET]
-------------------------------------------------
+--------------------------------

- * Adds eager execution. celery.execute.apply|Task.apply executes the
-   function blocking until the task is done, for API compatiblity it
-   returns an ``celery.result.EagerResult`` instance. You can configure
-   celery to always run tasks locally by setting the
-   ``CELERY_ALWAYS_EAGER`` setting to ``True``.
+* Adds eager execution. ``celery.execute.apply``|``Task.apply`` executes the
+  function blocking until the task is done; for API compatibility it
+  returns a ``celery.result.EagerResult`` instance. You can configure
+  celery to always run tasks locally by setting the
+  ``CELERY_ALWAYS_EAGER`` setting to ``True``.

- * Now depends on ``anyjson``.
+* Now depends on ``anyjson``.

- * 99% coverage using python coverage 3.0.
+* 99% coverage using python ``coverage`` 3.0.

0.3.20 [2009-06-25 08:42 P.M CET]
-------------------------------------------------
+---------------------------------

- * New arguments to ``apply_async`` (the advanced version of
-   ``delay_task``), ``countdown`` and ``eta``;
+* New arguments to ``apply_async`` (the advanced version of
+  ``delay_task``), ``countdown`` and ``eta``;

-       >>> # Run 10 seconds into the future.
-       >>> res = apply_async(MyTask, countdown=10);
+    >>> # Run 10 seconds into the future.
+    >>> res = apply_async(MyTask, countdown=10);

-       >>> # Run 1 day from now
-       >>> res = apply_async(MyTask, eta=datetime.now() +
-       ...                           timedelta(days=1)
+    >>> # Run 1 day from now
+    >>> res = apply_async(MyTask, eta=datetime.now() +
+    ...                           timedelta(days=1))

- * Now unlinks the pidfile if it's stale.
+* Now unlinks the pidfile if it's stale.

- * Lots of more tests.
+* Lots more tests.

- * Now compatible with carrot >= 0.5.0.
+* Now compatible with carrot >= 0.5.0.

- * **IMPORTANT** The ``subtask_ids`` attribute on the ``TaskSetResult``
-   instance has been removed. To get this information instead use:
+* **IMPORTANT** The ``subtask_ids`` attribute on the ``TaskSetResult``
+  instance has been removed. To get this information instead use:

-       >>> subtask_ids = [subtask.task_id for subtask in ts_res.subtasks]
+    >>> subtask_ids = [subtask.task_id for subtask in ts_res.subtasks]

- * Taskset.run() now respects extra message options from the task class.
+* ``TaskSet.run()`` now respects extra message options from the task class.

- * Task: Add attribute ``ignore_result``: Don't store the status and
-   return value. This means you can't use the
-   ``celery.result.AsyncResult`` to check if the task is
-   done, or get its return value. Only use if you need the performance
-   and is able live without these features. Any exceptions raised will
-   store the return value/status as usual.
+* Task: Add attribute ``ignore_result``: Don't store the status and
+  return value. This means you can't use the
+  ``celery.result.AsyncResult`` to check if the task is
+  done, or get its return value. Only use this if you need the
+  performance and are able to live without these features. Any
+  exceptions raised will store the return value/status as usual.

- * Task: Add attribute ``disable_error_emails`` to disable sending error
-   emails for that task.
+* Task: Add attribute ``disable_error_emails`` to disable sending error
+  emails for that task.

- * Should now work on Windows (although running in the background won't
-   work, so using the ``--detach`` argument results in an exception
-   being raised.)
+* Should now work on Windows (although running in the background won't
+  work, so using the ``--detach`` argument results in an exception
+  being raised).

- * Added support for statistics for profiling and monitoring.
-   To start sending statistics start ``celeryd`` with the
-   ``--statistics`` option. Then after a while you can dump the results
-   by running ``python manage.py celerystats``. See
-   ``celery.monitoring`` for more information.
+* Added support for statistics for profiling and monitoring.
+  To start sending statistics, start ``celeryd`` with the
+  ``--statistics`` option. Then after a while you can dump the results
+  by running ``python manage.py celerystats``. See
+  ``celery.monitoring`` for more information.

- * The celery daemon can now be supervised (i.e it is automatically
-   restarted if it crashes). To use this start celeryd with the
-   ``--supervised`` option (or alternatively ``-S``).
+* The celery daemon can now be supervised (i.e. it is automatically
+  restarted if it crashes). To use this, start celeryd with the
+  ``--supervised`` option (or alternatively ``-S``).

- * views.apply: View applying a task. Example::
+* views.apply: View applying a task. Example::

-       http://e.com/celery/apply/task_name/arg1/arg2//?kwarg1=a&kwarg2=b
+    http://e.com/celery/apply/task_name/arg1/arg2//?kwarg1=a&kwarg2=b

-       **NOTE** Use with caution, preferably not make this publicly
-       accessible without ensuring your code is safe!
+  **NOTE** Use with caution, preferably do not make this publicly
+  accessible without ensuring your code is safe!

- * Refactored celery.task. It's now split into three modules:
+* Refactored ``celery.task``. It's now split into three modules:

-     * celery.task
+    * celery.task

-         Contains apply_async, delay_task, discard_all, and task
-         shortcuts, plus imports objects from celery.task.base and
-         celery.task.builtins
+        Contains ``apply_async``, ``delay_task``, ``discard_all``, and task
+        shortcuts, plus imports objects from ``celery.task.base`` and
+        ``celery.task.builtins``

-     * celery.task.base
+    * celery.task.base

-         Contains task base classes: Task, PeriodicTask, TaskSet
+        Contains task base classes: ``Task``, ``PeriodicTask``,
+        ``TaskSet``, ``AsynchronousMapTask``, ``ExecuteRemoteTask``.

-     * celery.task.builtins
+    * celery.task.builtins

-         Built-in tasks: PingTask, AsynchronousMapTask,
-         ExecuteRemoteTask, ++.
+        Built-in tasks: ``PingTask``, ``DeleteExpiredTaskMetaTask``.


0.3.7 [2008-06-16 11:41 P.M CET]
-------------------------------------------------
+--------------------------------

- * **IMPORTANT** Now uses AMQP's ``basic.consume`` instead of
-   ``basic.get``. This means we're no longer polling the broker for
-   new messages.
+* **IMPORTANT** Now uses AMQP's ``basic.consume`` instead of
+  ``basic.get``. This means we're no longer polling the broker for
+  new messages.

- * **IMPORTANT** Default concurrency limit is now set to the number of CPUs
-   available on the system.
+* **IMPORTANT** Default concurrency limit is now set to the number of CPUs
+  available on the system.

- * **IMPORTANT** ``tasks.register``: Renamed ``task_name`` argument to
-   ``name``, so
+* **IMPORTANT** ``tasks.register``: Renamed ``task_name`` argument to
+  ``name``, so

-       >>> tasks.register(func, task_name="mytask")
+    >>> tasks.register(func, task_name="mytask")

-   has to be replaced with:
+  has to be replaced with:

-       >>> tasks.register(func, name="mytask")
+    >>> tasks.register(func, name="mytask")

- * The daemon now correctly runs if the pidlock is stale.
+* The daemon now correctly runs if the pidlock is stale.

- * Now compatible with carrot 0.4.5
+* Now compatible with carrot 0.4.5.

- * Default AMQP connnection timeout is now 4 seconds.
- * ``AsyncResult.read()`` was always returning ``True``.
+* Default AMQP connection timeout is now 4 seconds.
+* ``AsyncResult.read()`` was always returning ``True``.

- * Only use README as long_description if the file exists so easy_install
-   doesn't break.
+* Only use README as long_description if the file exists, so easy_install
+  doesn't break.

- * ``celery.view``: JSON responses now properly set its mime-type.
+* ``celery.view``: JSON responses now properly set their mime-type.

- * ``apply_async`` now has a ``connection`` keyword argument so you
-   can re-use the same AMQP connection if you want to execute
-   more than one task.
+* ``apply_async`` now has a ``connection`` keyword argument so you
+  can re-use the same AMQP connection if you want to execute
+  more than one task.

- * Handle failures in task_status view such that it won't throw 500s.
+* Handle failures in task_status view such that it won't throw 500s.

- * Fixed typo ``AMQP_SERVER`` in documentation to ``AMQP_HOST``.
+* Fixed typo ``AMQP_SERVER`` in documentation to ``AMQP_HOST``.

- * Worker exception e-mails sent to admins now works properly.
+* Worker exception e-mails sent to admins now work properly.

- * No longer depends on ``django``, so installing ``celery`` won't affect
-   the preferred Django version installed.
+* No longer depends on ``django``, so installing ``celery`` won't affect
+  the preferred Django version installed.

- * Now works with PostgreSQL (psycopg2) again by registering the
-   ``PickledObject`` field.
+* Now works with PostgreSQL (psycopg2) again by registering the
+  ``PickledObject`` field.

- * ``celeryd``: Added ``--detach`` option as an alias to ``--daemon``, and
-   it's the term used in the documentation from now on.
+* ``celeryd``: Added ``--detach`` option as an alias to ``--daemon``, and
+  it's the term used in the documentation from now on.

- * Make sure the pool and periodic task worker thread is terminated
-   properly at exit. (So ``Ctrl-C`` works again).
+* Make sure the pool and periodic task worker thread are terminated
+  properly at exit (so ``Ctrl-C`` works again).

- * Now depends on ``python-daemon``.
+* Now depends on ``python-daemon``.

- * Removed dependency to ``simplejson``
+* Removed dependency on ``simplejson``.

- * Cache Backend: Re-establishes connection for every task process
-   if the Django cache backend is memcached/libmemcached.
+* Cache Backend: Re-establishes connection for every task process
+  if the Django cache backend is memcached/libmemcached.

- * Tyrant Backend: Now re-establishes the connection for every task
-   executed.
+* Tyrant Backend: Now re-establishes the connection for every task
+  executed.

0.3.3 [2009-06-08 01:07 P.M CET]
-------------------------------------------------
+--------------------------------

* The ``PeriodicWorkController`` now sleeps for 1 second between checking
  for periodic tasks to execute.

0.3.2 [2009-06-08 01:07 P.M CET]
-------------------------------------------------
+--------------------------------

- * celeryd: Added option ``--discard``: Discard (delete!) all waiting
-   messages in the queue.
+* celeryd: Added option ``--discard``: Discard (delete!) all waiting
+  messages in the queue.

- * celeryd: The ``--wakeup-after`` option was not handled as a float.
+* celeryd: The ``--wakeup-after`` option was not handled as a float.

0.3.1 [2009-06-08 01:07 P.M CET]
-------------------------------------------------
+--------------------------------

- * The `PeriodicTask`` worker is now running in its own thread instead
-   of blocking the ``TaskController`` loop.
+* The ``PeriodicTask`` worker is now running in its own thread instead
+  of blocking the ``TaskController`` loop.

- * Default ``QUEUE_WAKEUP_AFTER`` has been lowered to ``0.1`` (was ``0.3``)
+* Default ``QUEUE_WAKEUP_AFTER`` has been lowered to ``0.1`` (was ``0.3``).

0.3.0 [2009-06-08 12:41 P.M CET]
-------------------------------------------------
+--------------------------------

**NOTE** This is a development version, for the stable release, please
see versions 0.2.x.
@@ -184,257 +190,258 @@ see versions 0.2.x.
**VERY IMPORTANT:** Pickle is now the encoder used for serializing task
arguments, so be sure to flush your task queue before you upgrade.

- * **IMPORTANT** TaskSet.run() now returns a celery.result.TaskSetResult
-   instance, which lets you inspect the status and return values of a
-   taskset as it was a single entity.
+* **IMPORTANT** ``TaskSet.run()`` now returns a ``celery.result.TaskSetResult``
+  instance, which lets you inspect the status and return values of a
+  taskset as if it were a single entity.

- * **IMPORTANT** Celery now depends on carrot >= 0.4.1.
+* **IMPORTANT** Celery now depends on carrot >= 0.4.1.

- * The celery daemon now sends task errors to the registered admin e-mails.
-   To turn off this feature, set ``SEND_CELERY_TASK_ERROR_EMAILS`` to
-   ``False`` in your ``settings.py``. Thanks to Grégoire Cachet.
+* The celery daemon now sends task errors to the registered admin e-mails.
+  To turn off this feature, set ``SEND_CELERY_TASK_ERROR_EMAILS`` to
+  ``False`` in your ``settings.py``. Thanks to Grégoire Cachet.

- * You can now run the celery daemon by using ``manage.py``::
-
-       $ python manage.py celeryd
+* You can now run the celery daemon by using ``manage.py``::

-   Thanks to Grégoire Cachet.
+    $ python manage.py celeryd

- * Added support for message priorities, topic exchanges, custom routing
-   keys for tasks. This means we have introduced
-   ``celery.task.apply_async``, a new way of executing tasks.
+  Thanks to Grégoire Cachet.

-   You can use ``celery.task.delay`` and ``celery.Task.delay`` like usual, but
-   if you want greater control over the message sent, you want
-   ``celery.task.apply_async`` and ``celery.Task.apply_async``.
+* Added support for message priorities, topic exchanges, custom routing
+  keys for tasks. This means we have introduced
+  ``celery.task.apply_async``, a new way of executing tasks.

-   This also means the AMQP configuration has changed. Some settings has
-   been renamed, while others are new::
+  You can use ``celery.task.delay`` and ``celery.Task.delay`` like usual, but
+  if you want greater control over the message sent, you want
+  ``celery.task.apply_async`` and ``celery.Task.apply_async``.

-       CELERY_AMQP_EXCHANGE
-       CELERY_AMQP_PUBLISHER_ROUTING_KEY
-       CELERY_AMQP_CONSUMER_ROUTING_KEY
-       CELERY_AMQP_CONSUMER_QUEUE
-       CELERY_AMQP_EXCHANGE_TYPE
+  This also means the AMQP configuration has changed. Some settings
+  have been renamed, while others are new::

-   See the entry `Can I send some tasks to only some servers?`_ in the
-   `FAQ`_ for more information.
+    CELERY_AMQP_EXCHANGE
+    CELERY_AMQP_PUBLISHER_ROUTING_KEY
+    CELERY_AMQP_CONSUMER_ROUTING_KEY
+    CELERY_AMQP_CONSUMER_QUEUE
+    CELERY_AMQP_EXCHANGE_TYPE
+
+  See the entry `Can I send some tasks to only some servers?`_ in the
+  `FAQ`_ for more information.

.. _`Can I send some tasks to only some servers?`:
    http://bit.ly/celery_AMQP_routing
.. _`FAQ`: http://ask.github.com/celery/faq.html

- * Task errors are now logged using loglevel ``ERROR`` instead of ``INFO``,
-   and backtraces are dumped. Thanks to Grégoire Cachet.
+* Task errors are now logged using loglevel ``ERROR`` instead of ``INFO``,
+  and backtraces are dumped. Thanks to Grégoire Cachet.

- * Make every new worker process re-establish it's Django DB connection,
-   this solving the "MySQL connection died?" exceptions.
-   Thanks to Vitaly Babiy and Jirka Vejrazka.
+* Make every new worker process re-establish its Django DB connection,
+  thus solving the "MySQL connection died?" exceptions.
+  Thanks to Vitaly Babiy and Jirka Vejrazka.

- * **IMOPORTANT** Now using pickle to encode task arguments. This means you
-   now can pass complex python objects to tasks as arguments.
+* **IMPORTANT** Now using pickle to encode task arguments. This means you
+  can now pass complex python objects to tasks as arguments.

- * Removed dependency on ``yadayada``.
+* Removed dependency on ``yadayada``.

- * Added a FAQ, see ``docs/faq.rst``.
+* Added a FAQ, see ``docs/faq.rst``.

- * Now converts any unicode keys in task ``kwargs`` to regular strings.
-   Thanks Vitaly Babiy.
+* Now converts any unicode keys in task ``kwargs`` to regular strings.
+  Thanks Vitaly Babiy.

- * Renamed the ``TaskDaemon`` to ``WorkController``.
+* Renamed the ``TaskDaemon`` to ``WorkController``.

- * ``celery.datastructures.TaskProcessQueue`` is now renamed to
-   ``celery.pool.TaskPool``.
+* ``celery.datastructures.TaskProcessQueue`` is now renamed to
+  ``celery.pool.TaskPool``.

- * The pool algorithm has been refactored for greater performance and
-   stability.
+* The pool algorithm has been refactored for greater performance and
+  stability.

0.2.0 [2009-05-20 05:14 P.M CET]
-------------------------------------------------
+--------------------------------

- * Final release of 0.2.0
+* Final release of 0.2.0

- * Compatible with carrot version 0.4.0.
+* Compatible with carrot version 0.4.0.

- * Fixes some syntax errors related to fetching results
-   from the database backend.
+* Fixes some syntax errors related to fetching results
+  from the database backend.

0.2.0-pre3 [2009-05-20 05:14 P.M CET]
-----------------------------------------------------
+-------------------------------------

- * *Internal release*. Improved handling of unpickled exceptions,
-   get_result() now tries to recreate something looking like the
-   original exception.
+* *Internal release*. Improved handling of unpickled exceptions;
+  ``get_result`` now tries to recreate something looking like the
+  original exception.

0.2.0-pre2 [2009-05-20 01:56 P.M CET]
-----------------------------------------------------
+-------------------------------------

- * Now handles unpickleable exceptions (like the dynimically generated
-   subclasses of ``django.core.exception.MultipleObjectsReturned``).
+* Now handles unpickleable exceptions (like the dynamically generated
+  subclasses of ``django.core.exception.MultipleObjectsReturned``).

0.2.0-pre1 [2009-05-20 12:33 P.M CET]
-----------------------------------------------------
+-------------------------------------

- * It's getting quite stable, with a lot of new features, so bump
-   version to 0.2. This is a pre-release.
+* It's getting quite stable, with a lot of new features, so bump
+  version to 0.2. This is a pre-release.

- * ``celery.task.mark_as_read()`` and ``celery.task.mark_as_failure()`` has
-   been removed. Use ``celery.backends.default_backend.mark_as_read()``,
-   and ``celery.backends.default_backend.mark_as_failure()`` instead.
+* ``celery.task.mark_as_read()`` and ``celery.task.mark_as_failure()`` have
+  been removed. Use ``celery.backends.default_backend.mark_as_read()``,
+  and ``celery.backends.default_backend.mark_as_failure()`` instead.

0.1.15 [2009-05-19 04:13 P.M CET]
-------------------------------------------------
+---------------------------------

- * The celery daemon was leaking AMQP connections, this should be fixed,
-   if you have any problems with too many files open (like ``emfile``
-   errors in ``rabbit.log``, please contact us!
+* The celery daemon was leaking AMQP connections; this should be fixed.
+  If you have any problems with too many files open (like ``emfile``
+  errors in ``rabbit.log``), please contact us!

0.1.14 [2009-05-19 01:08 P.M CET]
-------------------------------------------------
+---------------------------------

- * Fixed a syntax error in the ``TaskSet`` class. (No such variable
-   ``TimeOutError``).
+* Fixed a syntax error in the ``TaskSet`` class. (No such variable
+  ``TimeOutError``).

0.1.13 [2009-05-19 12:36 P.M CET]
-------------------------------------------------
+---------------------------------
+
+* Forgot to add ``yadayada`` to install requirements.

- * Forgot to add ``yadayada`` to install requirements.
+* Now deletes all expired task results, not just those marked as done.

- * Now deletes all expired task results, not just those marked as done.
+* Able to load the Tokyo Tyrant backend class without Django
+  configuration; you can specify Tyrant settings directly in the class
+  constructor.

- * Able to load the Tokyo Tyrant backend class without django
-   configuration, can specify tyrant settings directly in the class
-   constructor.
+* Improved API documentation

- * Improved API documentation
+* Now using the Sphinx documentation system, you can build
+  the HTML documentation by doing::

- * Now using the Sphinx documentation system, you can build
-   the html documentation by doing ::
+    $ cd docs
+    $ make html

-       $ cd docs
-       $ make html
-
-   and the result will be in ``docs/.build/html``.
+  and the result will be in ``docs/.build/html``.

0.1.12 [2009-05-18 04:38 P.M CET]
-------------------------------------------------
+---------------------------------

- * ``delay_task()`` etc. now returns ``celery.task.AsyncResult`` object,
-   which lets you check the result and any failure that might have
-   happened. It kind of works like the ``multiprocessing.AsyncResult``
-   class returned by ``multiprocessing.Pool.map_async``.
+* ``delay_task()`` etc. now returns a ``celery.task.AsyncResult`` object,
+  which lets you check the result and any failure that might have
+  happened. It kind of works like the ``multiprocessing.AsyncResult``
+  class returned by ``multiprocessing.Pool.map_async``.

- * Added dmap() and dmap_async(). This works like the
-   ``multiprocessing.Pool`` versions except they are tasks
-   distributed to the celery server. Example:
+* Added ``dmap()`` and ``dmap_async()``. These work like the
+  ``multiprocessing.Pool`` versions, except the tasks are
+  distributed to the celery server. Example:

-       >>> from celery.task import dmap
-       >>> import operator
-       >>> dmap(operator.add, [[2, 2], [4, 4], [8, 8]])
-       >>> [4, 8, 16]
+    >>> from celery.task import dmap
+    >>> import operator
+    >>> dmap(operator.add, [[2, 2], [4, 4], [8, 8]])
+    [4, 8, 16]

-       >>> from celery.task import dmap_async
-       >>> import operator
-       >>> result = dmap_async(operator.add, [[2, 2], [4, 4], [8, 8]])
-       >>> result.ready()
-       False
-       >>> time.sleep(1)
-       >>> result.ready()
-       True
-       >>> result.result
-       [4, 8, 16]
-
- * Refactored the task metadata cache and database backends, and added a new backend for Tokyo Tyrant. You can set the backend in your django settings file. e.g
-
-       CELERY_BACKEND = "database"; # Uses the database
-
-       CELERY_BACKEND = "cache"; # Uses the django cache framework
-
-       CELERY_BACKEND = "tyrant"; # Uses Tokyo Tyrant
-       TT_HOST = "localhost"; # Hostname for the Tokyo Tyrant server.
-       TT_PORT = 6657; # Port of the Tokyo Tyrant server.
+    >>> from celery.task import dmap_async
+    >>> import operator
+    >>> result = dmap_async(operator.add, [[2, 2], [4, 4], [8, 8]])
+    >>> result.ready()
+    False
+    >>> time.sleep(1)
+    >>> result.ready()
+    True
+    >>> result.result
+    [4, 8, 16]
+
+* Refactored the task metadata cache and database backends, and added
+  a new backend for Tokyo Tyrant. You can set the backend in your Django
+  settings file, e.g.::
+
+    CELERY_BACKEND = "database"; # Uses the database
+    CELERY_BACKEND = "cache"; # Uses the django cache framework
+    CELERY_BACKEND = "tyrant"; # Uses Tokyo Tyrant
+    TT_HOST = "localhost"; # Hostname for the Tokyo Tyrant server.
+    TT_PORT = 6657; # Port of the Tokyo Tyrant server.

0.1.11 [2009-05-12 02:08 P.M CET]
-------------------------------------------------
+---------------------------------

- * The logging system was leaking file descriptors, resulting in
-   servers stopping with the EMFILES (too many open files) error. (fixed)
+* The logging system was leaking file descriptors, resulting in
+  servers stopping with the EMFILES (too many open files) error. (fixed)

0.1.10 [2009-05-11 12:46 P.M CET]
-------------------------------------------------
+---------------------------------

- * Tasks now supports both positional arguments and keyword arguments.
+* Tasks now support both positional arguments and keyword arguments.

- * Requires carrot 0.3.8.
+* Requires carrot 0.3.8.

- * The daemon now tries to reconnect if the connection is lost.
+* The daemon now tries to reconnect if the connection is lost.

0.1.8 [2009-05-07 12:27 P.M CET]
-------------------------------------------------
+--------------------------------

- * Better test coverage
- * More documentation
- * celeryd doesn't emit ``Queue is empty`` message if
-   ``settings.CELERYD_EMPTY_MSG_EMIT_EVERY`` is 0.
+* Better test coverage
+* More documentation
+* celeryd doesn't emit the ``Queue is empty`` message if
+  ``settings.CELERYD_EMPTY_MSG_EMIT_EVERY`` is 0.

0.1.7 [2009-04-30 1:50 P.M CET]
------------------------------------------------
+-------------------------------

- * Added some unittests
+* Added some unittests

- * Can now use the database for task metadata (like if the task has
-   been executed or not). Set ``settings.CELERY_TASK_META``
+* Can now use the database for task metadata (e.g. whether the task has
+  been executed or not). Set ``settings.CELERY_TASK_META``

- * Can now run ``python setup.py test`` to run the unittests from
-   within the ``testproj`` project.
+* Can now run ``python setup.py test`` to run the unittests from
+  within the ``testproj`` project.

- * Can set the AMQP exchange/routing key/queue using
-   ``settings.CELERY_AMQP_EXCHANGE``, ``settings.CELERY_AMQP_ROUTING_KEY``,
-   and ``settings.CELERY_AMQP_CONSUMER_QUEUE``.
+* Can set the AMQP exchange/routing key/queue using
+  ``settings.CELERY_AMQP_EXCHANGE``, ``settings.CELERY_AMQP_ROUTING_KEY``,
+  and ``settings.CELERY_AMQP_CONSUMER_QUEUE``.

0.1.6 [2009-04-28 2:13 P.M CET]
------------------------------------------------
+-------------------------------

- * Introducing ``TaskSet``. A set of subtasks is executed and you can
-   find out how many, or if all them, are done (excellent for progress bars and such)
+* Introducing ``TaskSet``. A set of subtasks is executed and you can
+  find out how many, or if all of them, are done (excellent for progress
+  bars and such)

- * Now catches all exceptions when running ``Task.__call__``, so the
-   daemon doesn't die. This does't happen for pure functions yet, only
-   ``Task`` classes.
+* Now catches all exceptions when running ``Task.__call__``, so the
+  daemon doesn't die. This doesn't happen for pure functions yet, only
+  ``Task`` classes.

- * ``autodiscover()`` now works with zipped eggs.
+* ``autodiscover()`` now works with zipped eggs.

- * celeryd: Now adds curernt working directory to ``sys.path`` for
-   convenience.
+* celeryd: Now adds the current working directory to ``sys.path`` for
+  convenience.

- * The ``run_every`` attribute of ``PeriodicTask`` classes can now be a
-   ``datetime.timedelta()`` object.
+* The ``run_every`` attribute of ``PeriodicTask`` classes can now be a
+  ``datetime.timedelta()`` object.

- * celeryd: You can now set the ``DJANGO_PROJECT_DIR`` variable
-   for ``celeryd`` and it will add that to ``sys.path`` for easy launching.
+* celeryd: You can now set the ``DJANGO_PROJECT_DIR`` variable
+  for ``celeryd`` and it will add that to ``sys.path`` for easy launching.

- * Can now check if a task has been executed or not via HTTP.
+* Can now check if a task has been executed or not via HTTP.

-   You can do this by including the celery ``urls.py`` into your project,
+* You can do this by including the celery ``urls.py`` into your project,

    >>> url(r'^celery/$', include("celery.urls"))

-   then visiting the following url,::
+  then visiting the following URL::

    http://mysite/celery/$task_id/done/

-   this will return a JSON dictionary like e.g:
+  this will return a JSON dictionary, e.g.:

    >>> {"task": {"id": $task_id, "executed": true}}

- * ``delay_task`` now returns string id, not ``uuid.UUID`` instance.
+* ``delay_task`` now returns a string id, not a ``uuid.UUID`` instance.

- * Now has ``PeriodicTasks``, to have ``cron`` like functionality.
+* Now has ``PeriodicTasks``, to have ``cron``-like functionality.

- * Project changed name from ``crunchy`` to ``celery``. The details of
-   the name change request is in ``docs/name_change_request.txt``.
+* Project changed name from ``crunchy`` to ``celery``. The details of
+  the name change request are in ``docs/name_change_request.txt``.

0.1.0 [2009-04-24 11:28 A.M CET]
-------------------------------------------------
+--------------------------------

- * Initial release
+* Initial release