IMPORTANT CHANGES
Fixed a bug where tasks raising unpickleable exceptions crashed the pool workers. So if you’ve had pool workers mysteriously disappearing, or problems with celeryd stopping working, this has been fixed in this version.
Fixed a race condition with periodic tasks.
The worker pool is now supervised, so if a pool worker crashes, goes away or stops responding, it is automatically replaced with a new one.
"djangotwitter.tasks.UpdateStatusesTask". Very convenient. No idea why we didn’t do this before. Some documentation is updated to not manually specify a task name.
NEWS
Tested with Django 1.1
New Tutorial: Creating a click counter using carrot and celery
Database entries for periodic tasks are now created at celeryd startup instead of for each check (which has been a forgotten TODO/XXX in the code for a long time).
New settings variable: CELERY_TASK_RESULT_EXPIRES. Time (in seconds, or a datetime.timedelta object) after which stored task results are deleted. For the moment this only works for the database backend.
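In settings.py this might look like the following sketch (the one-day value is only an example):

from datetime import timedelta
CELERY_TASK_RESULT_EXPIRES = timedelta(days=1)   # or a plain number of seconds, e.g. 86400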
celeryd now emits a debug log message telling which periodic tasks have been launched.
The periodic task table is now locked for reading while getting periodic task status. (MySQL only so far, seeking patches for other engines)
A lot more debugging information is now available by turning on the DEBUG loglevel (--loglevel=DEBUG).
Functions/methods with a timeout argument now work correctly.
New: celery.strategy.even_time_distribution: with an iterator yielding task args, kwargs tuples, evenly distribute the processing of its tasks throughout the time window available.
Log message Unknown task ignored... now has loglevel ERROR
The log message emitted when a task is received is now emitted for all tasks, even if the task has an ETA (estimated time of arrival). Also, the message now includes the ETA for the task (if any).
Acknowledgement now happens in the pool callback. It can’t be done in the job target, as it’s not pickleable (can’t share AMQP connection, etc.).
Added note about .delay hanging in README
Tests now passing in Django 1.1
Fixed discovery to make sure app is in INSTALLED_APPS
Previously overridden pool behaviors (process reaping, waiting until a pool worker is available, etc.) are now handled by multiprocessing.Pool itself.
Convert statistics data to unicode for use as kwargs. Thanks Lucy!
New arguments to apply_async (the advanced version of delay_task): countdown and eta:
>>> # Run 10 seconds into the future.
>>> res = apply_async(MyTask, countdown=10)

>>> # Run 1 day from now.
>>> res = apply_async(MyTask, eta=datetime.now() + timedelta(days=1))
Now unlinks the pidfile if it’s stale.
Lots more tests.
Now compatible with carrot >= 0.5.0.
IMPORTANT The subtask_ids attribute on the TaskSetResult instance has been removed. To get this information instead use:
>>> subtask_ids = [subtask.task_id for subtask in ts_res.subtasks]
TaskSet.run() now respects extra message options from the task class.
Task: Added attribute ignore_result: don’t store the status and return value. This means you can’t use the celery.result.AsyncResult to check if the task is done, or get its return value. Only use this if you need the performance and can live without these features. Any exceptions raised will store the return value/status as usual.
Task: Added attribute disable_error_emails to disable sending error e-mails for that task.
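A sketch showing both attributes on one (hypothetical) task class:

>>> from celery.task import Task
>>> class FireAndForgetTask(Task):
...     ignore_result = True           # don't store status or return value
...     disable_error_emails = True    # don't send error e-mails for this task
...
...     def run(self, **kwargs):
...         pass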
Should now work on Windows (although running in the background won’t work, so using the --detach argument results in an exception being raised).
Added support for statistics for profiling and monitoring. To start sending statistics, start celeryd with the --statistics option. Then after a while you can dump the results by running python manage.py celerystats. See celery.monitoring for more information.
The celery daemon can now be supervised (i.e. it is automatically restarted if it crashes). To use this, start celeryd with the --supervised option (or alternatively -S).
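For example, using the options exactly as described in the two entries above:

$ celeryd --statistics           # start celeryd, collecting statistics
$ python manage.py celerystats   # later: dump the collected results
$ celeryd --supervised           # or: run under the supervisor (same as -S)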
views.apply: View applying a task. Example:
http://e.com/celery/apply/task_name/arg1/arg2//?kwarg1=a&kwarg2=b
NOTE Use with caution! Do not make this publicly accessible without first ensuring your code is safe!
Refactored celery.task. It’s now split into three modules (a short import sketch follows the list):
celery.task
Contains apply_async, delay_task, discard_all, and task shortcuts, plus imports objects from celery.task.base and celery.task.builtins
celery.task.base
Contains task base classes: Task, PeriodicTask, TaskSet, AsynchronousMapTask, ExecuteRemoteTask.
celery.task.builtins
Built-in tasks: PingTask, DeleteExpiredTaskMetaTask.
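A quick import sketch of the new layout, using only the names listed above:

>>> from celery.task import apply_async, delay_task, discard_all
>>> from celery.task.base import Task, PeriodicTask, TaskSet
>>> from celery.task.builtins import PingTask, DeleteExpiredTaskMetaTask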
IMPORTANT Now uses AMQP’s basic.consume instead of basic.get. This means we’re no longer polling the broker for new messages.
IMPORTANT Default concurrency limit is now set to the number of CPUs available on the system.
IMPORTANT tasks.register: Renamed task_name argument to name, so
>>> tasks.register(func, task_name="mytask")
has to be replaced with:
>>> tasks.register(func, name="mytask")
The daemon now correctly runs if the pidlock is stale.
Now compatible with carrot 0.4.5
Default AMQP connection timeout is now 4 seconds.
AsyncResult.read() was always returning True.
Only use README as long_description if the file exists so easy_install doesn’t break.
celery.view: JSON responses now properly set their MIME type.
apply_async now has a connection keyword argument so you can re-use the same AMQP connection if you want to execute more than one task.
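A sketch of reusing one connection for several tasks. MyTask and OtherTask are hypothetical, and the DjangoAMQPConnection import is an assumption based on the carrot versions this release depends on:

>>> from carrot.connection import DjangoAMQPConnection
>>> conn = DjangoAMQPConnection()
>>> apply_async(MyTask, connection=conn)     # first task
>>> apply_async(OtherTask, connection=conn)  # re-uses the same AMQP connection
>>> conn.close()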
Handle failures in task_status view such that it won’t throw 500s.
Fixed typo AMQP_SERVER in documentation to AMQP_HOST.
Worker exception e-mails sent to admins now work properly.
No longer depends on django, so installing celery won’t affect the preferred Django version installed.
Now works with PostgreSQL (psycopg2) again by registering the PickledObject field.
celeryd: Added --detach option as an alias to --daemon, and it’s the term used in the documentation from now on.
Make sure the pool and periodic task worker thread is terminated properly at exit. (So Ctrl-C works again).
Now depends on python-daemon.
Removed dependency on simplejson.
Cache Backend: Re-establishes connection for every task process if the Django cache backend is memcached/libmemcached.
Tyrant Backend: Now re-establishes the connection for every task executed.
The PeriodicWorkController now sleeps for 1 second between checking for periodic tasks to execute.
NOTE This is a development version, for the stable release, please see versions 0.2.x.
VERY IMPORTANT: Pickle is now the encoder used for serializing task arguments, so be sure to flush your task queue before you upgrade.
IMPORTANT TaskSet.run() now returns a celery.result.TaskSetResult instance, which lets you inspect the status and return values of a taskset as if it were a single entity.
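A sketch of the idea; the constructor style (a task plus a list of (args, kwargs) tuples) is an assumption about this era’s API, and MyTask is hypothetical:

>>> from celery.task import TaskSet
>>> ts = TaskSet(MyTask, args=[([2, 2], {}), ([4, 4], {})])
>>> ts_res = ts.run()   # a celery.result.TaskSetResult instance
>>> ts_res.ready()      # True once every subtask has finished
>>> ts_res.subtasks     # the individual subtask results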
IMPORTANT Celery now depends on carrot >= 0.4.1.
The celery daemon now sends task errors to the registered admin e-mails. To turn off this feature, set SEND_CELERY_TASK_ERROR_EMAILS to False in your settings.py. Thanks to Grégoire Cachet.
You can now run the celery daemon by using manage.py:
$ python manage.py celeryd
Thanks to Grégoire Cachet.
Added support for message priorities, topic exchanges, custom routing keys for tasks. This means we have introduced celery.task.apply_async, a new way of executing tasks.
You can use celery.task.delay and celery.Task.delay like usual, but if you want greater control over the message sent, you want celery.task.apply_async and celery.Task.apply_async.
This also means the AMQP configuration has changed. Some settings have been renamed, while others are new (see the example below):
CELERY_AMQP_EXCHANGE
CELERY_AMQP_PUBLISHER_ROUTING_KEY
CELERY_AMQP_CONSUMER_ROUTING_KEY
CELERY_AMQP_CONSUMER_QUEUE
CELERY_AMQP_EXCHANGE_TYPE
See the entry Can I send some tasks to only some servers? in the FAQ for more information.
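As a sketch, these could appear in settings.py like this; the setting names are from the list above, while the values are illustrative placeholders only:

CELERY_AMQP_EXCHANGE = "tasks"
CELERY_AMQP_EXCHANGE_TYPE = "topic"
CELERY_AMQP_PUBLISHER_ROUTING_KEY = "task.regular"
CELERY_AMQP_CONSUMER_ROUTING_KEY = "task.#"
CELERY_AMQP_CONSUMER_QUEUE = "celery"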
Forgot to add yadayada to install requirements.
Now deletes all expired task results, not just those marked as done.
Able to load the Tokyo Tyrant backend class without Django configuration; Tyrant settings can be specified directly in the class constructor.
Improved API documentation
Now using the Sphinx documentation system. You can build the HTML documentation by running:
$ cd docs
$ make html
and the result will be in docs/.build/html.
delay_task() etc. now returns a celery.task.AsyncResult object, which lets you check the result and any failure that might have happened. It kind of works like the multiprocessing.AsyncResult class returned by multiprocessing.Pool.map_async.
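A sketch of what the returned object lets you do; the task name is hypothetical, and the methods shown are the same ones used in the dmap example below:

>>> from celery.task import delay_task
>>> result = delay_task("mytask", 4, 4)
>>> result.ready()   # has the task finished?
False
>>> result.result    # the return value, once it has finished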
Added dmap() and dmap_async(). This works like the multiprocessing.Pool versions except they are tasks distributed to the celery server. Example:
>>> from celery.task import dmap
>>> import operator
>>> dmap(operator.add, [[2, 2], [4, 4], [8, 8]])
[4, 8, 16]

>>> from celery.task import dmap_async
>>> import operator
>>> import time
>>> result = dmap_async(operator.add, [[2, 2], [4, 4], [8, 8]])
>>> result.ready()
False
>>> time.sleep(1)
>>> result.ready()
True
>>> result.result
[4, 8, 16]
Refactored the task metadata cache and database backends, and added a new backend for Tokyo Tyrant. You can set the backend in your Django settings file, e.g.:
CELERY_BACKEND = "database"   # Uses the database
CELERY_BACKEND = "cache"      # Uses the django cache framework
CELERY_BACKEND = "tyrant"     # Uses Tokyo Tyrant

TT_HOST = "localhost"   # Hostname for the Tokyo Tyrant server.
TT_PORT = 6657          # Port of the Tokyo Tyrant server.
Introducing TaskSet. A set of subtasks is executed, and you can find out how many, or if all of them, are done (excellent for progress bars and such).
Now catches all exceptions when running Task.__call__, so the daemon doesn’t die. This doesn’t happen for pure functions yet, only Task classes.
autodiscover() now works with zipped eggs.
celeryd: Now adds the current working directory to sys.path for convenience.
The run_every attribute of PeriodicTask classes can now be a datetime.timedelta() object.
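A sketch with a made-up interval; the explicit name attribute reflects that automatic task names only arrived later (see the top of this document):

>>> from datetime import timedelta
>>> from celery.task import PeriodicTask
>>> class MyPeriodicTask(PeriodicTask):
...     name = "myapp.my_periodic_task"    # names were still manual here
...     run_every = timedelta(seconds=30)  # instead of a plain number
...
...     def run(self, **kwargs):
...         pass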
celeryd: You can now set the DJANGO_PROJECT_DIR variable for celeryd and it will add that to sys.path for easy launching.
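For example, assuming the variable is read from the shell environment (the path is a placeholder):

$ DJANGO_PROJECT_DIR=/path/to/myproject celeryd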
Can now check if a task has been executed or not via HTTP.
You can do this by including the celery urls.py into your project,
>>> url(r'^celery/', include("celery.urls"))
then visiting the following URL:
http://mysite/celery/$task_id/done/
This will return a JSON dictionary, e.g.:
>>> {"task": {"id": $task_id, "executed": true}}
delay_task now returns a string id, not a uuid.UUID instance.
Now has PeriodicTasks, to have cron-like functionality.
Project changed name from crunchy to celery. The details of the name change request are in docs/name_change_request.txt.