Change history

0.6.0 [2009-08-07 06:54 A.M CET]

IMPORTANT CHANGES

  • Fixed a bug where tasks raising unpickleable exceptions crashed pool workers. So if you’ve had pool workers mysteriously disappearing, or problems with celeryd stopping working, this has been fixed in this version.

  • Fixed a race condition with periodic tasks.

  • The task pool is now supervised, so if a pool worker crashes, goes away or stops responding, it is automatically replaced with a new one.

  • Task.name is now automatically generated out of class module+name, e.g. "djangotwitter.tasks.UpdateStatusesTask". Very convenient. No idea why we didn’t do this before. Some documentation is updated to not manually specify a task name.

NEWS

  • Tested with Django 1.1

  • New Tutorial: Creating a click counter using carrot and celery

  • Database entries for periodic tasks are now created at celeryd startup instead of for each check (which had been a forgotten TODO/XXX in the code for a long time).

  • New settings variable: CELERY_TASK_RESULT_EXPIRES

    Time (in seconds, or a datetime.timedelta object) after which stored task results are deleted. For the moment this only works for the database backend.
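
    For example, a hypothetical settings.py snippet (the values below are illustrative, not defaults):

    CELERY_TASK_RESULT_EXPIRES = 5 * 60   # five minutes, in seconds

    # or, equivalently, using a timedelta:
    from datetime import timedelta
    CELERY_TASK_RESULT_EXPIRES = timedelta(minutes=5)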

  • celeryd now emits a debug log message for which periodic tasks have been launched.

  • The periodic task table is now locked for reading while getting periodic task status. (MySQL only so far, seeking patches for other engines.)

  • A lot more debugging information is now available by turning on the DEBUG loglevel (--loglevel=DEBUG).

  • Functions/methods with a timeout argument now work correctly.

  • New: celery.strategy.even_time_distribution: With an iterator yielding task (args, kwargs) tuples, evenly distributes the processing of its tasks throughout the available time window.

  • Log message "Unknown task ignored..." now has loglevel ERROR.

  • Log message "Got task from broker" is now emitted for all tasks, even if

    the task has an ETA (estimated time of arrival). Also the message now includes the ETA for the task (if any).

  • Acknowledgement now happens in the pool callback. Can’t do ack in the job target, as it’s not pickleable (can’t share AMQP connection, etc.).

  • Added a note to the README about .delay hanging.

  • Tests now passing in Django 1.1

  • Fixed discovery to make sure app is in INSTALLED_APPS

  • Previously overridden pool behaviour (process reaping, waiting until a pool worker is available, etc.) is now handled by multiprocessing.Pool itself.

  • Convert statistics data to unicode for use as kwargs. Thanks Lucy!

0.4.1 [2009-07-02 01:42 P.M CET]

  • Fixed a bug with parsing the message options (mandatory, routing_key, priority, immediate)

0.4.0 [2009-07-01 07:29 P.M CET]

  • Adds eager execution. celery.execute.apply and Task.apply execute the function blocking until the task is done; for API compatibility it returns a celery.result.EagerResult instance. You can configure celery to always run tasks locally by setting the CELERY_ALWAYS_EAGER setting to True (see the sketch after this list).
  • Now depends on anyjson.
  • 99% coverage using python coverage 3.0.
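
  A minimal sketch of the eager execution described in the first item above; the task class (MyTask, assumed here to add two numbers) and its arguments are hypothetical, and the result attribute is assumed to mirror celery.result.AsyncResult:

    >>> from celery.execute import apply
    >>> result = apply(MyTask, args=[2, 2])   # runs locally, blocking until the task is done
    >>> result.result                         # assumption: EagerResult exposes .result like AsyncResult
    4

  To always run tasks locally, a settings.py entry like this would do it:

    CELERY_ALWAYS_EAGER = True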

0.3.20 [2009-06-25 08:42 P.M CET]

  • New arguments to apply_async (the advanced version of delay_task): countdown and eta:

    >>> # MyTask below is a placeholder for any registered task class.
    >>> from datetime import datetime, timedelta
    >>> from celery.task import apply_async

    >>> # Run 10 seconds into the future.
    >>> res = apply_async(MyTask, countdown=10)

    >>> # Run 1 day from now.
    >>> res = apply_async(MyTask,
    ...                   eta=datetime.now() + timedelta(days=1))
    
  • Now unlinks the pidfile if it’s stale.

  • Lots more tests.

  • Now compatible with carrot >= 0.5.0.

  • IMPORTANT The subtask_ids attribute on the TaskSetResult instance has been removed. To get this information instead use:

    >>> subtask_ids = [subtask.task_id for subtask in ts_res.subtasks]
    
  • TaskSet.run() now respects extra message options from the task class.

  • Task: Add attribute ignore_result: Don’t store the status and return value. This means you can’t use the celery.result.AsyncResult to check if the task is done, or get its return value. Only use this if you need the performance and are able to live without these features. Any exceptions raised will store the return value/status as usual.
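
    For example, a sketch of a task class opting out of result storage (the class name and body are hypothetical):

    >>> from celery.task import Task

    >>> class CleanupTask(Task):
    ...     ignore_result = True   # don't store status or return value
    ...
    ...     def run(self, **kwargs):
    ...         pass   # do the work; the return value won't be stored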

  • Task: Add attribute disable_error_emails to disable sending error emails for that task.

  • Should now work on Windows (although running in the background won’t work, so using the --detach argument results in an exception being raised.)

  • Added support for statistics for profiling and monitoring. To start sending statistics start celeryd with the --statistics option. Then after a while you can dump the results by running python manage.py celerystats. See celery.monitoring for more information.

  • The celery daemon can now be supervised (i.e. it is automatically restarted if it crashes). To use this, start celeryd with the --supervised option (or alternatively -S).
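
    For example, using the options described in the two items above (exact command lines assumed from those descriptions):

    $ python manage.py celeryd --statistics   # start celeryd and collect statistics
    $ python manage.py celerystats            # dump the collected statistics
    $ python manage.py celeryd --supervised   # restart celeryd automatically if it crashes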

  • views.apply: View applying a task. Example:

    http://e.com/celery/apply/task_name/arg1/arg2//?kwarg1=a&kwarg2=b

    NOTE Use with caution; do not make this publicly accessible without ensuring your code is safe!

  • Refactored celery.task. It’s now split into three modules:

    • celery.task

      Contains apply_async, delay_task, discard_all, and task shortcuts, plus imports objects from celery.task.base and celery.task.builtins

    • celery.task.base

      Contains task base classes: Task, PeriodicTask, TaskSet, AsynchronousMapTask, ExecuteRemoteTask.

    • celery.task.builtins

      Built-in tasks: PingTask, DeleteExpiredTaskMetaTask.

0.3.7 [2009-06-16 11:41 P.M CET]

  • IMPORTANT Now uses AMQP’s basic.consume instead of basic.get. This means we’re no longer polling the broker for new messages.

  • IMPORTANT Default concurrency limit is now set to the number of CPUs available on the system.

  • IMPORTANT tasks.register: Renamed task_name argument to name, so

    >>> tasks.register(func, task_name="mytask")
    

    has to be replaced with:

    >>> tasks.register(func, name="mytask")
    
  • The daemon now correctly runs if the pidlock is stale.

  • Now compatible with carrot 0.4.5

  • Default AMQP connection timeout is now 4 seconds.

  • AsyncResult.read() was always returning True.

  • Only use README as long_description if the file exists so easy_install doesn’t break.

  • celery.view: JSON responses now properly set their mime-type.

  • apply_async now has a connection keyword argument so you can re-use the same AMQP connection if you want to execute more than one task.
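
    For example (a sketch; MyTask and the connection object are placeholders for a registered task class and an already-established broker connection):

    >>> from celery.task import apply_async
    >>> conn = ...   # an open AMQP connection, created however you normally create one
    >>> apply_async(MyTask, args=[1], connection=conn)
    >>> apply_async(MyTask, args=[2], connection=conn)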

  • Handle failures in task_status view such that it won’t throw 500s.

  • Fixed typo AMQP_SERVER in documentation to AMQP_HOST.

  • Worker exception e-mails sent to admins now work properly.

  • No longer depends on django, so installing celery won’t affect the preferred Django version installed.

  • Now works with PostgreSQL (psycopg2) again by registering the PickledObject field.

  • celeryd: Added --detach option as an alias to --daemon, and it’s the term used in the documentation from now on.

  • Make sure the pool and the periodic task worker thread are terminated properly at exit (so Ctrl-C works again).

  • Now depends on python-daemon.

  • Removed dependency on simplejson.

  • Cache Backend: Re-establishes connection for every task process if the Django cache backend is memcached/libmemcached.

  • Tyrant Backend: Now re-establishes the connection for every task executed.

0.3.3 [2009-06-08 01:07 P.M CET]

  • The PeriodicWorkController now sleeps for 1 second between checking for periodic tasks to execute.

0.3.2 [2009-06-08 01:07 P.M CET]

  • celeryd: Added option --discard: Discard (delete!) all waiting messages in the queue.
  • celeryd: The --wakeup-after option was not handled as a float.

0.3.1 [2009-06-08 01:07 P.M CET]

  • The PeriodicTask worker is now running in its own thread instead of blocking the TaskController loop.
  • Default QUEUE_WAKEUP_AFTER has been lowered to 0.1 (was 0.3)

0.3.0 [2009-06-08 12:41 P.M CET]

NOTE This is a development version; for the stable release, please see versions 0.2.x.

VERY IMPORTANT: Pickle is now the encoder used for serializing task arguments, so be sure to flush your task queue before you upgrade.

  • IMPORTANT TaskSet.run() now returns a celery.result.TaskSetResult instance, which lets you inspect the status and return values of a taskset as if it were a single entity.

  • IMPORTANT Celery now depends on carrot >= 0.4.1.

  • The celery daemon now sends task errors to the registered admin e-mails. To turn off this feature, set SEND_CELERY_TASK_ERROR_EMAILS to False in your settings.py. Thanks to Grégoire Cachet.
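
    For example, to turn the error e-mails off, add this to your settings.py:

    SEND_CELERY_TASK_ERROR_EMAILS = False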

  • You can now run the celery daemon by using manage.py:

    $ python manage.py celeryd

    Thanks to Grégoire Cachet.

  • Added support for message priorities, topic exchanges, and custom routing keys for tasks. This means we have introduced celery.task.apply_async, a new way of executing tasks.

    You can use celery.task.delay and celery.Task.delay like usual, but if you want greater control over the message sent, you want celery.task.apply_async and celery.Task.apply_async.

    This also means the AMQP configuration has changed. Some settings have been renamed, while others are new:

    CELERY_AMQP_EXCHANGE
    CELERY_AMQP_PUBLISHER_ROUTING_KEY
    CELERY_AMQP_CONSUMER_ROUTING_KEY
    CELERY_AMQP_CONSUMER_QUEUE
    CELERY_AMQP_EXCHANGE_TYPE
    

    See the entry Can I send some tasks to only some servers? in the FAQ for more information.

  • Task errors are now logged using loglevel ERROR instead of INFO, and backtraces are dumped. Thanks to Grégoire Cachet.
  • Make every new worker process re-establish its Django DB connection, solving the “MySQL connection died?” exceptions. Thanks to Vitaly Babiy and Jirka Vejrazka.
  • IMPORTANT Now using pickle to encode task arguments. This means you can now pass complex Python objects to tasks as arguments.
  • Removed dependency on yadayada.
  • Added a FAQ, see docs/faq.rst.
  • Now converts any unicode keys in task kwargs to regular strings. Thanks Vitaly Babiy.
  • Renamed the TaskDaemon to WorkController.
  • celery.datastructures.TaskProcessQueue is now renamed to celery.pool.TaskPool.
  • The pool algorithm has been refactored for greater performance and stability.

0.2.0 [2009-05-20 05:14 P.M CET]

  • Final release of 0.2.0
  • Compatible with carrot version 0.4.0.
  • Fixes some syntax errors related to fetching results from the database backend.

0.2.0-pre3 [2009-05-20 05:14 P.M CET]

  • Internal release. Improved handling of unpickleable exceptions; get_result now tries to recreate something looking like the original exception.

0.2.0-pre2 [2009-05-20 01:56 P.M CET]

  • Now handles unpickleable exceptions (like the dynamically generated subclasses of django.core.exceptions.MultipleObjectsReturned).

0.2.0-pre1 [2009-05-20 12:33 P.M CET]

  • It’s getting quite stable, with a lot of new features, so bump version to 0.2. This is a pre-release.
  • celery.task.mark_as_read() and celery.task.mark_as_failure() have been removed. Use celery.backends.default_backend.mark_as_read() and celery.backends.default_backend.mark_as_failure() instead.

0.1.15 [2009-05-19 04:13 P.M CET]

  • The celery daemon was leaking AMQP connections; this should be fixed. If you have any problems with too many open files (like EMFILE errors in rabbit.log), please contact us!

0.1.14 [2009-05-19 01:08 P.M CET]

  • Fixed a syntax error in the TaskSet class. (No such variable TimeOutError).

0.1.13 [2009-05-19 12:36 P.M CET]

  • Forgot to add yadayada to install requirements.

  • Now deletes all expired task results, not just those marked as done.

  • Able to load the Tokyo Tyrant backend class without Django configuration; Tyrant settings can be specified directly in the class constructor.

  • Improved API documentation

  • Now using the Sphinx documentation system; you can build the HTML documentation by running:

    $ cd docs
    $ make html

    and the result will be in docs/.build/html.

0.1.12 [2009-05-18 04:38 P.M CET]

  • delay_task() etc. now returns a celery.task.AsyncResult object, which lets you check the result and any failure that might have happened. It kind of works like the multiprocessing.AsyncResult class returned by multiprocessing.Pool.map_async.

  • Added dmap() and dmap_async(). These work like the multiprocessing.Pool versions except they are distributed as tasks to the celery server. Example:

    >>> from celery.task import dmap
    >>> import operator
    >>> dmap(operator.add, [[2, 2], [4, 4], [8, 8]])
    [4, 8, 16]

    >>> import time
    >>> from celery.task import dmap_async
    >>> import operator
    >>> result = dmap_async(operator.add, [[2, 2], [4, 4], [8, 8]])
    >>> result.ready()
    False
    >>> time.sleep(1)
    >>> result.ready()
    True
    >>> result.result
    [4, 8, 16]
    
  • Refactored the task metadata cache and database backends, and added a new backend for Tokyo Tyrant. You can set the backend in your Django settings file, e.g.:

    CELERY_BACKEND = "database"; # Uses the database
    CELERY_BACKEND = "cache"; # Uses the django cache framework
    CELERY_BACKEND = "tyrant"; # Uses Tokyo Tyrant
    TT_HOST = "localhost"; # Hostname for the Tokyo Tyrant server.
    TT_PORT = 6657; # Port of the Tokyo Tyrant server.
    

0.1.11 [2009-05-12 02:08 P.M CET]

  • The logging system was leaking file descriptors, resulting in servers stopping with the EMFILE (too many open files) error. (fixed)

0.1.10 [2009-05-11 12:46 P.M CET]

  • Tasks now supports both positional arguments and keyword arguments.
  • Requires carrot 0.3.8.
  • The daemon now tries to reconnect if the connection is lost.

0.1.8 [2009-05-07 12:27 P.M CET]

  • Better test coverage
  • More documentation
  • celeryd doesn’t emit Queue is empty message if settings.CELERYD_EMPTY_MSG_EMIT_EVERY is 0.

0.1.7 [2009-04-30 1:50 P.M CET]

  • Added some unittests
  • Can now use the database for task metadata (like if the task has been executed or not). Set settings.CELERY_TASK_META.
  • Can now run python setup.py test to run the unittests from within the testproj project.
  • Can set the AMQP exchange/routing key/queue using settings.CELERY_AMQP_EXCHANGE, settings.CELERY_AMQP_ROUTING_KEY, and settings.CELERY_AMQP_CONSUMER_QUEUE.
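
    For example, in settings.py (the values below are illustrative placeholders, not documented defaults):

    CELERY_AMQP_EXCHANGE = "tasks"
    CELERY_AMQP_ROUTING_KEY = "task"
    CELERY_AMQP_CONSUMER_QUEUE = "task_queue"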

0.1.6 [2009-04-28 2:13 P.M CET]

  • Introducing TaskSet. A set of subtasks is executed and you can find out how many, or if all of them, are done (excellent for progress bars and the like).

  • Now catches all exceptions when running Task.__call__, so the daemon doesn’t die. This doesn’t happen for pure functions yet, only Task classes.

  • autodiscover() now works with zipped eggs.

  • celeryd: Now adds the current working directory to sys.path for convenience.

  • The run_every attribute of PeriodicTask classes can now be a datetime.timedelta() object.
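
    For example, a sketch of a periodic task with a timedelta interval (the class name, interval, and import path are assumptions for illustration):

    >>> from datetime import timedelta
    >>> from celery.task import PeriodicTask

    >>> class HeartbeatTask(PeriodicTask):
    ...     run_every = timedelta(seconds=30)
    ...
    ...     def run(self, **kwargs):
    ...         pass   # work to perform every 30 seconds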

  • celeryd: You can now set the DJANGO_PROJECT_DIR variable for celeryd and it will add that to sys.path for easy launching.

  • Can now check if a task has been executed or not via HTTP.

  • You can do this by including the celery urls.py into your project,

    >>> url(r'^celery/', include("celery.urls"))
    

    then visiting the following URL:

    http://mysite/celery/$task_id/done/

    this will return a JSON dictionary, e.g.:

    {"task": {"id": $task_id, "executed": true}}

  • delay_task now returns a string id, not a uuid.UUID instance.

  • Now has PeriodicTasks, providing cron-like functionality.

  • Project changed name from crunchy to celery. The details of the name change request are in docs/name_change_request.txt.

0.1.0 [2009-04-24 11:28 A.M CET]

  • Initial release