==============
Change history
==============
 
0.1.12 :date:`2009-05-18 04:38 P.M CET` :author:askh@opera.com
----------------------------------------------------------------
 
    * delay_task() etc. now returns a :class:`celery.task.AsyncResult` object,
      which lets you check the result and any failure that might have happened.
      It kind of works like the ``multiprocessing.AsyncResult`` class returned
      by ``multiprocessing.Pool.map_async``.
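
      A minimal sketch of what this looks like (the task name ``"tasks.add"``
      and its arguments are hypothetical; ``ready()`` and ``result`` are used
      the same way as in the ``dmap_async`` example below):

        >>> from celery.task import delay_task
        >>> result = delay_task("tasks.add", x=4, y=4)
        >>> result.ready()
        False
        >>> result.ready()  # ...later, once the task has been processed
        True
        >>> result.result
        8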
 
    * Added dmap() and dmap_async(). These work like the
      ``multiprocessing.Pool`` versions, except the tasks are distributed
      to the celery server. Example:
 
        >>> from celery.task import dmap
        >>> import operator
        >>> dmap(operator.add, [[2, 2], [4, 4], [8, 8]])
        [4, 8, 16]

        >>> from celery.task import dmap_async
        >>> import operator
        >>> import time
        >>> result = dmap_async(operator.add, [[2, 2], [4, 4], [8, 8]])
        >>> result.ready()
        False
        >>> time.sleep(1)
        >>> result.ready()
        True
        >>> result.result
        [4, 8, 16]
 
    * Refactored the task metadata cache and database backends, and added
      a new backend for Tokyo Tyrant. You can set the backend in your Django
      settings file, e.g.::
 
        CELERY_BACKEND = "database"  # Uses the database
        CELERY_BACKEND = "cache"     # Uses the Django cache framework
        CELERY_BACKEND = "tyrant"    # Uses Tokyo Tyrant

        TT_HOST = "localhost"  # Hostname for the Tokyo Tyrant server.
        TT_PORT = 6657         # Port of the Tokyo Tyrant server.
 
0.1.11 :date:`2009-05-12 02:08 P.M CET` :author:askh@opera.com
----------------------------------------------------------------
 
    * The logging system was leaking file descriptors, resulting in servers
      stopping with the EMFILE (too many open files) error. (fixed)
 
0.1.10 :date:`2009-05-11 12:46 P.M CET` :author:askh@opera.com
----------------------------------------------------------------
 
    * Tasks now support both positional arguments and keyword arguments.
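
      A small illustration of the idea (the ``"tasks.add"`` task name is
      hypothetical, and it is assumed here that ``delay_task`` simply forwards
      any extra positional and keyword arguments to the task):

        >>> from celery.task import delay_task
        >>> r1 = delay_task("tasks.add", 2, 2)      # positional arguments
        >>> r2 = delay_task("tasks.add", x=2, y=2)  # keyword arguments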
 
    * Requires carrot 0.3.8.

    * The daemon now tries to reconnect if the connection is lost.
 
0.1.8 :date:`2009-05-07 12:27 P.M CET` :author:askh@opera.com
----------------------------------------------------------------
 
    * Better test coverage.

    * More documentation.

    * celeryd doesn't emit the "Queue is empty" message if
      :setting:`CELERYD_EMPTY_MSG_EMIT_EVERY` is 0.
 
0.1.7 :date:`2009-04-30 1:50 P.M CET` :author:askh@opera.com
----------------------------------------------------------------
 
    * Added some unit tests.

    * Can now use the database for task metadata (e.g., whether the task
      has been executed or not). Set :setting:`CELERY_TASK_META`.

    * Can now run ``python setup.py test`` to run the unit tests from within
      the ``testproj`` project.
 
    * Can set the AMQP exchange/routing key/queue using
      :setting:`CELERY_AMQP_EXCHANGE`, :setting:`CELERY_AMQP_ROUTING_KEY`,
      and :setting:`CELERY_AMQP_CONSUMER_QUEUE`.
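
      For example, in the Django settings file (the values below are
      illustrative placeholders; only the setting names come from this
      release)::

        CELERY_AMQP_EXCHANGE = "tasks"
        CELERY_AMQP_ROUTING_KEY = "task"
        CELERY_AMQP_CONSUMER_QUEUE = "task_queue"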
 
0.1.6 :date:`2009-04-28 2:13 P.M CET` :author:askh@opera.com
----------------------------------------------------------------
 
    * Introducing ``TaskSet``. A set of subtasks is executed, and you can
      find out how many, or if all of them, are done (excellent for progress
      bars and such).
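
      A rough sketch only; the exact ``TaskSet`` interface in this release is
      not documented here, so the constructor signature and the
      ``completed_count()``/``successful()`` names below are assumptions made
      purely for illustration:

        >>> from celery.task import TaskSet
        >>> taskset = TaskSet("tasks.add", args=[
        ...     {"x": 2, "y": 2},
        ...     {"x": 4, "y": 4},
        ...     {"x": 8, "y": 8}])
        >>> result = taskset.run()
        >>> result.completed_count()  # how many subtasks have finished
        2
        >>> result.successful()       # True only when all of them succeeded
        False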
 
    * Now catches all exceptions when running ``Task.__call__``, so the daemon
      doesn't die. This doesn't happen for pure functions yet, only
      :class:`Task` classes.
 
    * ``autodiscover()`` now works with zipped eggs.

    * celeryd: Now adds the current working directory to ``sys.path`` for
      convenience.
 
    * The ``run_every`` attribute of :class:`PeriodicTask` classes can now
      be a :class:`datetime.timedelta` object.
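
      A minimal sketch (the task name and the 30-second interval are made up,
      and registering via ``celery.registry.tasks`` is assumed to be how tasks
      were registered at this point):

        >>> from datetime import timedelta
        >>> from celery.task import PeriodicTask
        >>> from celery.registry import tasks
        >>> class MyPeriodicTask(PeriodicTask):
        ...     name = "my.periodic-task"
        ...     run_every = timedelta(seconds=30)  # run every 30 seconds
        ...
        ...     def run(self, **kwargs):
        ...         return "ran periodic task"
        >>> tasks.register(MyPeriodicTask)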
 
    * celeryd: You can now set the ``DJANGO_PROJECT_DIR`` variable for
      ``celeryd`` and it will add that to ``sys.path`` for easy launching.
 
    * Can now check if a task has been executed or not via HTTP.
      You can do this by including the celery ``urls.py`` into your project::

        url(r'^celery/', include("celery.urls"))

      then visiting the following URL::

        http://mysite/celery/$task_id/done/

      This will return a JSON dictionary, e.g.::

        {"task": {"id": $task_id, "executed": true}}
 
    * ``delay_task`` now returns a string id, not a :class:`uuid.UUID`
      instance.

    * Now has ``PeriodicTask``, to provide ``cron``-like functionality.

    * Project changed name from ``crunchy`` to ``celery``. The details of
      the name change request are in ``docs/name_change_request.txt``.
 
0.1.0 :date:`2009-04-24 11:28 A.M CET` :author:askh@opera.com
----------------------------------------------------------------

    * Initial release.
 
 