
Merge branch 'master' of ssh://github.com/ask/celery

Vincent Driessen, 14 years ago
parent commit b116de48a7

+ 39 - 8
Changelog

@@ -5,6 +5,21 @@
 1.2.0 [xxxx-xx-xx xx:xx x.x xxxx]
 =================================
 
+Celery 1.2 contains backward incompatible changes, the most important
+being that the Django dependency has been removed. Celery no longer
+supports Django out of the box; instead, Django support is provided by
+an add-on package called `django-celery`_.
+
+We're very sorry for breaking backwards compatibility, but there are
+also many new and exciting features to make up for the time you lose
+upgrading, so be sure to read the :ref:`News <120news>` section.
+
+Quite a lot of potential users have been upset about the Django dependency,
+so maybe this is a chance to get wider adoption by the Python community as
+well.
+
+Big thanks to all contributors, testers and users!
+
 Upgrading for Django-users
 --------------------------
 
@@ -141,6 +156,8 @@ Backward incompatible changes
 
         CELERY_LOADER = "myapp.loaders.Loader"
 
+.. _120news:
+
 News
 ----
 
@@ -177,7 +194,7 @@ News
     =====================================  =====================================
     **Module name**                        **celery equivalent**
     =====================================  =====================================
-    ``billiard.pool``                      ``celery.concurrency.processes``
+    ``billiard.pool``                      ``celery.concurrency.processes.pool``
     ``billiard.serialization``             ``celery.serialization``
     ``billiard.utils.functional``          ``celery.utils.functional``
     =====================================  =====================================
@@ -188,16 +205,16 @@ News
 
 * now depends on :mod:`pyparsing`
 
-* Added support for using complex crontab-expressions in periodic tasks.  For
+* Added support for using complex crontab-expressions in periodic tasks. For
   example, you can now use::
-  
-    crontab(minute="*/15")
+
+    >>> crontab(minute="*/15")
 
   or even::
 
-    crontab(minute="*/30", hour="8-17,1-2", day_of_week="thu-fri")
+    >>> crontab(minute="*/30", hour="8-17,1-2", day_of_week="thu-fri")
 
-  See also http://ask.github.com/celery/getting-started/periodic-tasks.html
+  See :doc:`getting-started/periodic-tasks`.
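
The idea behind these crontab expressions can be sketched as a small field
expander. This is purely illustrative and NOT celery's actual parser (it
also skips named days like ``thu-fri``); it just shows how a field such as
``minute="*/15"`` expands to concrete values:

```python
def expand_cronspec(spec, max_value):
    """Expand a crontab-style field into the set of matching integers.

    Supports "*", step values ("*/15"), ranges ("8-17"), lists
    ("8-17,1-2") and single values. Named weekdays are not handled.
    """
    result = set()
    for part in spec.split(","):
        if part.startswith("*/"):          # step values: "*/15"
            step = int(part[2:])
            result.update(range(0, max_value, step))
        elif part == "*":                  # wildcard
            result.update(range(max_value))
        elif "-" in part:                  # ranges: "8-17"
            first, last = map(int, part.split("-"))
            result.update(range(first, last + 1))
        else:                              # single value
            result.add(int(part))
    return result

print(sorted(expand_cronspec("*/15", 60)))      # [0, 15, 30, 45]
print(sorted(expand_cronspec("8-17,1-2", 24)))  # [1, 2, 8, 9, ..., 17]
```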
 
 * celeryd: Now waits for available pool processes before applying new
   tasks to the pool.
@@ -437,7 +454,7 @@ News
         celeryd-multi -n baz.myhost -c 10
         celeryd-multi -n xuzzy.myhost -c 3
 
-1.0.4 [2010-05-31 09:54 A.M CEST]
+1.0.5 [2010-06-01 02:36 P.M CEST]
 =================================
 
 Critical
@@ -454,6 +471,12 @@ Critical
 
 * Now depends on :mod:`billiard` >= 0.3.1
 
+* celeryd: Previously, exceptions raised by worker components could stall
+  start-up; it now logs the exceptions correctly and shuts down.
+
+* celeryd: Prefetch counts were set too late. QoS is now set as early as
+  possible, so celeryd can't slurp in all the messages at start-up.
+
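
Why the QoS ordering matters can be shown with a toy model (a hypothetical
broker class, not a real AMQP client): with no prefetch limit set before
consuming starts, a consumer drains every ready message at once, while one
that applies ``basic_qos``-style limits first only holds ``prefetch_count``
unacknowledged messages.

```python
from collections import deque

class ToyBroker:
    """Hypothetical stand-in for a message broker queue."""

    def __init__(self, messages):
        self.queue = deque(messages)

    def deliver(self, prefetch_count=0):
        """Deliver messages; 0 means 'no limit', as in AMQP basic_qos."""
        limit = prefetch_count or len(self.queue)
        n = min(limit, len(self.queue))
        return [self.queue.popleft() for _ in range(n)]

broker = ToyBroker(range(100))
print(len(broker.deliver()))                  # no QoS set: slurps all 100
broker = ToyBroker(range(100))
print(len(broker.deliver(prefetch_count=4)))  # QoS set first: only 4
```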
 Changes
 -------
 
@@ -462,6 +485,9 @@ Changes
     Tasks that define steps of execution; the task can then
     be aborted after each step has completed.
 
+* :class:`~celery.events.EventDispatcher`: No longer creates AMQP channel
+  if events are disabled
+
 * Added required RPM package names under ``[bdist_rpm]`` section, to support building RPMs
   from the sources using setup.py
 
@@ -481,6 +507,11 @@ Changes
     * Should I use retry or acks_late?
     * Can I execute a task by name?
 
+1.0.4 [2010-05-31 09:54 A.M CEST]
+=================================
+
+* Changelog merged with 1.0.5 as the release was never announced.
+
 1.0.3 [2010-05-15 03:00 P.M CEST]
 =================================
 
@@ -767,7 +798,7 @@ Fixes
 
     .. code-block:: python
 
-        CELERYD_POOL = "celery.worker.pool.TaskPool"
+        CELERYD_POOL = "celery.concurrency.processes.TaskPool"
         CELERYD_MEDIATOR = "celery.worker.controllers.Mediator"
         CELERYD_ETA_SCHEDULER = "celery.worker.controllers.ScheduleController"
         CELERYD_LISTENER = "celery.worker.listener.CarrotListener"

+ 2 - 1
celery/worker/pool.py → celery/concurrency/processes/__init__.py

@@ -5,10 +5,11 @@ Process Pools.
 """
 """
 
 
 from celery import log
 from celery import log
-from celery.concurrency.processes import Pool, RUN
 from celery.datastructures import ExceptionInfo
 from celery.utils.functional import curry
 
+from celery.concurrency.processes.pool import Pool, RUN
+
 
 class TaskPool(object):
     """Process Pool for processing tasks in parallel.

+ 2 - 7
celery/concurrency/processes.py → celery/concurrency/processes/pool.py

@@ -24,6 +24,8 @@ import signal
 from multiprocessing import Process, cpu_count, TimeoutError
 from multiprocessing.util import Finalize, debug
 
+from celery.exceptions import SoftTimeLimitExceeded, TimeLimitExceeded
+
 #
 # Constants representing the state of a pool
 #
@@ -48,13 +50,6 @@ def mapstar(args):
 # Code run by worker processes
 #
 
-class TimeLimitExceeded(Exception):
-    """The time limit has been exceeded and the job has been terminated."""
-
-class SoftTimeLimitExceeded(Exception):
-    """The soft time limit has been exceeded. This exception is raised
-    to give the job a chance to clean up."""
-
 def soft_timeout_sighandler(signum, frame):
     raise SoftTimeLimitExceeded()
 
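
The soft-timeout mechanism the hunk above moves around can be demonstrated
in isolation: a SIGALRM handler raises ``SoftTimeLimitExceeded`` inside the
running job, giving it a chance to clean up. This is a minimal Unix-only
sketch (it uses ``setitimer`` for a sub-second alarm), not celery's actual
time-limit machinery:

```python
import signal

class SoftTimeLimitExceeded(Exception):
    """The soft time limit has been exceeded."""

def soft_timeout_sighandler(signum, frame):
    raise SoftTimeLimitExceeded()

signal.signal(signal.SIGALRM, soft_timeout_sighandler)

cleaned_up = False
try:
    signal.setitimer(signal.ITIMER_REAL, 0.05)   # fire in 50 ms
    while True:                                  # the long-running job
        pass
except SoftTimeLimitExceeded:
    cleaned_up = True                            # job gets to clean up
finally:
    signal.setitimer(signal.ITIMER_REAL, 0)      # cancel the timer

print(cleaned_up)  # True
```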

+ 1 - 1
celery/conf.py

@@ -45,7 +45,7 @@ _DEFAULTS = {
     "CELERY_BROKER_CONNECTION_MAX_RETRIES": 100,
     "CELERY_BROKER_CONNECTION_MAX_RETRIES": 100,
     "CELERY_ACKS_LATE": False,
     "CELERY_ACKS_LATE": False,
     "CELERYD_POOL_PUTLOCKS": True,
     "CELERYD_POOL_PUTLOCKS": True,
-    "CELERYD_POOL": "celery.worker.pool.TaskPool",
+    "CELERYD_POOL": "celery.concurrency.processes.TaskPool",
     "CELERYD_MEDIATOR": "celery.worker.controllers.Mediator",
     "CELERYD_MEDIATOR": "celery.worker.controllers.Mediator",
     "CELERYD_ETA_SCHEDULER": "celery.worker.controllers.ScheduleController",
     "CELERYD_ETA_SCHEDULER": "celery.worker.controllers.ScheduleController",
     "CELERYD_LISTENER": "celery.worker.listener.CarrotListener",
     "CELERYD_LISTENER": "celery.worker.listener.CarrotListener",

+ 5 - 2
celery/exceptions.py

@@ -3,7 +3,6 @@
 Common Exceptions
 
 """
-from celery.concurrency.processes import SoftTimeLimitExceeded as _STLE
 
 UNREGISTERED_FMT = """
 Task of kind %s is not registered, please make sure it's imported.
@@ -14,7 +13,11 @@ class RouteNotFound(KeyError):
     """Task routed to a queue not in the routing table (CELERY_QUEUES)."""
     """Task routed to a queue not in the routing table (CELERY_QUEUES)."""
 
 
 
 
-class SoftTimeLimitExceeded(_STLE):
+class TimeLimitExceeded(Exception):
+    """The time limit has been exceeded and the job has been terminated."""
+
+
+class SoftTimeLimitExceeded(Exception):
     """The soft time limit has been exceeded. This exception is raised
     """The soft time limit has been exceeded. This exception is raised
     to give the task a chance to clean up."""
     to give the task a chance to clean up."""
     pass
     pass

+ 1 - 1
celery/task/base.py

@@ -485,7 +485,7 @@ class Task(object):
         """The method the worker calls to execute the task.
         """The method the worker calls to execute the task.
 
 
         :param wrapper: A :class:`celery.worker.job.TaskWrapper`.
         :param wrapper: A :class:`celery.worker.job.TaskWrapper`.
-        :param pool: A :class:`celery.worker.pool.TaskPool` object.
+        :param pool: A task pool.
         :param loglevel: Current loglevel.
         :param logfile: Name of the currently used logfile.
 

+ 5 - 4
celery/tests/test_pool.py

@@ -1,10 +1,11 @@
-import unittest2 as unittest
+import sys
+import time
 import logging
 import itertools
-import time
-from celery.worker.pool import TaskPool
+import unittest2 as unittest
+
+from celery.concurrency.processes import TaskPool
 from celery.datastructures import ExceptionInfo
-import sys
 
 
 def do_something(i):

+ 1 - 1
celery/tests/test_worker_job.py

@@ -13,7 +13,7 @@ from celery.task.base import Task
 from celery.utils import gen_unique_id
 from celery.result import AsyncResult
 from celery.worker.job import WorkerTaskTrace, TaskWrapper
-from celery.worker.pool import TaskPool
+from celery.concurrency.processes import TaskPool
 from celery.backends import default_backend
 from celery.exceptions import RetryTaskError, NotRegistered
 from celery.decorators import task as task_dec

+ 7 - 7
celery/utils/__init__.py

@@ -207,23 +207,23 @@ def get_cls_by_name(name, aliases={}):
 
     Example::
 
-        celery.worker.pool.TaskPool
-                           ^- class name
+        celery.concurrency.processes.TaskPool
+                                    ^- class name
 
     If ``aliases`` is provided, a dict containing short name/long name
     mappings, the name is looked up in the aliases first.
 
     Examples:
 
-        >>> get_cls_by_name("celery.worker.pool.TaskPool")
-        <class 'celery.worker.pool.TaskPool'>
+        >>> get_cls_by_name("celery.concurrency.processes.TaskPool")
+        <class 'celery.concurrency.processes.TaskPool'>
 
         >>> get_cls_by_name("default", {
         >>> get_cls_by_name("default", {
-        ...     "default": "celery.worker.pool.TaskPool"})
-        <class 'celery.worker.pool.TaskPool'>
+        ...     "default": "celery.concurrency.processes.TaskPool"})
+        <class 'celery.concurrency.processes.TaskPool'>
 
         # Does not try to look up non-string names.
-        >>> from celery.worker.pool import TaskPool
+        >>> from celery.concurrency.processes import TaskPool
         >>> get_cls_by_name(TaskPool) is TaskPool
         True
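
The behaviour documented in this hunk can be sketched in a few lines: resolve
the alias, then import the module and fetch the attribute. This is a
simplified, hypothetical version, not celery's exact implementation:

```python
import importlib

def get_cls_by_name(name, aliases=None):
    """Resolve a class by its dotted name, honouring an alias table."""
    if not isinstance(name, str):
        return name                       # already a class, pass through
    name = (aliases or {}).get(name, name)
    module_name, _, cls_name = name.rpartition(".")
    module = importlib.import_module(module_name)
    return getattr(module, cls_name)

from logging import Logger
print(get_cls_by_name("logging.Logger") is Logger)       # True
print(get_cls_by_name("default",
                      {"default": "logging.Logger"}) is Logger)  # True
print(get_cls_by_name(Logger) is Logger)                  # True
```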
 

+ 1 - 1
docs/configuration.rst

@@ -588,7 +588,7 @@ Custom Component Classes (advanced)
 * CELERYD_POOL
 
     Name of the task pool class used by the worker.
-    Default is ``"celery.worker.pool.TaskPool"``.
+    Default is ``"celery.concurrency.processes.TaskPool"``.
 
 * CELERYD_LISTENER
 

+ 9 - 0
docs/internals/reference/celery.concurrency.processes.pool.rst

@@ -0,0 +1,9 @@
+===================================================================
+ extended multiprocessing.pool - celery.concurrency.processes.pool
+===================================================================
+
+.. currentmodule:: celery.concurrency.processes.pool
+
+.. automodule:: celery.concurrency.processes.pool
+    :members:
+    :undoc-members:

+ 0 - 9
docs/internals/reference/celery.worker.pool.rst

@@ -1,9 +0,0 @@
-================================
- Task Pool - celery.worker.pool
-================================
-
-.. currentmodule:: celery.worker.pool
-
-.. automodule:: celery.worker.pool
-    :members:
-    :undoc-members:

+ 3 - 3
docs/internals/reference/index.rst

@@ -14,12 +14,14 @@
     celery.worker.controllers
     celery.worker.buckets
     celery.worker.scheduler
-    celery.worker.pool
     celery.worker.heartbeat
     celery.worker.control
     celery.worker.control.builtins
     celery.worker.control.registry
     celery.worker.revoke
+    celery.concurrency.processes
+    celery.concurrency.processes.pool
+    celery.concurrency.threads
     celery.beat
     celery.backends
     celery.backends.base
@@ -29,8 +31,6 @@
     celery.backends.pyredis
     celery.backends.tyrant
     celery.execute.trace
-    celery.concurrency.processes
-    celery.concurrency.threads
     celery.serialization
     celery.datastructures
     celery.routes

+ 1 - 1
docs/reference/celery.conf.rst

@@ -235,7 +235,7 @@ Configuration - celery.conf
 .. data:: CELERYD_POOL
 
     Name of the task pool class used by the worker.
-    Default is ``"celery.worker.pool.TaskPool"``.
+    Default is ``"celery.concurrency.processes.TaskPool"``.
 
 .. data:: CELERYD_LISTENER