
Consistently use the term 'call' to refer to sending task messages

Ask Solem 12 years ago
parent
commit
2273e2a086

+ 3 - 3
Changelog

@@ -753,7 +753,7 @@ News
 * Automatic connection pool support.
 
     The pool is used by everything that requires a broker connection.  For
-    example applying tasks, sending broadcast commands, retrieving results
+    example calling tasks, sending broadcast commands, retrieving results
     with the AMQP result backend, and so on.
 
     The pool is disabled by default, but you can enable it by configuring the
@@ -1821,7 +1821,7 @@ News
     If enabled an event will be sent with every task, so monitors can
     track tasks before the workers receive them.
 
-* `celerybeat`: Now reuses the broker connection when applying
+* `celerybeat`: Now reuses the broker connection when calling
    scheduled tasks.
 
 * The configuration module and loader to use can now be specified on
@@ -4990,7 +4990,7 @@ News
   restarted if it crashes). To use this start celeryd with the
   --supervised` option (or alternatively `-S`).
 
-* views.apply: View applying a task. Example
+* views.apply: View calling a task. Example
 
     ::
 

+ 6 - 6
celery/bin/celery.py

@@ -39,7 +39,7 @@ commands = {}
 command_classes = (
     ('Main', ['worker', 'events', 'beat', 'shell', 'multi', 'amqp'], 'green'),
     ('Remote Control', ['status', 'inspect', 'control'], 'blue'),
-    ('Utils', ['purge', 'list', 'migrate', 'apply', 'result', 'report'], None),
+    ('Utils', ['purge', 'list', 'migrate', 'call', 'result', 'report'], None),
 )
 
 
@@ -316,13 +316,13 @@ class list_(Command):
 list_ = command(list_, 'list')
 
 
-class apply(Command):
-    """Apply a task by name.
+class call(Command):
+    """Call a task by name.
 
     Examples::
 
-        celery apply tasks.add --args='[2, 2]'
-        celery apply tasks.add --args='[2, 2]' --countdown=10
+        celery call tasks.add --args='[2, 2]'
+        celery call tasks.add --args='[2, 2]' --countdown=10
     """
     args = '<task_name>'
     option_list = Command.option_list + (
@@ -369,7 +369,7 @@ class apply(Command):
                                  eta=maybe_iso8601(kw.get('eta')),
                                  expires=expires)
         self.out(res.id)
-apply = command(apply)
+call = command(call)
 
 
 class purge(Command):
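
The renamed ``celery call`` command simply sends a task message by name. For context, a minimal sketch of the programmatic counterpart using ``send_task``, which likewise dispatches by name without importing the task (the broker URL and the ``tasks.add`` task are assumptions)::

    >>> from celery import Celery

    >>> celery = Celery('tasks', broker='amqp://guest@localhost//')
    >>> res = celery.send_task('tasks.add', args=[2, 2], countdown=10)
    >>> res.id   # the task id, the same value `celery call` prints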

+ 2 - 2
docs/faq.rst

@@ -562,8 +562,8 @@ Tasks
 
 .. _faq-tasks-connection-reuse:
 
-How can I reuse the same connection when applying tasks?
---------------------------------------------------------
+How can I reuse the same connection when calling tasks?
+-------------------------------------------------------
 
 **Answer**: See the :setting:`BROKER_POOL_LIMIT` setting.
 The connection pool is enabled by default since version 2.5.
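
For illustration, a minimal configuration sketch (the limit of 10 is an arbitrary example value)::

    # celeryconfig.py
    BROKER_POOL_LIMIT = 10   # keep at most 10 broker connections open for reuse

Setting the limit to 0 or ``None`` disables the pool, so a connection is established and closed for every use.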

+ 6 - 6
docs/getting-started/first-steps-with-celery.rst

@@ -100,7 +100,7 @@ the entry-point for everything you want to do in Celery, like creating tasks and
 managing workers, it must be possible for other modules to import it.
 
 In this tutorial we will keep everything contained in a single module,
-but for larger projects you probably want to create
+but for larger projects you want to create
 a :ref:`dedicated module <project-layout>`.
 
 Let's create the file :file:`tasks.py`:
@@ -136,12 +136,12 @@ We defined a single task, called ``add``, which returns the sum of two numbers.
 Running the celery worker server
 ================================
 
-We can now run the worker by executing our program with the ``worker``
+We now run the worker by executing our program with the ``worker``
 argument::
 
     $ python tasks.py worker --loglevel=info
 
-In production you will probably want to run the worker in the
+In production you will want to run the worker in the
 background as a daemon.  To do this you need to use the tools provided
 by your platform, or something like `supervisord`_ (see :ref:`daemonizing`
 for more information).
@@ -170,10 +170,10 @@ method which gives greater control of the task execution (see
     >>> from tasks import add
     >>> add.delay(4, 4)
 
-The task should now be processed by the worker you started earlier,
+The task has now been processed by the worker you started earlier,
 and you can verify that by looking at the workers console output.
 
-Applying a task returns an :class:`~@AsyncResult` instance,
+Calling a task returns an :class:`~@AsyncResult` instance,
 which can be used to check the state of the task, wait for the task to finish
 or get its return value (or if the task failed, the exception and traceback).
 But this isn't enabled by default, and you have to configure Celery to
@@ -210,7 +210,7 @@ To read more about result backends please see :ref:`task-result-backends`.
 
 Now with the result backend configured, let's call the task again.
 This time we'll hold on to the :class:`~@AsyncResult` instance returned
-when you apply a task::
+when you call a task::
 
     >>> result = add.delay(4, 4)
 
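With a result backend configured, the returned :class:`~@AsyncResult` can then be inspected; a short sketch::

    >>> result.ready()         # has the task finished processing?
    False
    >>> result.get(timeout=1)  # wait for the return value
    8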

+ 1 - 1
docs/reference/celery.rst

@@ -276,7 +276,7 @@ Grouping Tasks
         >>> res.get()
         8
 
-    Applying a chain will return the result of the last task in the chain.
+    Calling a chain will return the result of the last task in the chain.
     You can get to the other tasks by following the ``result.parent``'s::
 
         >>> res.parent.get()
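
A slightly longer sketch of walking ``result.parent`` through a three-task chain (the ``add`` task and the numbers are only illustrative)::

    >>> from celery import chain

    >>> res = chain(add.s(2, 2), add.s(4), add.s(8))()
    >>> res.get()                 # last task in the chain: ((2 + 2) + 4) + 8
    16
    >>> res.parent.get()          # the task before it: (2 + 2) + 4
    8
    >>> res.parent.parent.get()   # the first task: 2 + 2
    4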

+ 1 - 1
docs/reference/celery.utils.debug.rst

@@ -11,7 +11,7 @@ Sampling Memory Usage
 This module can be used to diagnose and sample the memory usage
 used by parts of your application.
 
-E.g to sample the memory usage of applying tasks you can do this:
+E.g to sample the memory usage of calling tasks you can do this:
 
 .. code-block:: python
 
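A minimal sketch of sampling memory around task calls with the module's ``sample_mem()`` and ``memdump()`` helpers (the ``tasks.add`` task is an assumption)::

    from celery.utils.debug import memdump, sample_mem

    from tasks import add

    try:
        for i in range(100):
            for j in range(100):
                add.delay(i, j)
            sample_mem()   # record the memory usage of the current process
    finally:
        memdump()          # print a summary of the collected samples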

+ 7 - 7
docs/userguide/canvas.rst

@@ -105,7 +105,7 @@ The ``.si()`` shortcut can also be used to create immutable subtasks::
     >>> add.apply_async((2, 2), link=reset_buffers.si())
 
 Only the execution options can be set when a subtask is immutable,
-and it's not possible to apply the subtask with partial args/kwargs.
+so it's not possible to call the subtask with partial args/kwargs.
 
 .. note::
 
@@ -237,7 +237,7 @@ you chain tasks together:
     proj.tasks.add(4, 4) | proj.tasks.mul(8)
 
 
-Calling the chain will apply the tasks in the current process
+Calling the chain will call the tasks in the current process
 and return the result of the last task in the chain::
 
     >>> res = chain(add.s(4, 4), mul.s(8), mul.s(10))
@@ -245,7 +245,7 @@ and return the result of the last task in the chain::
     640
 
 And calling ``apply_async`` will create a dedicated
-task so that the act of applying the chain happens
+task so that the act of calling the chain happens
 in a worker::
 
     >>> res = chain(add.s(4, 4), mul.s(8), mul.s(10))
@@ -324,7 +324,7 @@ or tell how many tasks are ready and so on::
     [4, 8]
 
 However, if you call ``apply_async`` on the group it will
-send a special grouping task, so that the action of applying
+send a special grouping task, so that the action of calling
 the tasks happens in a worker instead of the current process::
 
     >>> res = g.apply_async()
@@ -479,7 +479,7 @@ Important Notes
 ---------------
 
 By default the synchronization step is implemented by having a recurring task
-poll the completion of the taskset every second, applying the subtask when
+poll the completion of the taskset every second, calling the subtask when
 ready.
 
 Example implementation:
@@ -558,7 +558,7 @@ is the same as having a task doing:
 
 Both ``map`` and ``starmap`` are subtasks, so they can be used as
 other subtasks and combined in groups etc., for example
-to apply the starmap after 10 seconds::
+to call the starmap after 10 seconds::
 
     >>> add.starmap(zip(range(10), range(10))).apply_async(countdown=10)
 
@@ -583,7 +583,7 @@ To create a chunks subtask you can use :meth:`@Task.chunks`:
     >>> add.chunks(zip(range(100), range(100)), 10)
 
 As with :class:`~celery.group` the act of **calling**
-the chunks will apply the tasks in the current process:
+the chunks will call the tasks in the current process:
 
 .. code-block:: python
 
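To make that distinction concrete, a sketch contrasting the two ways of running the chunks (``add`` as in the earlier examples; splitting into 10 chunks is arbitrary)::

    >>> c = add.chunks(zip(range(100), range(100)), 10)

    >>> res = c()          # calling: the ten chunks run in the current process
    >>> res.get()          # -> ten lists of ten sums each
    >>> c.apply_async()    # sends the chunks to a worker instead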

+ 1 - 1
docs/userguide/remote-tasks.rst

@@ -118,7 +118,7 @@ task being executed::
     [INFO/MainProcess] Task celery.task.http.HttpDispatchTask
             [f2cc8efc-2a14-40cd-85ad-f1c77c94beeb] processed: 100
 
-Since applying tasks can be done via HTTP using the
+Since calling tasks can be done via HTTP using the
 :func:`djcelery.views.apply` view, calling tasks from other languages is easy.
 For an example service exposing tasks via HTTP you should have a look at
 `examples/celery_http_gateway` in the Celery distribution:
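
A hedged sketch of dispatching such an HTTP task from Python with the ``celery.task.http.URL`` helper (the endpoint URL and its ``x``/``y`` parameters are made up)::

    >>> from celery.task.http import URL

    >>> res = URL('http://example.com/multiply').get_async(x=10, y=10)
    >>> res.get()
    100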

+ 2 - 2
docs/whatsnew-2.6.rst

@@ -341,7 +341,7 @@ Immutable subtasks
 ------------------
 
 ``subtask``'s can now be immutable, which means that the arguments
-will not be modified when applying callbacks::
+will not be modified when calling callbacks::
 
     >>> chain(add.s(2, 2), clear_static_electricity.si())
 
@@ -583,7 +583,7 @@ In Other News
 
 - ``xmap(task, sequence)`` and ``xstarmap(task, sequence)``
 
-    Returns a list of the results applying the task to every item
+    Returns a list of the results applying the task function to every item
     in the sequence.
 
     Example::