
Docs: Learn you some Sphinx and use ~

Ask Solem, 15 years ago
commit 9f103669cc
2 changed files with 36 additions and 25 deletions
  1. docs/userguide/executing.rst (+15, -6)
  2. docs/userguide/tasks.rst (+21, -19)

+ 15 - 6
docs/userguide/executing.rst

@@ -2,7 +2,8 @@
  Executing Tasks
 =================
 
-Executing tasks is done with ``apply_async``, and its shortcut: ``delay``.
+Executing tasks is done with :meth:`~celery.task.Base.Task.apply_async`,
+and its shortcut: :meth:`~celery.task.Base.Task.delay`.
 
 ``delay`` is simple and convenient, as it looks like calling a regular
 function:
@@ -17,7 +18,15 @@ The same thing using ``apply_async`` is written like this:
 
     Task.apply_async(args=[arg1, arg2], kwargs={"kwarg1": "x", "kwarg2": "y"})
 
-But ``delay`` doesn't give you as much control as using ``apply_async``.
+You can also execute a task by name if you don't have access to the task
+class::
+
+    >>> from celery.execute import apply_async
+    >>> res = apply_async("tasks.add", [2, 2])
+    >>> res.get()
+    4
+
+While ``delay`` is convenient, it doesn't give you as much control as using ``apply_async``.
 With ``apply_async`` you can override the execution options available as attributes on
 the ``Task`` class: ``routing_key``, ``exchange``, ``immediate``, ``mandatory``,
 ``priority``, and ``serializer``.  In addition you can set a countdown/eta, or provide
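For illustration, a hedged sketch of overriding a few of those options at call
time; the routing key, JSON serializer and ten-second countdown are made-up
values::

    Task.apply_async(args=[arg1, arg2],
                     kwargs={"kwarg1": "x", "kwarg2": "y"},
                     routing_key="feed.tasks",   # hypothetical routing key
                     serializer="json",          # use JSON instead of pickle
                     countdown=10)               # execute ~10 seconds from now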
@@ -49,10 +58,10 @@ a shortcut to set this by seconds in the future.
 Note that your task is guaranteed to be executed at some time *after* the
 specified date and time has passed, but not necessarily at that exact time.
 
-While ``countdown`` is an integer, ``eta`` must be a ``datetime`` object,
+While ``countdown`` is an integer, ``eta`` must be a :class:`~datetime.datetime` object,
 specifying an exact date and time in the future. This is good if you already
-have a ``datetime`` object and need to modify it with a ``timedelta``, or when
-using time in seconds is not very readable.
+have a :class:`~datetime.datetime` object and need to modify it with a
+:class:`~datetime.timedelta`, or when using time in seconds is not very readable.
 
 .. code-block:: python
 
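The body of that ``code-block`` falls outside the hunk; as a hedged
illustration of the ``eta`` usage described above (the ten-minute delay is an
arbitrary example value)::

    from datetime import datetime, timedelta

    # equivalent to countdown=600, but expressed as an absolute point in time
    run_at = datetime.now() + timedelta(minutes=10)
    Task.apply_async(args=[arg1, arg2], eta=run_at)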
@@ -71,7 +80,7 @@ Serializers
 Data passed between celery and workers has to be serialized to be
 transferred. The default serializer is :mod:`pickle`, but you can 
 change this for each
-task. There is built-in support for using ``pickle``, ``JSON`` and ``YAML``,
+task. There is built-in support for using :mod:`pickle`, ``JSON`` and ``YAML``,
 and you can add your own custom serializers by registering them into the
 carrot serializer registry.
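As a hedged example of the per-call and per-task forms described here (the
``PictureTask`` name is a placeholder)::

    from celery.task import Task

    # pick a serializer for one particular invocation
    Task.apply_async(args=[arg1, arg2], serializer="json")

    # or make JSON the default for every invocation of one task
    class PictureTask(Task):
        serializer = "json"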
 

+ 21 - 19
docs/userguide/tasks.rst

@@ -17,7 +17,7 @@ Given a function ``create_user``, that takes two arguments: ``username`` and
             create_user(username, password)
 
 For convenience there is a shortcut decorator that turns any function into
-a task, ``celery.decorators.task``:
+a task, :func:`celery.decorators.task`:
 
 .. code-block:: python
 
@@ -28,7 +28,8 @@ a task, ``celery.decorators.task``:
     def create_user(username, password):
         User.objects.create(username=username, password=password)
 
-The task decorator takes the same execution options the ``Task`` class does:
+The task decorator takes the same execution options as the
+:class:`~celery.task.base.Task` class does:
 
 .. code-block:: python
 
@@ -36,9 +37,8 @@ The task decorator takes the same execution options the ``Task`` class does:
     def create_user(username, password):
         User.objects.create(username=username, password=password)
 
-
 An alternative way to use the decorator is to give the function as an argument
-instead, but if you do this be sure to set the resulting tasks ``__name__``
+instead, but if you do this be sure to set the resulting task's :attr:`__name__`
 attribute, so pickle is able to find it in reverse:
 
 .. code-block:: python
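The example body lies outside this hunk; a hedged sketch of the pattern the
paragraph describes (the exact form of the original example may differ)::

    from django.contrib.auth.models import User
    from celery.decorators import task

    def create_user(username, password):
        User.objects.create(username=username, password=password)

    # apply the decorator to the function directly, then fix up __name__
    # so pickle can find the task again
    create_user_task = task()(create_user)
    create_user_task.__name__ = "create_user_task"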
@@ -58,8 +58,9 @@ The current default keyword arguments are:
 
 * logfile
 
-    The log file, can be passed on to ``self.get_logger``
-    to gain access to the workers log file. See `Logging`_.
+    The log file, which can be passed on to
+    :meth:`~celery.task.base.Task.get_logger` to gain access to
+    the worker's log file. See `Logging`_.
 
 * loglevel
 
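A hedged sketch of how the ``logfile`` and ``loglevel`` defaults are typically
forwarded on to the task's logger (the ``add`` task here is made up)::

    from celery.decorators import task

    @task()
    def add(x, y, **kwargs):
        # logfile/loglevel arrive in **kwargs and are picked up by get_logger
        logger = add.get_logger(**kwargs)
        logger.info("Adding %s + %s" % (x, y))
        return x + y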
@@ -80,14 +81,15 @@ The current default keyword arguments are:
 
 * task_is_eager
 
-    Set to ``True`` if the task is executed locally in the client,
+    Set to :const:`True` if the task is executed locally in the client,
     and not by a worker.
 
 * delivery_info
 
   Additional message delivery information. This is a mapping containing
   the exchange and routing key used to deliver this task. It's used
-  by e.g. :meth:`retry` to resend the task to the same destination queue.
+  by e.g. :meth:`~celery.task.base.Task.retry` to resend the task to the
+  same destination queue.
 
  **NOTE** As some messaging backends don't have advanced routing
  capabilities, you can't trust the availability of keys in this mapping.
@@ -124,9 +126,9 @@ setting decides whether or not they will be written to the log file.
 Retrying a task if something fails
 ==================================
 
-Simply use :meth:`Task.retry` to re-send the task. It will
-do the right thing, and respect the :attr:`Task.max_retries`
-attribute:
+Simply use :meth:`~celery.task.base.Task.retry` to re-send the task.
+It will do the right thing, and respect the
+:attr:`~celery.task.base.Task.max_retries` attribute:
 
 .. code-block:: python
 
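The example body is outside this hunk; a hedged sketch of the retry pattern,
assuming a made-up ``send_activation_email`` helper::

    from smtplib import SMTPException

    from celery.decorators import task

    @task()
    def send_activation(user_id, **kwargs):
        try:
            send_activation_email(user_id)   # hypothetical helper
        except SMTPException as exc:
            # re-send with the same arguments; max_retries is respected
            send_activation.retry(args=[user_id], kwargs=kwargs, exc=exc)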
@@ -161,7 +163,7 @@ attribute on the task. By default this is set to 3 minutes. Note that the
 unit for setting the delay is in seconds (int or float).
 
 You can also provide the ``countdown`` argument to
-:meth:`Task.retry` to override this default.
+:meth:`~celery.task.base.Task.retry` to override this default.
 
 .. code-block:: python
 
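Continuing the hedged ``send_activation`` sketch from above, the override
could look like::

    # retry in 30 seconds instead of waiting for default_retry_delay
    send_activation.retry(args=[user_id], kwargs=kwargs, exc=exc, countdown=30)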
@@ -195,7 +197,7 @@ Task options
 * max_retries
 
     The maximum number of attempted retries before giving up.
-    If this is exceeded the :exc`celery.execptions.MaxRetriesExceeded`
+    If this is exceeded the :exc:`~celery.exceptions.MaxRetriesExceeded`
     exception will be raised. Note that you have to retry manually, it's
     not something that happens automatically.
 
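A small, hedged illustration of capping retries (the limit of five and the
``refresh_feed`` helper are made up)::

    @task(max_retries=5)                     # give up after five retries
    def fetch_feed(feed_url, **kwargs):
        try:
            refresh_feed(feed_url)           # hypothetical helper
        except IOError as exc:
            fetch_feed.retry(args=[feed_url], kwargs=kwargs, exc=exc)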
@@ -433,9 +435,10 @@ The default loader imports any modules listed in the
 ``CELERY_IMPORTS`` setting. 
 
 The entity responsible for registering your task in the registry is a
-meta class, :class:`TaskType`. This is the default meta class for
-``Task``. If you want to register your task manually you can set the
-``abstract`` attribute:
+metaclass, :class:`~celery.task.base.TaskType`. This is the default
+metaclass for :class:`~celery.task.base.Task`. If you want to register
+your task manually, you can set the :attr:`~celery.task.base.Task.abstract`
+attribute:
 
 .. code-block:: python
 
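The example body again lies outside the hunk; a hedged sketch of manual
registration, assuming the ``celery.registry.tasks`` registry of this era::

    from celery.task import Task
    from celery.registry import tasks

    class MyTask(Task):
        abstract = True                  # TaskType skips registering this class

        def run(self, **kwargs):
            return "hello"

    tasks.register(MyTask)               # register it by hand instead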
@@ -459,7 +462,8 @@ Ignore results you don't want
 -----------------------------
 
 If you don't care about the results of a task, be sure to set the
-``ignore_result`` option, as storing results wastes time and resources.
+:attr:`~celery.task.base.Task.ignore_result` option, as storing results
+wastes time and resources.
 
 .. code-block:: python
 
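A minimal, hedged example of that option (the ``increment_counter`` helper is
a placeholder)::

    @task(ignore_result=True)
    def update_page_view_count(page_id, **kwargs):
        increment_counter(page_id)       # no result will be stored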
@@ -546,8 +550,6 @@ Good:
 
 
 
-
-
 Performance and Strategies
 ==========================