Tasks - celery.task

class celery.task.AsynchronousMapTask
Task used internally by dmap_async.
class celery.task.DeleteExpiredTaskMetaTask

A periodic task that deletes expired task metadata every day.

This runs the current backend’s cleanup() method.
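Conceptually it is equivalent to a user-defined PeriodicTask with a one-day interval that calls the backend's cleanup() method. A minimal sketch, assuming the backend object is importable as celery.backends.default_backend (that import path is an assumption):

>>> from datetime import timedelta
>>> from celery.task import tasks, PeriodicTask
>>> from celery.backends import default_backend  # assumed import path
>>> class CleanupTaskMetaTask(PeriodicTask):
...     name = "cleanup_task_meta"
...     run_every = timedelta(days=1)
...
...     def run(self, **kwargs):
...         # Delegate to the result backend's own cleanup routine.
...         default_backend.cleanup()
>>> tasks.register(CleanupTaskMetaTask)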

class celery.task.ExecuteRemoteTask

Execute arbitrary function/object.

The object must be pickleable, so you can’t use lambdas or functions defined in the REPL.

run(ser_callable, fargs, fkwargs, **kwargs)
Execute the pickled ser_callable, with fargs as positional arguments and fkwargs as keyword arguments.
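A minimal sketch of invoking the task directly, assuming ser_callable is an ordinary pickle payload; in practice the celery.task.execute_remote helper below does this wrapping for you (the exact serialization used here is an assumption):

>>> import pickle
>>> import operator
>>> from celery.task import ExecuteRemoteTask
>>> payload = pickle.dumps(operator.add)   # module-level callables are picklable
>>> result = ExecuteRemoteTask.delay(payload, [2, 2], {})
>>> result.result # after the task has been executed
4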
class celery.task.PeriodicTask

A periodic task is a task that behaves like a cron job.

The run_every attribute defines how often the task is run (its interval); it can be either a datetime.timedelta object or an integer specifying the interval in seconds.

You have to register the periodic task in the task registry.

>>> from celery.task import tasks, PeriodicTask
>>> from datetime import timedelta
>>> class MyPeriodicTask(PeriodicTask):
...     name = "my_periodic_task"
...     run_every = timedelta(seconds=30)
...
...     def run(self, **kwargs):
...         logger = self.get_logger(**kwargs)
...         logger.info("Running MyPeriodicTask")
>>> tasks.register(MyPeriodicTask)
class celery.task.Task

A task that can be delayed for execution by the celery daemon.

All subclasses of Task have to define the name attribute, which is the name of the task that can be passed to celery.task.delay_task. They also have to define the run method, which is the actual method the celery daemon executes.

This is a simple task that just logs a message:

>>> from celery.task import tasks, Task
>>> class MyTask(Task):
...     name = "mytask"
...
...     def run(self, some_arg=None, **kwargs):
...         logger = self.get_logger(**kwargs)
...         logger.info("Running MyTask with arg some_arg=%s" %
...                     some_arg)
...         return 42
>>> tasks.register(MyTask)

You can delay the task using the classmethod delay...

>>> result = MyTask.delay(some_arg="foo")
>>> result.status # after some time
'DONE'
>>> result.result
42

...or using the celery.task.delay_task function, by passing the name of the task.

>>> from celery.task import delay_task
>>> delay_task(MyTask.name, some_arg="foo")
classmethod delay(*args, **kwargs)

Delay this task for execution by the celery daemon(s).

Return type: celery.result.AsyncResult
get_consumer()
Get a celery task message consumer.
get_logger(**kwargs)
Get a process-aware logger object.
get_publisher()
Get a celery task message publisher.
run(*args, **kwargs)
The actual task. All subclasses of Task must define the run method; if it is not defined, a NotImplementedError exception is raised.
class celery.task.TaskSet(task, args)

A task containing several subtasks, making it possible to track how many of the subtasks have been completed, or when all of them are done.

>>> from djangofeeds.tasks import RefreshFeedTask
>>> taskset = TaskSet(RefreshFeedTask, args=[
...                 {"feed_url": "http://cnn.com/rss"},
...                 {"feed_url": "http://bbc.com/rss"},
...                 {"feed_url": "http://xkcd.com/rss"}])
>>> taskset_id, subtask_ids = taskset.run()
iterate()

Iterate over the results returned after calling run().

If any of the tasks raises an exception, the exception will be reraised by iterate.

join(timeout=None)

Gather the results for all of the tasks in the taskset, and return a list of them, ordered by the order in which they were called.

If any of the tasks raises an exception, the exception will be reraised by join.

If timeout is not None and the operation takes longer than timeout seconds, it will raise the celery.timer.TimeoutError exception.
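A short usage sketch, reusing the RefreshFeedTask example above and assuming run() is called before the results are gathered (the timeout value is just an illustration):

>>> taskset = TaskSet(RefreshFeedTask, args=[
...                 {"feed_url": "http://cnn.com/rss"},
...                 {"feed_url": "http://bbc.com/rss"}])
>>> taskset_id, subtask_ids = taskset.run()
>>> results = taskset.join(timeout=60)   # blocks until every subtask finishes
>>> len(results)
2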

classmethod map(func, args, timeout=None)
Distribute processing of the arguments and collect the results.
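A sketch mirroring the celery.task.dmap example further down (dmap presumably delegates to this classmethod; operator.add and the argument pairs are just illustrations):

>>> import operator
>>> from celery.task import TaskSet
>>> TaskSet.map(operator.add, [[2, 2], [4, 4], [8, 8]])
[4, 8, 16]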
classmethod map_async(func, args, timeout=None)

Distribute processing of the arguments and collect the results asynchronously.

Returns a celery.result.AsyncResult instance.

classmethod remote_execute(func, args)
Apply args to function by distributing the args to the celery server(s).
run()

Run all tasks in the taskset.

Returns a tuple of the taskset id and a list of subtask ids.

>>> ts = RefreshFeeds([
...         ["http://foo.com/rss", {}],
...         ["http://bar.com/rss", {}],
... ])
>>> taskset_id, subtask_ids = ts.run()
>>> taskset_id
"d2c9b261-8eff-4bfb-8459-1e1b72063514"
>>> subtask_ids
["b4996460-d959-49c8-aeb9-39c530dcde25",
"598d2d18-ab86-45ca-8b4f-0779f5d6a3cb"]
>>> time.sleep(10)
>>> is_done(taskset_id)
True
celery.task.delay_task(task_name, *args, **kwargs)

Delay a task for execution by the celery daemon.

>>> r = delay_task("update_record", name="George Constanza", age=32)
>>> r.ready()
True
>>> r.result
"Record was updated"

Raises celery.registry.NotRegistered exception if no such task has been registered in the task registry.

Return type: celery.result.AsyncResult
celery.task.discard_all()

Discard all waiting tasks.

This will ignore all tasks waiting for execution, and they will be deleted from the messaging server.

Returns the number of tasks discarded.

Return type: int
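A short sketch; the number returned is how many waiting task messages were discarded (the count shown is illustrative):

>>> from celery.task import discard_all
>>> discard_all()
3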
celery.task.dmap(func, args, timeout=None)

Distribute processing of the arguments and collect the results.

>>> from celery.task import dmap
>>> import operator
>>> dmap(operator.add, [[2, 2], [4, 4], [8, 8]])
[4, 8, 16]
celery.task.dmap_async(func, args, timeout=None)

Distribute processing of the arguments and collect the results asynchronously.

Returns a celery.result.AsyncResult object.

>>> from celery.task import dmap_async
>>> import operator
>>> presult = dmap_async(operator.add, [[2, 2], [4, 4], [8, 8]])
>>> presult
<AsyncResult: 373550e8-b9a0-4666-bc61-ace01fa4f91d>
>>> presult.status # after some time
'DONE'
>>> presult.result
[4, 8, 16]
celery.task.execute_remote(func, *args, **kwargs)

Execute arbitrary function/object remotely.

The object must be picklable, so you can’t use lambdas or functions defined in the REPL (the objects must have an associated module).

Return type: celery.result.AsyncResult
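A minimal sketch with a module-level callable, following the same pattern as the dmap_async example above (operator.add and the result values are just illustrations):

>>> import operator
>>> from celery.task import execute_remote
>>> result = execute_remote(operator.add, 2, 2)
>>> result.status # after some time
'DONE'
>>> result.result
4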
celery.task.is_done(task_id)

Returns True if the task with the given task_id has been executed.

Return type: bool
celery.task.mark_as_done(task_id, result)
Mark task as done (executed).
celery.task.mark_as_failure(task_id, exc)
Mark task as failed.
