Working with tasks and task sets.
A periodic task that deletes expired task metadata every day.
This runs the current backend’s celery.backends.base.BaseBackend.cleanup() method.
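For illustration, the same cleanup can also be triggered by hand; a minimal sketch, assuming the configured backend is exposed as celery.backends.default_backend (an assumed import path, not confirmed by this reference):
>>> from celery.backends import default_backend  # assumed location of the current backend
>>> default_backend.cleanup()  # the same call this periodic task performs once a day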
Execute an arbitrary function or object.
Note: You probably want execute_remote() instead; this class is an internal component of it.
The object must be picklable, so you can’t use lambdas or functions defined in the REPL (that is, the Python shell or IPython).
A periodic task is a task that behaves like a cron job.
Raises NotImplementedError: if the run_every attribute is not defined.
You have to register the periodic task in the task registry.
Example
>>> from celery.task import tasks, PeriodicTask
>>> from datetime import timedelta
>>> class MyPeriodicTask(PeriodicTask):
... name = "my_periodic_task"
... run_every = timedelta(seconds=30)
...
... def run(self, **kwargs):
... logger = self.get_logger(**kwargs)
... logger.info("Running MyPeriodicTask")
>>> tasks.register(MyPeriodicTask)
A task that can be delayed for execution by the celery daemon.
All subclasses of Task must define the run() method, which is the actual method the celery daemon executes.
The run() method supports both positional and keyword arguments.
Raises NotImplementedError: if the name attribute is not set.
The resulting class is callable; calling an instance applies the run() method.
Examples
This is a simple task that just logs a message:
>>> from celery.task import tasks, Task
>>> class MyTask(Task):
... name = "mytask"
...
... def run(self, some_arg=None, **kwargs):
... logger = self.get_logger(**kwargs)
... logger.info("Running MyTask with arg some_arg=%s" %
... some_arg)
... return 42
>>> tasks.register(MyTask)
You can delay the task using the classmethod delay()...
>>> result = MyTask.delay(some_arg="foo")
>>> result.status # after some time
'DONE'
>>> result.result
42
...or using the delay_task() function, by passing the name of the task.
>>> from celery.task import delay_task
>>> result = delay_task(MyTask.name, some_arg="foo")
Delay this task for execution by the celery daemon(s).
See delay_task() for the supported arguments and return value.
Get a celery task message consumer.
Return type: celery.messaging.TaskConsumer.
Please be sure to close the AMQP connection when you’re done with this object, e.g.:
>>> consumer = self.get_consumer()
>>> # do something with consumer
>>> consumer.connection.close()
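To guarantee the connection is closed even if an error occurs, the same pattern can be written with try/finally; a minimal sketch:
>>> consumer = self.get_consumer()
>>> try:
...     pass  # do something with consumer
... finally:
...     consumer.connection.close()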
Get process-aware logger object.
Get a celery task message publisher.
Return type: celery.messaging.TaskPublisher.
Please be sure to close the AMQP connection when you’re done with this object, e.g.:
>>> publisher = self.get_publisher()
>>> # do something with publisher
>>> publisher.connection.close()
A task containing several subtasks, making it possible to track how many of the tasks have been completed, or when all of them have.
Example
>>> from djangofeeds.tasks import RefreshFeedTask
>>> taskset = TaskSet(RefreshFeedTask, args=[
...     [], {"feed_url": "http://cnn.com/rss"},
...     [], {"feed_url": "http://bbc.com/rss"},
...     [], {"feed_url": "http://xkcd.com/rss"}])
>>> taskset_id, subtask_ids = taskset.run()
>>> list_of_return_values = taskset.join()
Iterate over the results returned after calling run().
If any of the tasks raises an exception, the exception will be re-raised.
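A minimal usage sketch, reusing the taskset from the example above:
>>> taskset_id, subtask_ids = taskset.run()
>>> for return_value in taskset.iterate():
...     print(return_value)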
Gather the results of all tasks in the taskset, and return them as a list ordered by the order in which they were called.
Parameter: timeout – how long, in seconds, to wait for results before the operation times out.
Raises celery.timer.TimeoutError: if timeout is not None and the operation takes longer than timeout seconds.
If any of the tasks raises an exception, the exception will be re-raised by join().
Returns: a list of the return values for all tasks in the taskset.
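A minimal sketch of waiting with a timeout and handling the TimeoutError described above:
>>> from celery.timer import TimeoutError
>>> try:
...     results = taskset.join(timeout=10)
... except TimeoutError:
...     results = None  # the taskset did not finish within 10 seconds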
Distribute processing of the arguments and collect the results asynchronously.
Returns: celery.result.AsyncResult instance.
Run all tasks in the taskset.
Returns: a tuple containing the taskset id, and a list of subtask ids.
Return type: tuple
Example
>>> ts = RefreshFeeds([
... ["http://foo.com/rss", {}],
... ["http://bar.com/rss", {}],
... ])
>>> taskset_id, subtask_ids = ts.run()
>>> taskset_id
"d2c9b261-8eff-4bfb-8459-1e1b72063514"
>>> subtask_ids
["b4996460-d959-49c8-aeb9-39c530dcde25",
"598d2d18-ab86-45ca-8b4f-0779f5d6a3cb"]
>>> time.sleep(10)
>>> is_done(taskset_id)
True
Delay a task for execution by the celery daemon.
Raises celery.registry.NotRegistered: if no such task has been registered in the task registry.
Example
>>> r = delay_task("update_record", name="George Constanza", age=32)
>>> r.ready()
True
>>> r.result
"Record was updated"
Discard all waiting tasks.
This will ignore all tasks waiting for execution, and they will be deleted from the messaging server.
Returns: the number of tasks discarded.
Return type: int
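A minimal usage sketch (the count shown is only illustrative):
>>> from celery.task import discard_all
>>> discard_all()
3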
Distribute processing of the arguments and collect the results.
Example
>>> from celery.task import dmap
>>> import operator
>>> dmap(operator.add, [[2, 2], [4, 4], [8, 8]])
[4, 8, 16]
Distribute processing of the arguments and collect the results asynchronously.
Returns: celery.result.AsyncResult object.
Example
>>> from celery.task import dmap_async
>>> import operator
>>> presult = dmap_async(operator.add, [[2, 2], [4, 4], [8, 8]])
>>> presult
<AsyncResult: 373550e8-b9a0-4666-bc61-ace01fa4f91d>
>>> presult.status
'DONE'
>>> presult.result
[4, 8, 16]
Execute arbitrary function/object remotely.
The object must be picklable, so you can’t use lambdas or functions defined in the REPL (the objects must have an associated module).
Returns: celery.result.AsyncResult.
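A minimal usage sketch, assuming execute_remote() accepts the function followed by its positional and keyword arguments (the exact signature is an assumption here); operator.add is picklable because it lives in a module:
>>> from celery.task import execute_remote
>>> import operator
>>> result = execute_remote(operator.add, 2, 2)
>>> result.status  # after the task has been processed
'DONE'
>>> result.result
4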
Returns True if the task with task_id has been executed.
Return type: bool
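A minimal usage sketch, assuming is_done() is importable from celery.task like the other functions described here:
>>> from celery.task import is_done
>>> is_done("d2c9b261-8eff-4bfb-8459-1e1b72063514")
True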