
Multiprocessing Worker - celery.worker

celery.worker

class celery.worker.PeriodicWorkController

A thread that continuously checks whether any celery.task.PeriodicTask tasks are waiting for execution, and executes them.

Example

>>> PeriodicWorkController().start()
run()
Don't call run() directly; use start() instead.
stop()
Shut down the thread.
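
The controller follows the usual threading lifecycle; a minimal sketch of starting it alongside the worker and shutting it down again:

>>> controller = PeriodicWorkController()
>>> controller.start()   # begin checking for waiting periodic tasks
>>> # ... run the worker's main loop ...
>>> controller.stop()    # shut the thread down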
class celery.worker.TaskWrapper(task_name, task_id, task_func, args, kwargs, on_acknowledge=None, **opts)

Class wrapping a task to be run.

Parameters:
task_name
Kind of task. Must be a name registered in the task registry.
task_id
UUID of the task.
task_func
The task's callable object.
args
List of positional arguments to apply to the task.
kwargs
Mapping of keyword arguments to apply to the task.
message
The original message sent. Used for acknowledging the message.
execute(loglevel=None, logfile=None)

Execute the task in a jail() and store the return value and status in the task meta backend.

Parameters:
  • loglevel – The loglevel used by the task.
  • logfile – The logfile used by the task.
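
A minimal sketch of building a wrapper by hand and executing it in the current process; the add callable, the "tasks.add" name and the generated UUID are illustrative (a real task name must be registered in the task registry):

import logging
import uuid

from celery.worker import TaskWrapper

def add(x, y, **kwargs):
    # Illustrative task callable; **kwargs absorbs the standard task
    # arguments merged in by extend_with_default_kwargs().
    return x + y

wrapper = TaskWrapper(task_name="tasks.add",
                      task_id=str(uuid.uuid4()),
                      task_func=add,
                      args=[2, 2],
                      kwargs={})

# Runs the task inside jail() and stores the result and status
# in the task meta backend.
result = wrapper.execute(loglevel=logging.INFO)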
execute_using_pool(pool, loglevel=None, logfile=None)

Like execute(), but using the multiprocessing pool.

Parameters:
  • pool – A multiprocessing.Pool instance.
  • loglevel – The loglevel used by the task.
  • logfile – The logfile used by the task.

Returns: a multiprocessing.AsyncResult instance.
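
A sketch of dispatching the wrapper to a pool instead; wrapper is assumed to be the TaskWrapper built in the execute() sketch above:

import logging
from multiprocessing import Pool

pool = Pool(processes=2)

# `wrapper` is the TaskWrapper from the execute() sketch above.
# The call returns immediately with a multiprocessing.AsyncResult.
async_result = wrapper.execute_using_pool(pool, loglevel=logging.INFO)
print(async_result.get())

pool.close()
pool.join()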

extend_with_default_kwargs(loglevel, logfile)

Extend the task's keyword arguments with the standard task arguments.

These are logfile, loglevel, task_id and task_name.
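
For example, given the wrapper from the execute() sketch above (the logfile path is illustrative):

import logging

# Merges the wrapper's own keyword arguments with the standard ones.
kwargs = wrapper.extend_with_default_kwargs(logging.INFO, "/var/log/celeryd.log")
# kwargs now also contains "logfile", "loglevel", "task_id" and "task_name".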

classmethod from_message(message, message_data, logger)

Create a TaskWrapper from a task message sent by celery.messaging.TaskPublisher.

Raises UnknownTask: if the message does not describe a task; the message is also rejected.
Returns: a TaskWrapper instance.
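
A hedged sketch of the receiving side: message is the carrot message object delivered by the broker, and message_data is its decoded payload as produced by celery.messaging.TaskPublisher; the handler function itself is illustrative, not part of this module:

import logging

from celery.worker import TaskWrapper, UnknownTask

logger = logging.getLogger("celery")

def handle_message(message, message_data):
    try:
        wrapper = TaskWrapper.from_message(message, message_data, logger)
    except UnknownTask:
        # The message did not describe a known task and has been rejected.
        return
    wrapper.execute()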
on_failure(exc_info, meta)
The handler used if the task raised an exception.
on_success(ret_value, meta)
The handler used if the task was successfully processed (without raising an exception).
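
Both handlers are invoked by the wrapper itself; an illustrative sketch of hooking them from a subclass (the contents of meta are treated as opaque here):

from celery.worker import TaskWrapper

class NotifyingTaskWrapper(TaskWrapper):
    # Illustrative subclass; reports outcomes before delegating to the
    # default handlers.

    def on_success(self, ret_value, meta):
        print("task succeeded: %r" % (ret_value, ))
        return TaskWrapper.on_success(self, ret_value, meta)

    def on_failure(self, exc_info, meta):
        print("task failed: %r" % (exc_info, ))
        return TaskWrapper.on_failure(self, exc_info, meta)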
exception celery.worker.UnknownTask
Got an unknown task in the queue. The message is requeued and ignored.
class celery.worker.WorkController(concurrency=None, logfile=None, loglevel=None, is_detached=False)

Executes tasks waiting in the task queue.

Parameters:
concurrency
The number of simultaneous processes doing work (default: celery.conf.DAEMON_CONCURRENCY).
loglevel
The loglevel used (default: logging.INFO).
logfile
The logfile used; if no logfile is specified, stderr is used (default: celery.conf.DAEMON_LOG_FILE).
logger
The logging.Logger instance used for logging.
pool
The multiprocessing.Pool instance used.
task_consumer
The celery.messaging.TaskConsumer instance used.
close_connection()
Close the AMQP connection.
connection_diagnostics()
Diagnose the AMQP connection, and reset the connection if necessary.
process_task(message_data, message)
Process a task message by passing it to the pool of workers.
reset_connection()

Reset the AMQP connection and reinitialize the celery.messaging.TaskConsumer instance stored in task_consumer.

run()
Start the worker's main loop.
shutdown()
Make sure celeryd exits cleanly.
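
A minimal sketch of driving the controller in the foreground, roughly what celeryd does; the concurrency value and logfile path are illustrative:

import logging

from celery.worker import WorkController

worker = WorkController(concurrency=4,
                        loglevel=logging.INFO,
                        logfile="/var/log/celeryd.log",
                        is_detached=False)
try:
    worker.run()          # blocks: consume and execute tasks
except KeyboardInterrupt:
    worker.shutdown()     # make sure celeryd exits cleanly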
celery.worker.jail(task_id, task_name, func, args, kwargs)

Wraps the task in a jail, which catches all exceptions, and saves the status and result of the task execution to the task meta backend.

If the call was successful, it saves the result to the task result backend, and sets the task status to "DONE".

If the call results in an exception, it saves the exception as the task result, and sets the task status to "FAILURE".

Parameters:
  • task_id – The id of the task.
  • task_name – The name of the task.
  • func – Callable object to execute.
  • args – List of positional args to pass on to the function.
  • kwargs – Keyword arguments mapping to pass on to the function.
Returns: the function's return value on success, or the exception instance on failure.
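
A minimal sketch of calling jail() directly; the add callable and the "tasks.add" name are illustrative:

import uuid

from celery.worker import jail

def add(x, y, **kwargs):
    return x + y

# On success the return value is stored and the status set to "DONE";
# on failure the exception instance is returned and the status set to "FAILURE".
result = jail(str(uuid.uuid4()), "tasks.add", add, [2, 2], {})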