.. _next-steps:

============
Next Steps
============

The :ref:`first-steps` guide is intentionally minimal. In this guide
we will demonstrate what Celery offers in more detail, including
how to add Celery support for your application and library.

.. contents::
    :local:
    :depth: 1

Using Celery in your Application
================================

.. _project-layout:

Our Project
-----------

Project layout::

    proj/__init__.py
        /celery.py
        /tasks.py

:file:`proj/celery.py`
~~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: ../../examples/next-steps/proj/celery.py
    :language: python

In this module we created our :class:`@Celery` instance (sometimes
referred to as the *app*).  To use Celery within your project
you simply import this instance.  A rough sketch of what the module
looks like is shown after the list below.

- The ``broker`` argument specifies the URL of the broker to use.

  See :ref:`celerytut-broker` for more information.

- The ``backend`` argument specifies the result backend to use.
  It's used to keep track of task state and results.

  While results are disabled by default we use the amqp backend here
  to demonstrate how retrieving results works; you may want to use
  a different backend for your application, as they all have different
  strengths and weaknesses.  If you don't need results it's best
  to disable them.  Results can also be disabled for individual tasks
  by setting the ``@task(ignore_result=True)`` option.

  See :ref:`celerytut-keeping-results` for more information.

- The ``include`` argument is a list of modules to import when
  the worker starts.  We need to add our tasks module here so
  that the worker is able to find our tasks.
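
For readers following along without the example source tree, here is a
rough sketch of what :file:`proj/celery.py` contains; the broker/backend
URLs and the optional result-expiry setting are illustrative assumptions:

.. code-block:: python

    from __future__ import absolute_import

    from celery import Celery

    celery = Celery('proj.celery',
                    broker='amqp://',        # assumed broker URL (see the broker section)
                    backend='amqp://',       # amqp result backend, as discussed above
                    include=['proj.tasks'])  # modules to import when the worker starts

    # Optional configuration, see the application user guide.
    celery.conf.update(
        CELERY_TASK_RESULT_EXPIRES=3600,
    )

    if __name__ == '__main__':
        celery.start()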

:file:`proj/tasks.py`
~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: ../../examples/next-steps/proj/tasks.py
    :language: python
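
Again, a sketch for reference; the concrete task functions are assumed
examples (``add`` is the one used later in this guide), but the overall
shape -- plain functions decorated with the app's ``task`` decorator -- is
what the ``include`` list above points the worker at:

.. code-block:: python

    from __future__ import absolute_import

    from proj.celery import celery


    @celery.task
    def add(x, y):
        return x + y


    @celery.task
    def mul(x, y):
        return x * y


    @celery.task
    def xsum(numbers):
        return sum(numbers)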

Starting the worker
-------------------

The :program:`celery` program can be used to start the worker::

    $ celery worker --app=proj -l info

When the worker starts you should see a banner and some messages::

    [... ASCII-art startup banner listing the broker URL, app instance,
     concurrency, events setting and queues ...]

    [2012-06-08 16:23:51,078: WARNING/MainProcess] celery@halcyon.local has started.

-- The *broker* is the URL you specified in the ``broker`` argument in our ``celery``
module, you can also specify a different broker on the command line by using
the :option:`-b` option.

-- *Concurrency* is the number of multiprocessing worker processes used
to process your tasks concurrently; when all of these are busy doing work
new tasks will have to wait for one of the tasks to finish before
they can be processed.

The default concurrency number is the number of CPUs on that machine
(including cores); you can specify a custom number using the :option:`-c` option.
There is no recommended value, as the optimal number depends on a number of
factors, but if your tasks are mostly I/O-bound then you can try to increase
it.  Experimentation has shown that adding more than twice the number
of CPUs is rarely effective, and likely to degrade performance
instead.

In addition to the default multiprocessing pool, Celery also supports using
Eventlet, Gevent, and threads (see :ref:`concurrency`).
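
If you prefer to fix the concurrency or pool in code rather than on the
command line, the corresponding configuration settings can be set on the
app; a minimal sketch, assuming the ``celery`` instance from
:file:`proj/celery.py` (the values are only examples):

.. code-block:: python

    from proj.celery import celery

    # Roughly equivalent to passing -c 10 and -P eventlet to celery worker.
    celery.conf.update(
        CELERYD_CONCURRENCY=10,    # number of worker processes/threads
        CELERYD_POOL='eventlet',   # use the Eventlet pool instead of multiprocessing
    )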

-- *Events* is an option that, when enabled, causes Celery to send
monitoring messages (events) for actions occurring in the worker.
These can be used by monitor programs like ``celery events``,
celerymon, and the Django-Celery admin monitor that you can read
about in the :ref:`Monitoring and Management guide <guide-monitoring>`.

-- *Queues* is the list of queues that the worker will consume
tasks from.  The worker can be told to consume from several queues
at once, and this is used to route messages to specific workers
as a means for Quality of Service, separation of concerns,
and emulating priorities, all described in the :ref:`Routing Guide
<guide-routing>`.

You can get a complete list of command line arguments
by passing in the ``--help`` flag::

    $ celery worker --help

These options are described in more detail in the :ref:`Workers Guide <guide-workers>`.

.. sidebar:: About the :option:`--app` argument

    The :option:`--app` argument specifies the Celery app instance to use;
    it must be in the form of ``module.path:celery``, where the part before the colon
    is the name of the module, and the attribute name comes last.

    If a package name is specified instead it will automatically
    try to find a ``celery`` module in that package, and if the name
    is a module it will try to find a ``celery`` attribute in that module.

    This means that these are all equal::

        $ celery --app=proj
        $ celery --app=proj.celery:
        $ celery --app=proj.celery:celery

.. _designing-workflows:

*Canvas*: Designing Workflows
=============================

A :func:`~celery.subtask` wraps the signature of a single task invocation:
arguments, keyword arguments and execution options.

A subtask for the ``add`` task can be created like this::

    >>> from celery import subtask
    >>> subtask(add.name, args=(4, 4))

or you can create one from the task itself::

    >>> from proj.tasks import add

    >>> add.subtask(args=(4, 4))

It takes the same arguments as the :meth:`~@Task.apply_async` method::

    >>> add.apply_async(args, kwargs, **options)
    >>> add.subtask(args, kwargs, **options)

    >>> add.apply_async((2, 2), countdown=1)
    >>> add.subtask((2, 2), countdown=1)

And just as there is a :meth:`~@Task.delay` shortcut for `apply_async`,
there is an :meth:`~@Task.s` shortcut for subtask::

    >>> add.s(2, 2)
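
To make the difference between calling a task and building a signature
concrete, here is a small interactive sketch.  It assumes the ``add`` task
from :file:`proj/tasks.py`, a running worker, and a configured result
backend; the final value is simply what ``add(2, 2)`` returns::

    >>> from proj.tasks import add

    >>> s = add.s(2, 2)          # builds the signature, nothing is sent yet
    >>> result = s.delay()       # now sent to the broker, like add.delay(2, 2)
    >>> result.get(timeout=10)   # wait for the worker to finish
    4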