.. _next-steps:

============
Next Steps
============

The :ref:`first-steps` guide is intentionally minimal. In this guide
we will demonstrate what Celery offers in more detail, including
how to add Celery support for your application and library.

.. contents::
    :local:
    :depth: 1

Using Celery in your Application
================================

.. _project-layout:

Our Project
-----------

Project layout::

    proj/__init__.py
        /celery.py
        /tasks.py

:file:`proj/celery.py`
~~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: ../../examples/next-steps/proj/celery.py
    :language: python

In this module we created our :class:`@Celery` instance (sometimes
referred to as the *app*). To use Celery within your project
you simply import this instance.

- The ``broker`` argument specifies the URL of the broker to use.

  See :ref:`celerytut-broker` for more information.

- The ``backend`` argument specifies the result backend to use.

  It's used to keep track of task state and results. While results
  are disabled by default, we use the amqp backend here to demonstrate
  how retrieving results works. You may want to use a different
  backend for your application, as they all have different strengths
  and weaknesses. If you don't need results, it's best to disable
  them. Results can also be disabled for individual tasks by setting
  the ``@task(ignore_result=True)`` option.

  See :ref:`celerytut-keeping-results` for more information.

- The ``include`` argument is a list of modules to import when
  the worker starts. We need to add our tasks module here so
  that the worker is able to find our tasks.
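For reference, a minimal app module consistent with the arguments
described above might look like the following sketch (the broker and
backend URLs are placeholders; adjust them for your own setup):

```python
from celery import Celery

# Create the app instance. The first argument is the name of the
# current module, used when generating automatic task names.
celery = Celery('proj',
                broker='amqp://guest@localhost//',
                backend='amqp',
                include=['proj.tasks'])

if __name__ == '__main__':
    celery.start()
```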
:file:`proj/tasks.py`
~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: ../../examples/next-steps/proj/tasks.py
    :language: python
Starting the worker
-------------------

The :program:`celery` program can be used to start the worker::

    $ celery worker --app=proj -l info

When the worker starts you should see a banner and some messages::

     -------------- celery@halcyon.local v2.6.0rc4
     ---- **** -----
     --- * ***  * -- [Configuration]
     -- * - **** ---   . broker:      amqp://guest@localhost:5672//
     - ** ----------   . app:         __main__:0x1012d8590
     - ** ----------   . concurrency: 8 (processes)
     - ** ----------   . events:      OFF (enable -E to monitor this worker)
     - ** ----------
     - *** --- * ---   [Queues]
     -- ******* ----   . celery:     exchange:celery(direct) binding:celery
     --- ***** -----

    [2012-06-08 16:23:51,078: WARNING/MainProcess] celery@halcyon.local has started.
- The *broker* is the URL you specified in the broker argument in our
  ``celery`` module. You can also specify a different broker on the
  command line by using the :option:`-b` option.

- *Concurrency* is the number of multiprocessing worker processes used
  to process your tasks concurrently. When all of these are busy doing
  work, new tasks will have to wait for one of the tasks to finish
  before they can be processed.

  The default concurrency number is the number of CPUs on that machine
  (including cores). You can specify a custom number using the
  :option:`-c` option. There is no recommended value, as the optimal
  number depends on a number of factors, but if your tasks are mostly
  I/O-bound then you can try to increase it. Experimentation has shown
  that adding more than twice the number of CPUs is rarely effective,
  and likely to degrade performance instead.

  In addition to the default multiprocessing pool, Celery also supports
  using Eventlet, Gevent, and threads (see :ref:`concurrency`).

- *Events* is an option that, when enabled, causes Celery to send
  monitoring messages (events) for actions occurring in the worker.
  These can be used by monitor programs like ``celery events``,
  celerymon, and the Django-Celery admin monitor that you can read
  about in the :ref:`Monitoring and Management guide <guide-monitoring>`.

- *Queues* is the list of queues that the worker will consume tasks
  from. The worker can be told to consume from several queues at once,
  and this is used to route messages to specific workers as a means for
  Quality of Service, separation of concerns, and emulating priorities,
  all described in the :ref:`Routing Guide <guide-routing>`.
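As a usage sketch, the concurrency and events settings above can be
changed when starting the worker (adjust ``--app`` to point at your
own project):

```shell
# Start a worker with 10 pool processes and event messages enabled.
celery worker --app=proj -c 10 -E
```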
You can get a complete list of command line arguments
by passing in the ``--help`` flag::

    $ celery worker --help

These options are described in more detail in the :ref:`Workers Guide <guide-workers>`.
.. sidebar:: About the :option:`--app` argument

    The :option:`--app` argument specifies the Celery app instance
    to use. It must be in the form of ``module.path:celery``, where
    the part before the colon is the name of the module, and the
    attribute name comes last.

    If a package name is specified instead, it will automatically
    try to find a ``celery`` module in that package, and if the name
    is a module it will try to find a ``celery`` attribute in that
    module. This means that these are all equal::

        $ celery --app=proj

        $ celery --app=proj.celery

        $ celery --app=proj.celery:celery
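The ``module.path:attribute`` lookup described in the sidebar can be
sketched in plain Python. This is a simplified illustration, not
Celery's actual implementation; to keep it self-contained it resolves
``os.path:join`` from the standard library rather than a Celery app:

```python
import importlib


def resolve_app(spec, default_attr='celery'):
    """Resolve a ``module.path:attribute`` specification.

    A simplified sketch of the lookup described above: split on the
    colon, import the module part, then fetch the named attribute
    (falling back to a default attribute name when none is given).
    """
    module_name, _, attr = spec.partition(':')
    module = importlib.import_module(module_name)
    return getattr(module, attr or default_attr)


# Standard-library example, so the sketch runs without Celery installed:
join = resolve_app('os.path:join')
print(join('proj', 'tasks.py'))  # 'proj/tasks.py' on POSIX systems
```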
.. _designing-workflows:

*Canvas*: Designing Workflows
=============================