=======================================================
 Example Celery project using a database message queue
=======================================================

Short instructions
==================

Quick rundown of the tutorial:

1. Install the `ghettoq`_ plugin::

       $ pip install ghettoq
       $ celeryinit

2. Open up two terminals. In the first, run::

       $ celeryd --loglevel=INFO

   In the second, run the test program::

       $ python ./test.py

Voila, you've executed some tasks!

Instructions
============

This example uses the database as a message queue (commonly called a "ghetto
queue"). Excellent for testing, but not suitable for production
installations.

To try it out you have to install the `GhettoQ`_ package first::

    $ pip install ghettoq

This package is an add-on to `Carrot`_, the messaging abstraction celery
uses. The add-on enables the use of databases as message queues. Currently it
supports `Redis`_ and relational databases via the Django ORM.

.. _`ghettoq`: http://pypi.python.org/pypi/ghettoq
.. _`Carrot`: http://pypi.python.org/pypi/carrot
.. _`Redis`: http://code.google.com/p/redis/

The provided `celeryconfig.py` configures the settings used to drive celery.
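
For reference, the configuration for this setup is typically along these
lines (a sketch assuming the Django-style database settings and the
`ghettoq.taproot.Database` transport are used; the `celeryconfig.py` shipped
with this example is the authoritative version)::

    # Use the GhettoQ database transport as the message queue.
    CARROT_BACKEND = "ghettoq.taproot.Database"

    # SQLite database file used as the queue.
    DATABASE_ENGINE = "sqlite3"
    DATABASE_NAME = "celery.db"

    # Modules the worker imports to find task definitions.
    CELERY_IMPORTS = ("tasks", )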

Next we have to create the database tables by issuing the `celeryinit`
command::

    $ celeryinit

We're using SQLite3, so this creates a database file (`celery.db`, as
specified in the config file). SQLite is great, but when used in combination
with Django it doesn't handle concurrency well. To protect your program from
lock problems, celeryd will only spawn one worker process. With other
database drivers you can specify as many worker processes as you want.
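
If you do switch to another database driver, the worker count can be raised
from the command line, for example (assuming your celeryd version provides
the usual `--concurrency` option)::

    $ celeryd --loglevel=INFO --concurrency=8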

With the setup done, let's run the worker::

    $ celeryd --loglevel=INFO

You should see the worker starting up. As it will continue running in
the foreground, we have to open up another terminal to run our test program::

    $ python test.py

The test program simply runs the `add` task, which adds two numbers
together. You can also run the task manually if you want::

    >>> from tasks import add
    >>> result = add.delay(4, 4)
    >>> result.wait()
    8
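
For reference, the `add` task in `tasks.py` is essentially of this shape (a
sketch assuming the `celery.decorators.task` decorator of the Celery versions
this example targets; see the bundled file for the exact definition)::

    from celery.decorators import task

    @task()
    def add(x, y):
        # Trivial task: add the two arguments and return the sum.
        return x + y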

Using Redis instead
===================

To use Redis instead, you have to configure the following directives in
`celeryconfig.py`::

    CARROT_BACKEND = "ghettoq.taproot.Redis"
    BROKER_HOST = "localhost"
    BROKER_PORT = 6379
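
Note that the Redis transport also needs a Redis server listening on the
configured host and port, plus a Python Redis client library installed (an
assumption here is that GhettoQ's Redis backend builds on the standard
`redis` client)::

    $ pip install redis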

Modules
=======

* celeryconfig.py

  The celery configuration module.

* tasks.py

  Tasks are defined in this module. This module is automatically
  imported by the worker because it's listed in
  celeryconfig's `CELERY_IMPORTS` directive.

* test.py

  Simple test program running tasks (see the sketch after this list).
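
A minimal `test.py` along these lines would exercise the task end to end (a
sketch; see the bundled file for the exact program)::

    from tasks import add

    # Queue the task, then block until the worker has processed it.
    result = add.delay(4, 4)
    print("4 + 4 = %s" % result.wait())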

More information
================

http://celeryproject.org