============================
Frequently Asked Questions
============================

Questions
=========

MySQL is throwing deadlock errors, what can I do?
--------------------------------------------------
**Answer:** MySQL's default transaction isolation level is ``REPEATABLE-READ``;
if you don't really need that, set it to ``READ-COMMITTED`` instead.
You can do that by adding the following to your ``my.cnf``::

    [mysqld]
    transaction-isolation = READ-COMMITTED

For more information about InnoDB's transaction model, see `MySQL - The InnoDB
Transaction Model and Locking`_ in the MySQL user manual.

(Thanks to Honza Kral and Anton Tsigularov for this solution)

.. _`MySQL - The InnoDB Transaction Model and Locking`: http://dev.mysql.com/doc/refman/5.1/en/innodb-transaction-model.html
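
If you can't change ``my.cnf`` (for example on a shared database server), a
per-connection sketch is shown below. This is an alternative not covered by the
answer above, and it assumes the ``mysql`` Django backend with MySQLdb, where
``DATABASE_OPTIONS`` is forwarded to ``MySQLdb.connect()`` and ``init_command``
is executed for each new connection:

.. code-block:: python

    # settings.py -- hedged sketch: assumes the "mysql" backend with
    # MySQLdb; DATABASE_OPTIONS is passed on to MySQLdb.connect(), and
    # init_command runs once for every new database connection.
    DATABASE_OPTIONS = {
        "init_command": "SET SESSION TRANSACTION ISOLATION LEVEL READ COMMITTED",
    }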
celeryd is not doing anything, just hanging
--------------------------------------------

**Answer:** See `MySQL is throwing deadlock errors, what can I do?`_.

I'm having ``IntegrityError: Duplicate Key`` errors. Why?
----------------------------------------------------------

**Answer:** See `MySQL is throwing deadlock errors, what can I do?`_.

(Thanks to howsthedotcom.)
Why won't my Task run?
----------------------

**Answer:** Did you register the task in the application's ``tasks.py`` module?
(or in some other module Django loads by default, like ``models.py``?)
There might also be syntax errors preventing the tasks module from being imported.

You can find out if the celery daemon is able to run the task by executing it
manually:

    >>> from myapp.tasks import MyPeriodicTask
    >>> MyPeriodicTask.delay()

Watch the celery daemon's log file (or its output, if not running as a daemon)
to see whether it is able to find the task, or if some other error is happening.
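
For reference, a minimal ``tasks.py`` along the lines the example above assumes
might look like the sketch below (the ``PeriodicTask`` base class, the
``run_every`` attribute and the ``tasks.register()`` call are assumptions here;
adjust the names to your application):

.. code-block:: python

    # myapp/tasks.py -- minimal sketch; adapt names to your application.
    from datetime import timedelta

    from celery.task import PeriodicTask
    from celery.registry import tasks


    class MyPeriodicTask(PeriodicTask):
        run_every = timedelta(seconds=30)

        def run(self, **kwargs):
            # get_logger() writes to the celery daemon's log, so a
            # successful run shows up where the answer above says to look.
            logger = self.get_logger(**kwargs)
            logger.info("Running MyPeriodicTask")


    tasks.register(MyPeriodicTask)

Importing the module in a Django shell (``python manage.py shell``) first is a
quick way to catch the kind of syntax errors mentioned above before the celery
daemon tries to load it.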
Why won't my Periodic Task run?
-------------------------------

**Answer:** See `Why won't my Task run?`_.

How do I discard all waiting tasks?
------------------------------------

**Answer:** Use ``celery.task.discard_all()``, like this:

    >>> from celery.task import discard_all
    >>> discard_all()
    1753

The number ``1753`` is the number of messages deleted.

You can also start celeryd with the ``--discard`` argument, which will
accomplish the same thing.
I've discarded messages, but there are still messages left in the queue?
--------------------------------------------------------------------------

**Answer:** Tasks are acknowledged (removed from the queue) as soon as they are
actually executed. After the worker has received a task, it will take some time
until it is actually executed, especially if there are a lot of tasks already
waiting for execution. Messages that are not acknowledged are held on to by the
worker until it closes the connection to the broker (AMQP server). When that
connection is closed (e.g. because the worker was stopped), the tasks will be
re-sent by the broker to the next available worker (or to the same worker when
it has been restarted). So to properly purge the queue of waiting tasks, you
have to stop all the workers and then discard the tasks using ``discard_all``.
Can I send some tasks to only some servers?
--------------------------------------------

**Answer:** As of now there is only one use-case that works like this, and that
is tasks of type ``A`` can be sent to servers ``x`` and ``y``, while tasks of
type ``B`` can be sent to server ``z``. One server can't handle more than one
``routing_key``, but this is coming in a later release.

Say you have two servers, ``x`` and ``y``, that handle regular tasks, and one
server ``z`` that only handles feed-related tasks. You can use this
configuration:

* Servers ``x`` and ``y``: settings.py:

  .. code-block:: python

      AMQP_SERVER = "rabbit"
      AMQP_PORT = 5678
      AMQP_USER = "myapp"
      AMQP_PASSWORD = "secret"
      AMQP_VHOST = "myapp"

      CELERY_AMQP_CONSUMER_QUEUE = "regular_tasks"
      CELERY_AMQP_EXCHANGE = "tasks"
      CELERY_AMQP_PUBLISHER_ROUTING_KEY = "task.regular"
      CELERY_AMQP_CONSUMER_ROUTING_KEY = "task.#"
      CELERY_AMQP_EXCHANGE_TYPE = "topic"
* Server ``z``: settings.py:

  .. code-block:: python

      AMQP_SERVER = "rabbit"
      AMQP_PORT = 5678
      AMQP_USER = "myapp"
      AMQP_PASSWORD = "secret"
      AMQP_VHOST = "myapp"

      CELERY_AMQP_EXCHANGE = "tasks"
      CELERY_AMQP_PUBLISHER_ROUTING_KEY = "task.regular"
      CELERY_AMQP_EXCHANGE_TYPE = "topic"

      # These are the settings that differ on this server:
      CELERY_AMQP_CONSUMER_QUEUE = "feed_tasks"
      CELERY_AMQP_CONSUMER_ROUTING_KEY = "feed.#"
Now to make a Task run on the ``z`` server, you need to set its ``routing_key``
attribute so that it matches the ``"feed.#"`` consumer binding above, i.e. so it
starts with ``"feed."``:

.. code-block:: python

    from feedaggregator.models import Feed
    from celery.task import Task

    class FeedImportTask(Task):
        name = "import_feed"
        routing_key = "feed.importer"

        def run(self, feed_url):
            # something importing the feed
            Feed.objects.import_feed(feed_url)
You can also override this using the ``routing_key`` argument to
:func:`celery.task.apply_async`:

    >>> from celery.task import apply_async
    >>> from myapp.tasks import RefreshFeedTask
    >>> apply_async(RefreshFeedTask, args=["http://cnn.com/rss"],
    ...             routing_key="feed.importer")