
Merge branch 'runeh/master'

Ask Solem 15 years ago
parent commit: b16801d9aa

+ 52 - 35
FAQ

@@ -2,6 +2,17 @@
  Frequently Asked Questions
 ============================
 
+General
+=======
+
+What kinds of things should I use celery for
+--------------------------------------------
+
+**Answer:** Anything asynchronous.
+
+fixme: long answer with some examples of sensible uses goes here
+
+
 Misconceptions
 ==============
 
@@ -17,7 +28,7 @@ content-type. The default serialization scheme is pickle because it's the most
 used, and it has support for sending complex objects as task arguments.
 
 You can set a global default serializer, the default serializer for a
-particular Task, and even what serializer to use when sending a single task
+particular Task, or even what serializer to use when sending a single task
 instance.
 
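The three levels just described (global default, per-Task, per-call) amount to a precedence rule, which can be sketched as below. This is an illustrative stand-in, not celery code; the only detail taken from the FAQ is that the global default is ``pickle``.

```python
# Illustrative precedence sketch (NOT celery's implementation): a
# serializer named for a single call wins over the Task's own setting,
# which wins over the global default ("pickle", per the FAQ above).
GLOBAL_DEFAULT_SERIALIZER = "pickle"

def pick_serializer(call_serializer=None, task_serializer=None):
    """Return the serializer that would be used for one task message."""
    return call_serializer or task_serializer or GLOBAL_DEFAULT_SERIALIZER
```

So a per-call value such as ``pick_serializer("msgpack", "json")`` overrides a Task-level ``"json"``, which in turn overrides the global ``"pickle"``.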
 Is celery for Django only?
@@ -25,28 +36,31 @@ Is celery for Django only?
 
 **Answer:** No.
 
-While django itself is a dependency, you can still use all of celerys features
-outside of a django project.
+While django itself is a dependency, you can still use all of celery's features
+outside of a django project. fixme: question about removing the dependency
+
+
 
 Do I have to use AMQP/RabbitMQ?
 -------------------------------
 
 **Answer**: No.
 
-You can also use Redis or an SQL database, for instructions see `Using other
+You can also use Redis or an SQL database, see `Using other
 queues`_.
 
 .. _`Using other queues`:
     http://ask.github.com/celery/tutorials/otherqueues.html
 
-Redis or a database won't meet up to the standards
-of an AMQP broker. If you have strict reliability requirements you are
-encouraged to use RabbitMQ or another AMQP broker. Redis/database also uses
-pulling, so they are likely to consume more resources. However, if you for
-some reason is not able to use AMQP, feel free to use these alternatives.
+Redis or a database won't perform as well as
+an AMQP broker. If you have strict reliability requirements you are
+encouraged to use RabbitMQ or another AMQP broker. Redis/database also use
+polling, so they are likely to consume more resources. However, if you for
+some reason are not able to use AMQP, feel free to use these alternatives.
 They will probably work fine for most use cases, and note that the above
 points are not specific to celery; If using Redis/database as a queue worked
-fine for you before, it probably will now. And you can always upgrade later.
+fine for you before, it probably will now. You can always upgrade later
+if you need to.
 
 Is celery multi-lingual?
 ------------------------
@@ -61,7 +75,7 @@ messages. There's no other communication involved.
 Also, there's another way to be language indepedent, and that is to use REST
 tasks, instead of your tasks being functions, they're URLs. With this
 information you can even create simple web servers that enable preloading of
-code. For more information about REST tasks see: `User Guide: Remote Tasks`_.
+code. See: `User Guide: Remote Tasks`_.
 
 .. _`User Guide: Remote Tasks`:
     http://ask.github.com/celery/userguide/remote-tasks.html
@@ -119,7 +133,7 @@ I'm having ``IntegrityError: Duplicate Key`` errors. Why?
 **Answer:** See `MySQL is throwing deadlock errors, what can I do?`_.
 Thanks to howsthedotcom.
 
-Why isn't my tasks processed?
+Why aren't my tasks processed?
 -----------------------------
 **Answer:** With RabbitMQ you can see how many consumers are currently
 receiving tasks by running the following command::
@@ -137,7 +151,7 @@ wasn't properly shut down.
 
 When a message is recieved by a worker the broker waits for it to be
 acknowledged before marking the message as processed. The broker will not
-re-send that message to another consumer until the consumer is shutdown
+re-send that message to another consumer until the consumer is shut down
 properly.
 
 If you hit this problem you have to kill all workers manually and restart
@@ -145,12 +159,15 @@ them::
 
     ps auxww | grep celeryd | awk '{print $2}' | xargs kill
 
-You might have to wait a while until all workers has finished the work they're
-doing, if it's still hanging after a long time you can kill them by force
+You might have to wait a while until all workers have finished the work they're
+doing. If it's still hanging after a long time you can kill them by force
 with::
 
     ps auxww | grep celeryd | awk '{print $2}' | xargs kill -9
 
+fixme: killall wont work?
+
+
 Why won't my Task run?
 ----------------------
 
@@ -206,9 +223,9 @@ Brokers
 Can I use celery with ActiveMQ/STOMP?
 -------------------------------------
 
-**Answer**: Yes. But this is somewhat experimental for now.
-It is certainly working ok for me in a test configuration, but it has not
-been tested in production like RabbitMQ. If you have any problems with
+**Answer**: Yes, but this is somewhat experimental for now.
+It is working ok in a test configuration, but it has not
+been tested in production like RabbitMQ has. If you have any problems with
 using STOMP and celery, please report the bugs to the issue tracker:
 
     http://github.com/ask/celery/issues/
@@ -270,7 +287,7 @@ Use the following specific settings in your ``settings.py``:
 Now you can go on reading the tutorial in the README, ignoring any AMQP
 specific options.
 
-Which features are not supported when using STOMP?
+What features are not supported when using STOMP?
 --------------------------------------------------
 
 This is a (possible incomplete) list of features not available when
@@ -400,7 +417,7 @@ just specify a custom exchange and exchange type:
 Easy? No? If you're confused about these terms, you should read up on
 AMQP and RabbitMQ. It might be hard to grok the concepts of
 queues, exchanges and routing/binding keys at first, but it's all very simple,
-I assure you.
+I assure you. fixme: too colloquial perhaps? Maybe add links to docs
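
For readers who want a concrete picture of "a custom exchange and exchange type", a hedged sketch of what such a ``celeryconfig`` entry looked like in this era follows. The queue and exchange names are invented for illustration, and the exact shape of the ``CELERY_QUEUES`` mapping should be checked against the routing documentation.

```python
# Hypothetical routing sketch: a queue bound to a custom topic exchange.
# Queue/exchange names here are made up; only the general idea (queue ->
# exchange + exchange_type + binding_key) is taken from the text above.
CELERY_QUEUES = {
    "feed_tasks": {
        "exchange": "feeds",       # custom exchange name
        "exchange_type": "topic",  # route messages by pattern matching
        "binding_key": "feed.#",   # matches e.g. a "feed.import" routing key
    },
}
```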
 
 Can I use celery without Django?
 --------------------------------
@@ -408,7 +425,7 @@ Can I use celery without Django?
 **Answer:** Yes.
 
 Celery uses something called loaders to read/setup configuration, import
-modules that registers tasks and to decide what happens when a task is
+modules that register tasks and to decide what happens when a task is
 executed. Currently there are two loaders, the default loader and the Django
 loader. If you want to use celery without a Django project, you either have to
 use the default loader, or write a loader of your own.
@@ -416,7 +433,7 @@ use the default loader, or write a loader of your own.
 The rest of this answer describes how to use the default loader.
 
 First of all, installation. You need to get the development version of
-celery from github::
+celery from github:: fixme: even in 1.0?
 
     $ git clone git://github.com/ask/celery.git
     $ cd celery
@@ -434,7 +451,7 @@ whatever::
 You need a configuration file named ``celeryconfig.py``, either in the
 directory you run ``celeryd`` in, or in a Python library path where it is
 able to find it. The configuration file can contain any of the settings
-described in :mod:`celery.conf`, and in additional if you're using the
+described in :mod:`celery.conf`. In addition, if you're using the
 database backend you have to configure the database. Here is an example
 configuration using the database backend with MySQL:
 
@@ -464,12 +481,12 @@ configuration using the database backend with MySQL:
     # is able to find and run them.
     CELERY_IMPORTS = ("mytaskmodule1", "mytaskmodule2")
 
-Now with this configuration file in the current directory you have to
+With this configuration file in the current directory you have to
 run ``celeryinit`` to create the database tables::
 
     $ celeryinit
 
-Then you should be able to successfully run ``celeryd``::
+At this point you should be able to successfully run ``celeryd``::
 
     $ celeryd --loglevel=INFO
 
@@ -484,19 +501,19 @@ and send a task from a python shell (note that it must be able to import
 The celery test-suite is failing
 --------------------------------
 
-**Answer**: You're running tests from your own Django applicaiton, and celerys
-tests are failing and celerys tests are failing in that context?
+**Answer**: You're running tests from your own Django application, and celery's
+tests are failing and celery's tests are failing in that context? fixme: I don't get the preceding sentence
 If so, read on for a trick, if not please report the test failure to our issue
-tracker at GitHub.
+tracker on GitHub.
 
     http://github.com/ask/celery/issues/
 
 That Django is running tests for all applications in ``INSTALLED_APPS``
 is a pet peeve of mine. You should use a test runner that either
 
-    1) Explicitly lists the apps you want to run tests for, or
+    1) Explicitly lists the apps you want to run tests for, or:
 
-    2) make a test runner that skips tests for apps you don't want to run.
+    2) Make a test runner that skips tests for apps you don't want to run.
 
 For example this test runner that celery is using:
 
@@ -544,12 +561,12 @@ Can I change the interval of a periodic task at runtime?
 Does celery support task priorities?
 ------------------------------------
 
-**Answer**: No, or theoretically as AMQP supports priorities but
+**Answer**: No. In theory, yes, as AMQP supports priorities. However
 RabbitMQ doesn't implement them yet.
 
-However the usual way to prioritize work in celery, is to route high priority tasks
-to different servers. In the real world this may actually work better than per. message
-priorities. You can use this in combination with rate limting to achieve a
+The usual way to prioritize work in celery is to route high priority tasks
+to different servers. In the real world this may actually work better than per message
+priorities. You can use this in combination with rate limiting to achieve a
 highly performant system.
 
 Can I schedule tasks to execute at a specific time?
@@ -561,7 +578,7 @@ Can I schedule tasks to execute at a specific time?
 
 However, you can't schedule a periodic task at a specific time yet.
 The good news is, if anyone is willing
-to implement it, it shouldn't be that hard, some pointers to achieve this has
+to implement it, it shouldn't be that hard. Some pointers to achieve this have
 been written here: http://bit.ly/99UQNO
 
 

+ 9 - 9
docs/configuration.rst

@@ -5,9 +5,9 @@
 This document describes the configuration options available.
 
 If you're using celery in a Django project these settings should be defined
-in your projects ``settings.py`` file.
+in the project's ``settings.py`` file.
 
-In a regular Python environment using the default loader you must create
+In a regular Python environment (that is, using the default loader) you must create
 the ``celeryconfig.py`` module and make sure it is available on the
 Python path.
 
@@ -15,8 +15,8 @@ Python path.
 Example configuration file
 ==========================
 
-This is an example configuration file to get you started,
-it should contain all you need to run a basic celery set-up.
+This is an example configuration file to get you started.
+It should contain all you need to run a basic celery set-up.
 
 .. code-block:: python
 
@@ -51,10 +51,10 @@ Concurrency settings
 * CELERYD_PREFETCH_MULTIPLIER
     How many messages to prefetch at a time multiplied by the number of
     concurrent processes. The default is 4 (four messages for each
-    process). The default setting seems pretty good here, but if you have
+    process). The default setting seems pretty good here. However, if you have
     very long running tasks waiting in the queue and you have to start the
     very long running tasks waiting in the queue and you have to start the
-    workers, make note that the first worker to start will receive four times the
-    number of messages initially, which might not be fairly balanced among the
+    workers, note that the first worker to start will receive four times the
+    number of messages initially. Thus the tasks may not be fairly balanced among the
     workers.
 
 
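The arithmetic behind that warning can be made explicit. The helper below is purely illustrative (it is not a celery API); it only restates "multiplier times concurrent processes":

```python
# Illustrative only: size of the initial prefetch window described above.
# With the default multiplier of 4, a worker running 8 processes may
# reserve 4 * 8 = 32 messages before other workers see any of them.
def prefetch_window(concurrent_processes, multiplier=4):
    """Messages a single worker may prefetch at once."""
    return multiplier * concurrent_processes
```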
@@ -83,7 +83,7 @@ Task result backend settings
     * amqp
         Send results back as AMQP messages
         (**WARNING** While very fast, you must make sure you only
-        try to receive the result once).
+        try to receive the result once). fixme: How? where is this documented?
 
 
 .. _`memcached`: http://memcached.org
@@ -97,7 +97,7 @@ Database backend settings
 Please see the Django ORM database settings documentation:
 http://docs.djangoproject.com/en/dev/ref/settings/#database-engine
 
-If you use this backend make sure to initialize the database tables
+If you use this backend, make sure to initialize the database tables
 after configuration. When using celery with a Django project this
 means executing::
 

+ 4 - 3
examples/celery_http_gateway/README.rst

@@ -31,9 +31,10 @@ Then you can use the resulting task-id to get the return value::
     {"task": {"status": "SUCCESS", "result": "pong", "id": "e3a95109-afcd-4e54-a341-16c18fddf64b"}}
 
 
-If you don't want to expose all tasks, you can extend the apply view to only
-accept an whitelist for example, or just make views for every task you want to
-expose, we made on such view for ping in ``views.ping``::
+If you don't want to expose all tasks there are a few possible
+approaches. For instance you can extend the ``apply`` view to only
+accept a whitelist. Another possibility is to just make views for every task you want to
+expose. We made one such view for ping in ``views.ping``::
 
     $ curl http://localhost:8000/ping/
     {"ok": "true", "task_id": "383c902c-ba07-436b-b0f3-ea09cc22107c"}
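
The whitelist approach mentioned in this hunk could be sketched as below. Both the task names and the helper are hypothetical, not part of the gateway's actual code:

```python
# Hypothetical whitelist check for an ``apply``-style gateway view.
# The task names are examples only; a real deployment would list its
# own registered tasks here.
ALLOWED_TASKS = frozenset({"celery.ping", "myapp.tasks.add"})

def is_task_allowed(task_name):
    """Accept only explicitly whitelisted task names."""
    return task_name in ALLOWED_TASKS
```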

+ 8 - 8
examples/ghetto-queue/README.rst

@@ -13,11 +13,11 @@ Quick rundown of the tutorial::
 
     $ celeryinit
 
-2. Open up two terminals, in the first you run:
+2. Open up two terminals. In the first, run:
 
     $ celeryd --loglevel=INFO
 
-  In the other you run the test program:
+  In the second you run the test program:
 
     $ python ./test.py
 
@@ -27,15 +27,15 @@ Instructions
 ============
 
 This example uses the database as a message queue (commonly called a "ghetto
-queue"). Excellent for testing, but not very useful for production
+queue"). Excellent for testing, but not suitable for production
 installations.
 
 To try it out you have to install the `GhettoQ`_ package first::
 
     $ pip install ghettoq
 
-This package is an add on to `Carrot`_; the messaging abstraction celery
-uses, that enables the use of databases as message queues. Currently it
+This package is an add-on to `Carrot`_; the messaging abstraction celery
+uses. The add-on enables the use of databases as message queues. Currently it
 supports `Redis`_ and relational databases via the Django ORM.
 
 .. _`ghettoq`: http://pypi.python.org/pypi/ghettoq
@@ -52,8 +52,8 @@ command::
 
 We're using SQLite3, so this creates a database file (``celery.db`` as
 specified in the config file). SQLite is great, but when used in combination
-with Django it doesn't handle concurrency well, to protect your program from
-lock problems, celeryd will only spawn one worker process. However -- with
+with Django it doesn't handle concurrency well. To protect your program from
+lock problems, celeryd will only spawn one worker process. With
 other database drivers you can specify as many worker processes as you want.
 
 
@@ -79,7 +79,7 @@ numbers. You can also run the task manually if you want::
 Using Redis instead
 ===================
 
-To use redis instead you have to configure the following directives in 
+To use redis instead, you have to configure the following directives in 
 ``celeryconfig.py``::
 
     CARROT_BACKEND = "ghettoq.taproot.Redis"

+ 4 - 4
examples/httpexample/README.rst

@@ -1,14 +1,14 @@
-=======================
+======================
  Webhook Task Example
  Webhook Task Example
-=======================
+======================
 
 This example is a simple Django HTTP service exposing a single task
 multiplying two numbers:
 
-The multiply http callback task is in ``views.py``, mapped to an url using
+The multiply http callback task is in ``views.py``, mapped to a URL using
 ``urls.py``.
 
-There's no models, so to start it do::
+There are no models, so to start it do::
 
     $ python manage.py runserver
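
As a sketch of the webhook contract this example implements: HTTP callback tasks of this era replied with JSON of the form ``{"status": "success", "retval": ...}``. The helper below only builds that payload; it is illustrative and is not the actual ``views.py`` code.

```python
# Illustrative payload builder for a multiply webhook task; NOT the
# actual views.py. The {"status": ..., "retval": ...} shape follows the
# HTTP-callback-task convention described in celery's remote-task docs.
def multiply_payload(x, y):
    """Build the JSON-serializable reply for multiplying two numbers."""
    try:
        return {"status": "success", "retval": int(x) * int(y)}
    except (TypeError, ValueError) as exc:
        return {"status": "failure", "reason": str(exc)}
```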