
Always capitalize Celery

Ask Solem, 8 years ago
commit 825e6a8708

+ 1 - 1
CONTRIBUTING.rst

@@ -206,7 +206,7 @@ spelling or other errors on the website/docs/code.
       hard to get or might not be that useful. Try to inspect the process to
       get more diagnostic data. Some ideas:
 
-       * Enable celery's ``breakpoint_signal`` and use it
+       * Enable Celery's ``breakpoint_signal`` and use it
         to inspect the process's state. This will allow you to open a
         ``pdb`` session.
       * Collect tracing data using `strace`_(Linux),

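The ``breakpoint_signal`` mentioned above boils down to a standard POSIX pattern: install a handler for a chosen signal that drops into a debugger. A minimal stdlib sketch of the general technique — the signal number and handler are illustrative assumptions, not Celery's actual implementation:

```python
import signal

received = []

def open_debugger(signum, frame):
    # A real handler would start a debugger here, e.g.
    # pdb.Pdb().set_trace(frame); this sketch just records the signal.
    received.append(signum)

# SIGUSR1 is an arbitrary choice for this sketch.
signal.signal(signal.SIGUSR1, open_debugger)
signal.raise_signal(signal.SIGUSR1)  # simulates `kill -USR1 <pid>`
```

The key point is that the process keeps running normally until the signal arrives, so you can attach a debugger only when something looks stuck.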
+ 2 - 2
README.rst

@@ -1,5 +1,5 @@
 =================================
- celery - Distributed Task Queue
+ Celery - Distributed Task Queue
 =================================
 
 .. image:: http://cloud.github.com/downloads/celery/celery/celery_128.png
@@ -372,7 +372,7 @@ Getting Help
 Mailing list
 ------------
 
-For discussions about the usage, development, and future of celery,
+For discussions about the usage, development, and future of Celery,
 please join the `celery-users`_ mailing list.
 
 .. _`celery-users`: http://groups.google.com/group/celery-users/

+ 1 - 1
docs/contributing.rst

@@ -206,7 +206,7 @@ spelling or other errors on the website/docs/code.
       hard to get or might not be that useful. Try to inspect the process to
       get more diagnostic data. Some ideas:
 
-       * Enable celery's :ref:`breakpoint signal <breakpoint_signal>` and use it
+       * Enable Celery's :ref:`breakpoint signal <breakpoint_signal>` and use it
         to inspect the process's state. This will allow you to open a
         :mod:`pdb` session.
       * Collect tracing data using `strace`_(Linux),

+ 3 - 3
docs/django/first-steps-with-django.rst

@@ -63,7 +63,7 @@ for the :program:`celery` command-line program:
     os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
 
 You don't need this line, but it saves you from always passing in the
-settings module to the celery program. It must always come before
+settings module to the ``celery`` program. It must always come before
 creating the app instances, which is what we do next:
 
 .. code-block:: python
@@ -155,7 +155,7 @@ To use this with your project you need to follow these four steps:
 
 2. Add ``djcelery`` to ``INSTALLED_APPS``.
 
-3. Create the celery database tables.
+3. Create the Celery database tables.
 
     This step will create the tables used to store results
     when using the database result backend and the tables used
@@ -168,7 +168,7 @@ To use this with your project you need to follow these four steps:
 
         $ python manage.py migrate djcelery
 
-4. Configure celery to use the :pypi:`django-celery` backend.
+4. Configure Celery to use the :pypi:`django-celery` backend.
 
     For the database backend you must use:
 

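The ``os.environ.setdefault`` call touched in this file only supplies a fallback value. A quick stdlib demonstration of that semantics — the ``proj.settings_test`` module name is invented for the example:

```python
import os

# A default is only applied when the variable is unset...
os.environ.pop('DJANGO_SETTINGS_MODULE', None)
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
assert os.environ['DJANGO_SETTINGS_MODULE'] == 'proj.settings'

# ...while an explicitly exported value always wins.
os.environ['DJANGO_SETTINGS_MODULE'] = 'proj.settings_test'
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
assert os.environ['DJANGO_SETTINGS_MODULE'] == 'proj.settings_test'
```

This is why the docs call it a convenience rather than a requirement: a user who exports the variable in the shell overrides the hard-coded default.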
+ 1 - 1
docs/faq.rst

@@ -764,7 +764,7 @@ create a new schedule subclass and override
 
 
 .. _faq-task-priorities:
 .. _faq-task-priorities:
 
 
-Does celery support task priorities?
+Does Celery support task priorities?
 ------------------------------------
 ------------------------------------
 
 
 **Answer**: Yes.
 **Answer**: Yes.

+ 1 - 1
docs/getting-started/brokers/rabbitmq.rst

@@ -45,7 +45,7 @@ see `Installing RabbitMQ on macOS`_.
 Setting up RabbitMQ
 -------------------
 
-To use celery we need to create a RabbitMQ user, a virtual host and
+To use Celery we need to create a RabbitMQ user, a virtual host and
 allow that user access to that virtual host:
 
 .. code-block:: console

+ 2 - 2
docs/getting-started/first-steps-with-celery.rst

@@ -103,7 +103,7 @@ with standard Python tools like ``pip`` or ``easy_install``:
 Application
 ===========
 
-The first thing you need is a Celery instance, which is called the celery
+The first thing you need is a Celery instance, which is called the Celery
 application or just "app" for short. Since this instance is used as
 the entry-point for everything you want to do in Celery, like creating tasks and
 managing workers, it must be possible for other modules to import it.
@@ -136,7 +136,7 @@ You defined a single task, called ``add``, which returns the sum of two numbers.
 
 .. _celerytut-running-the-worker:
 
-Running the celery worker server
+Running the Celery worker server
 ================================
 
 You now run the worker by executing our program with the ``worker``

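Conceptually, the app instance described in this tutorial is just an importable object that tasks register themselves on. A toy sketch of that idea — not Celery's real implementation, names invented for illustration:

```python
class App:
    """Toy stand-in for a Celery app: an importable entry-point
    that keeps a registry of tasks."""

    def __init__(self, main):
        self.main = main
        self.tasks = {}

    def task(self, func):
        # Register under "<main>.<name>", mimicking automatic task naming.
        self.tasks[f'{self.main}.{func.__name__}'] = func
        return func

app = App('tasks')

@app.task
def add(x, y):
    return x + y

assert app.tasks['tasks.add'](2, 2) == 4
```

Because the registry lives on the importable ``app`` object, a worker process can import the same module and look tasks up by name, which is the property the tutorial text emphasizes.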
+ 10 - 10
docs/history/changelog-1.0.rst

@@ -129,7 +129,7 @@ Important notes
 
     See: http://bit.ly/d5OwMr
 
-    This means those who created their celery tables (via ``syncdb`` or
+    This means those who created their Celery tables (via ``syncdb`` or
     ``celeryinit``) with :pypi:`django-picklefield`
     versions >= 0.1.5 has to alter their tables to
     allow the result field to be `NULL` manually.
@@ -1135,7 +1135,7 @@ Important changes
 
 * Celery can now be used in pure Python (outside of a Django project).
 
-    This means celery is no longer Django specific.
+    This means Celery is no longer Django specific.
 
     For more information see the FAQ entry
     :ref:`faq-is-celery-for-django-only`.
@@ -1275,7 +1275,7 @@ News
 
 * Tested with Django 1.1
 
-* New Tutorial: Creating a click counter using carrot and celery
+* New Tutorial: Creating a click counter using Carrot and Celery
 
 * Database entries for periodic tasks are now created at the workers
     start-up instead of for each check (which has been a forgotten TODO/XXX
@@ -1342,7 +1342,7 @@ News
 * Adds eager execution. `celery.execute.apply`|`Task.apply` executes the
   function blocking until the task is done, for API compatibility it
   returns an `celery.result.EagerResult` instance. You can configure
-  celery to always run tasks locally by setting the
+  Celery to always run tasks locally by setting the
   :setting:`CELERY_ALWAYS_EAGER` setting to `True`.
 
 * Now depends on `anyjson`.
@@ -1399,7 +1399,7 @@ News
   by running `python manage.py celerystats`. See
   `celery.monitoring` for more information.
 
-* The celery daemon can now be supervised (i.e. it is automatically
+* The Celery daemon can now be supervised (i.e. it is automatically
   restarted if it crashes). To use this start the worker with the
   --supervised` option (or alternatively `-S`).
 
@@ -1556,11 +1556,11 @@ arguments, so be sure to flush your task queue before you upgrade.
 
 * **IMPORTANT** Celery now depends on carrot >= 0.4.1.
 
-* The celery daemon now sends task errors to the registered admin emails.
+* The Celery daemon now sends task errors to the registered admin emails.
   To turn off this feature, set `SEND_CELERY_TASK_ERROR_EMAILS` to
   `False` in your `settings.py`. Thanks to Grégoire Cachet.
 
-* You can now run the celery daemon by using `manage.py`:
+* You can now run the Celery daemon by using `manage.py`:
 
   .. code-block:: console
 
@@ -1669,7 +1669,7 @@ arguments, so be sure to flush your task queue before you upgrade.
 :release-date: 2009-05-19 04:13 P.M CET
 :release-by: Ask Solem
 
-* The celery daemon was leaking AMQP connections, this should be fixed,
+* The Celery daemon was leaking AMQP connections, this should be fixed,
   if you have any problems with too many files open (like `emfile`
   errors in `rabbit.log`, please contact us!
 
@@ -1724,7 +1724,7 @@ arguments, so be sure to flush your task queue before you upgrade.
 
 * Added ``dmap()`` and ``dmap_async()``. This works like the
   `multiprocessing.Pool` versions except they're tasks
-  distributed to the celery server. Example:
+  distributed to the Celery server. Example:
 
     .. code-block:: pycon
 
@@ -1840,7 +1840,7 @@ arguments, so be sure to flush your task queue before you upgrade.
 
 * Can now check if a task has been executed or not via HTTP.
 
-* You can do this by including the celery `urls.py` into your project,
+* You can do this by including the Celery `urls.py` into your project,
 
         >>> url(r'^celery/$', include('celery.urls'))
 

+ 3 - 3
docs/history/changelog-2.0.rst

@@ -571,7 +571,7 @@ Backward incompatible changes
     isn't set up. This makes it possible to use `--help` etc., without having a
     working configuration.
 
-    Also this makes it possible to use the client side of celery without being
+    Also this makes it possible to use the client side of Celery without being
     configured:
 
     .. code-block:: pycon
@@ -626,7 +626,7 @@ Backward incompatible changes
     This bug became visible with RabbitMQ 1.8.0, which no longer
     allows conflicting declarations for the auto_delete and durable settings.
 
-    If you've already used celery with this backend chances are you
+    If you've already used Celery with this backend chances are you
     have to delete the previous declaration:
 
     .. code-block:: console
@@ -700,7 +700,7 @@ News
 
 * Worker: Standard out/error is now being redirected to the log file.
 
-* :pypi:`billiard` has been moved back to the celery repository.
+* :pypi:`billiard` has been moved back to the Celery repository.
 
     =====================================  =====================================
     **Module name**                        **celery equivalent**

+ 1 - 1
docs/history/changelog-2.1.rst

@@ -464,7 +464,7 @@ News
     arguments, this will be used for *all defined loggers*.
 
     Remember that the worker also redirects stdout and stderr
-    to the celery logger, if manually configure logging
+    to the Celery logger, if manually configure logging
     you also need to redirect the standard outs manually:
 
     .. code-block:: python

+ 1 - 1
docs/history/changelog-2.3.rst

@@ -282,7 +282,7 @@ News
 
 * multi: now supports "pass through" options.
 
-    Pass through options makes it easier to use celery without a
+    Pass through options makes it easier to use Celery without a
     configuration file, or just add last-minute options on the command
     line.
 

+ 1 - 1
docs/history/changelog-2.5.rst

@@ -144,7 +144,7 @@ Fixes
 
 - [celery control|inspect] can now be configured on the command-line.
 
-    Like with the worker it is now possible to configure celery settings
+    Like with the worker it is now possible to configure Celery settings
     on the command-line for celery control|inspect
 
     .. code-block:: console

+ 3 - 3
docs/history/changelog-3.0.rst

@@ -513,7 +513,7 @@ If you're looking for versions prior to 3.0.x you should go to :ref:`history`.
 
 - Improved init-scripts for CentOS.
 
-    - Updated to support celery 3.x conventions.
+    - Updated to support Celery 3.x conventions.
     - Now uses CentOS built-in ``status`` and ``killproc``
     - Support for multi-node / multi-pid worker services.
     - Standard color-coded CentOS service-init output.
@@ -1296,7 +1296,7 @@ If you're looking for versions prior to 3.0.x you should go to :ref:`history`.
 
 - Eventlet fixed so that the environment is patched as soon as possible.
 
-- eventlet: Now warns if celery related modules that depends on threads
+- eventlet: Now warns if Celery related modules that depends on threads
   are imported before eventlet is patched.
 
 - Improved event and camera examples in the monitoring guide.
@@ -1438,7 +1438,7 @@ If you're looking for versions prior to 3.0.x you should go to :ref:`history`.
         app.add_defaults(initialize_config)
 
     which means the same as the above except that it won't happen
-    until the celery configuration is actually used.
+    until the Celery configuration is actually used.
 
     As an example, Celery can lazily use the configuration of a Flask app::
 

+ 2 - 2
docs/history/changelog-3.1.rst

@@ -1044,7 +1044,7 @@ News
 
     .. note::
 
-        Note that upgrading celery won't update the init-scripts,
+        Note that upgrading Celery won't update the init-scripts,
         instead you need to manually copy the improved versions from the
         source distribution:
         https://github.com/celery/celery/tree/3.1/extra/generic-init.d
@@ -1108,7 +1108,7 @@ News
 - **App:** Fixed rare bug with ``autodiscover_tasks()`` (*Issue #1797*).
 
 - **Distribution:** The sphinx docs will now always add the parent directory
-  to path so that the current celery source code is used as a basis for
+  to path so that the current Celery source code is used as a basis for
   API documentation (*Issue #1782*).
 
 - **Documentation:** :pypi:`supervisor` examples contained an

+ 1 - 1
docs/includes/resources.txt

@@ -8,7 +8,7 @@ Getting Help
 Mailing list
 ------------
 
-For discussions about the usage, development, and future of celery,
+For discussions about the usage, development, and future of Celery,
 please join the `celery-users`_ mailing list.
 
 .. _`celery-users`: http://groups.google.com/group/celery-users/

+ 3 - 3
docs/internals/guide.rst

@@ -179,7 +179,7 @@ can't co-exist in the same process space, this later posed a problem
 for using Celery with frameworks that doesn't have this limitation.
 
 Therefore the app concept was introduced. When using apps you use 'celery'
-objects instead of importing things from celery sub-modules, this
+objects instead of importing things from Celery sub-modules, this
 (unfortunately) also means that Celery essentially has two API's.
 
 Here's an example using Celery in single-mode:
@@ -239,7 +239,7 @@ Module Overview
 
         - app
 
-            Custom celery app instances uses this loader by default.
+            Custom Celery app instances uses this loader by default.
 
         - default
 
@@ -299,7 +299,7 @@ Module Overview
 
 - celery.utils
 
-    Utility functions used by the celery code base.
+    Utility functions used by the Celery code base.
     Much of it is there to be compatible across Python versions.
 
 - celery.contrib

+ 1 - 1
docs/reference/celery.rst

@@ -15,7 +15,7 @@ It includes commonly needed things for calling tasks,
 and creating Celery applications.
 
 ===================== ===================================================
-:class:`Celery`       celery application instance
+:class:`Celery`       Celery application instance
 :class:`group`        group tasks together
 :class:`chain`        chain tasks together
 :class:`chord`        chords enable callbacks for groups

+ 1 - 1
docs/sec/CELERYSA-0001.txt

@@ -44,7 +44,7 @@ Systems affected
 ================
 
 Users of Celery versions 2.1, 2.2, 2.3, 2.4 except the recently
-released 2.2.8, 2.3.4 and 2.4.4, daemonizing the celery programs
+released 2.2.8, 2.3.4 and 2.4.4, daemonizing the Celery programs
 as the root user, using either:
     1) the --uid or --gid arguments, or
     2) the provided generic init-scripts with the environment variables

+ 1 - 1
docs/templates/readme.txt

@@ -1,5 +1,5 @@
 =================================
- celery - Distributed Task Queue
+ Celery - Distributed Task Queue
 =================================
 
 .. image:: http://cloud.github.com/downloads/celery/celery/celery_128.png

+ 1 - 1
docs/userguide/application.rst

@@ -25,7 +25,7 @@ Let's create one now:
     <Celery __main__:0x100469fd0>
 
 The last line shows the textual representation of the application,
-which includes the name of the celery class (``Celery``), the name of the
+which includes the name of the app class (``Celery``), the name of the
 current main module (``__main__``), and the memory address of the object
 (``0x100469fd0``).
 

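The ``<Celery __main__:0x100469fd0>`` representation described in this hunk is the standard "class name, main module, object address" pattern; a sketch of the format with an ordinary ``__repr__``, not the library's actual code:

```python
class Celery:
    """Minimal stand-in showing only the repr format."""

    def __init__(self, main):
        self.main = main

    def __repr__(self):
        # class name, main-module name, and the object's memory address
        return f'<{type(self).__name__} {self.main}:{hex(id(self))}>'

app = Celery('__main__')
print(repr(app))  # e.g. <Celery __main__:0x100469fd0>
```

The address differs per run, which is why only the class and module name are meaningful when comparing output against the docs.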
+ 4 - 4
docs/userguide/configuration.rst

@@ -26,7 +26,7 @@ It should contain all you need to run a basic Celery set-up.
     ## Broker settings.
     broker_url = 'amqp://guest:guest@localhost:5672//'
 
-    # List of modules to import when celery starts.
+    # List of modules to import when the Celery worker starts.
     imports = ('myapp.tasks',)
 
     ## Using the database to store task state and results.
@@ -597,7 +597,7 @@ Default is to expire after 1 day.
     For the moment this only works with the AMQP, database, cache,
     and Redis backends.
 
-    When using the database backend, `celery beat` must be
+    When using the database backend, ``celery beat`` must be
     running for the results to be expired.
 
 .. setting:: result_cache_max
@@ -2131,11 +2131,11 @@ The maximum number of seconds :mod:`~celery.bin.beat` can sleep
 between checking the schedule.
 
 The default for this value is scheduler specific.
-For the default celery beat scheduler the value is 300 (5 minutes),
+For the default Celery beat scheduler the value is 300 (5 minutes),
 but for e.g. the :pypi:`django-celery` database scheduler it's 5 seconds
 because the schedule may be changed externally, and so it must take
 changes to the schedule into account.
 
-Also when running celery beat embedded (:option:`-B <celery worker -B>`)
+Also when running Celery beat embedded (:option:`-B <celery worker -B>`)
 on Jython as a thread the max interval is overridden and set to 1 so
 that it's possible to shut down in a timely manner.

+ 1 - 1
docs/userguide/signals.rst

@@ -60,7 +60,7 @@ result other keyword parameters (e.g. signal) are passed to all signal
 handlers by default.
 
 The best practice for signal handlers is to accept arbitrary keyword
-arguments (i.e. ``**kwargs``). That way new celery versions can add additional
+arguments (i.e. ``**kwargs``). That way new Celery versions can add additional
 arguments without breaking user code.
 
 .. _signal-ref:

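The ``**kwargs`` advice in this file is plain Python forward-compatibility, independent of any signal framework. A stdlib-only sketch — the dispatch helper and argument names are invented for illustration:

```python
def on_task_done(sender=None, result=None, **kwargs):
    # Unknown keyword arguments are absorbed by **kwargs, so a newer
    # sender can add fields without breaking this handler.
    return sender, result

def send(handler, **payload):
    # A "newer version" sends an extra keyword argument the handler
    # was never written for.
    return handler(**payload)

assert send(on_task_done, sender='add', result=4, retries=0) == ('add', 4)
```

Without the ``**kwargs`` catch-all, the same call would raise ``TypeError: unexpected keyword argument`` as soon as a new argument is introduced.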
+ 1 - 1
docs/userguide/tasks.rst

@@ -71,7 +71,7 @@ these can be specified as arguments to the decorator:
     The task decorator is available on your :class:`@Celery` application instance,
     if you don't know what this is then please read :ref:`first-steps`.
 
-    If you're using Django or are still using the "old" module based celery API,
+    If you're using Django or are still using the "old" module based Celery API,
     then you can import the task decorator like this:
 
     .. code-block:: python

+ 3 - 3
docs/whatsnew-4.0.rst

@@ -256,7 +256,7 @@ command:
     $ celery upgrade settings proj/settings.py
 
 This command will modify your module in-place to use the new lower-case
-names (if you want uppercase with a celery prefix see block below),
+names (if you want uppercase with a "``CELERY``" prefix see block below),
 and save a backup in :file:`proj/settings.py.orig`.
 
 .. admonition:: For Django users and others who want to keep uppercase names
@@ -282,7 +282,7 @@ and save a backup in :file:`proj/settings.py.orig`.
 
         app.config_from_object('django.conf:settings', namespace='CELERY')
 
-    You can find the most up to date Django celery integration example
+    You can find the most up to date Django Celery integration example
     here: :ref:`django-first-steps`.
 
     Note that this will also add a prefix to settings that didn't previously
@@ -883,7 +883,7 @@ Events are now buffered in the worker and sent as a list which reduces
 the overhead required to send monitoring events.
 
 For authors of custom event monitors there will be no action
-required as long as you're using the Python celery
+required as long as you're using the Python Celery
 helpers (:class:`~@events.Receiver`) to implement your monitor.
 However, if you're manually receiving event messages you must now account
 for batched event messages which differ from normal event messages
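The ``namespace='CELERY'`` idiom shown in this hunk can be pictured as a prefix filter over Django's settings. A conceptual sketch, not Celery's implementation — the sample settings dict is invented, though ``broker_url`` and ``task_serializer`` are real lower-case setting names:

```python
def from_namespace(settings, namespace='CELERY'):
    # Keep only keys carrying the namespace prefix, then strip the
    # prefix and lower-case the remainder.
    prefix = namespace + '_'
    return {key[len(prefix):].lower(): value
            for key, value in settings.items()
            if key.startswith(prefix)}

django_settings = {
    'DEBUG': True,  # ignored: no CELERY_ prefix
    'CELERY_BROKER_URL': 'amqp://',
    'CELERY_TASK_SERIALIZER': 'json',
}
print(from_namespace(django_settings))
# {'broker_url': 'amqp://', 'task_serializer': 'json'}
```

This is why the docs note that the prefix is also added to settings that never had one before: the namespace applies uniformly to every key the app reads.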