Launching asynchronous tasks with Celery

Everything you execute in a view affects response time. In many situations you might want to return a response to the user as quickly as possible and let the server execute some process asynchronously. This is especially relevant for time-consuming processes or processes subject to failure, which might need a retry policy. For example, a video-sharing platform allows users to upload videos, but transcoding them takes a long time. The site might return a response telling the user that transcoding will start soon, and then transcode the video asynchronously. Another example is sending e-mails to users. If your site sends e-mail notifications from a view, the SMTP connection might fail or slow down the response. Launching asynchronous tasks is essential to avoid blocking execution.

Celery is a distributed task queue that can process vast amounts of messages. It does real-time processing, but also supports task scheduling. Using Celery, not only can you create asynchronous tasks easily and let them be executed by workers as soon as possible, but you can also schedule them to run at a specific time.

You can find the Celery documentation at http://celery.readthedocs.org/en/latest/.

Installing Celery

Let's install Celery and integrate it into our project. Install Celery via pip using the following command:

pip install celery==3.1.18

Celery requires a message broker in order to handle requests from an external source. The broker takes care of sending messages to Celery workers, which process tasks as they receive them. Let's install a message broker.

Installing RabbitMQ

There are several options to choose as a message broker for Celery, including key-value stores such as Redis or an actual messaging system such as RabbitMQ. We will configure Celery with RabbitMQ, since it's the recommended message broker for Celery.

If you are using Linux, you can install RabbitMQ from the shell using the following command:

apt-get install rabbitmq-server

If you need to install RabbitMQ on Mac OS X or Windows, you can find standalone versions at https://www.rabbitmq.com/download.html.

After installing it, launch RabbitMQ using the following command from the shell:

rabbitmq-server

You will see output that ends with a line similar to the following:

Starting broker... completed with 10 plugins.

RabbitMQ is running and ready to receive messages.

Adding Celery to your project

You have to provide a configuration for the Celery instance. Create a new file next to the settings.py file of myshop and name it celery.py. This file will contain the Celery configuration for your project. Add the following code to it:

import os
from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myshop.settings')

app = Celery('myshop')

app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

In this code we set the DJANGO_SETTINGS_MODULE variable for the Celery command-line program. Then we create an instance of our application with app = Celery('myshop'). We load any custom configuration from our project settings using the config_from_object() method. Finally we tell Celery to auto-discover asynchronous tasks for the applications listed in the INSTALLED_APPS setting. Celery will look for a tasks.py file in each application directory to load asynchronous tasks defined in it.
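Since config_from_object('django.conf:settings') reads configuration from the Django settings module, you can define Celery options directly in settings.py. The following is a minimal sketch using Celery 3.1-style setting names; the broker URL assumes a local RabbitMQ install with default credentials, so adjust it for your environment:

```python
# myshop/settings.py (fragment) -- Celery 3.1-style setting names.

# Broker URL; assumes RabbitMQ running locally with the default
# guest user. The '//' at the end is the default virtual host.
BROKER_URL = 'amqp://guest:guest@localhost:5672//'

# Optional: store task results so return values can be inspected later.
CELERY_RESULT_BACKEND = 'amqp'

# Optional: serialize task messages as JSON instead of pickle.
CELERY_TASK_SERIALIZER = 'json'
```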

You need to import the celery module in the __init__.py file of your project to make sure it is loaded when Django starts. Edit the myshop/__init__.py file and add the following code to it:

# import celery
from .celery import app as celery_app

Now, you can start programming asynchronous tasks for your applications.

Note

The CELERY_ALWAYS_EAGER setting allows you to execute tasks locally in a synchronous way instead of sending them to the queue. This is useful for running unit tests or the project in your local environment without running Celery.
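For example, in a settings file used for local development or unit tests you might enable eager execution like this (a minimal sketch; CELERY_ALWAYS_EAGER and CELERY_EAGER_PROPAGATES_EXCEPTIONS are the Celery 3.1 setting names):

```python
# settings fragment for local development or unit tests:
# run tasks synchronously, in-process, without a broker or workers.
CELERY_ALWAYS_EAGER = True

# re-raise exceptions raised inside tasks instead of swallowing them,
# so failures surface immediately in test runs.
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
```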

Adding asynchronous tasks to your application

We are going to create an asynchronous task to send an email notification to our users when they place an order.

The convention is to include asynchronous tasks for your application in a tasks module within your application directory. Create a new file inside the orders application and name it tasks.py. This is the place where Celery will look for asynchronous tasks. Add the following code to it:

from celery import task
from django.core.mail import send_mail
from .models import Order

@task
def order_created(order_id):
    """
    Task to send an e-mail notification when an order is 
    successfully created.
    """
    order = Order.objects.get(id=order_id)
    subject = 'Order nr. {}'.format(order.id)
    message = 'Dear {},\n\nYou have successfully placed an order. ' \
              'Your order id is {}.'.format(order.first_name,
                                            order.id)
    mail_sent = send_mail(subject,
                          message,
                          'admin@myshop.com',
                          [order.email])
    return mail_sent

We define the order_created task by using the task decorator. As you can see, a Celery task is just a Python function decorated with task. Our task function receives an order_id parameter. It's always recommended to pass only IDs to task functions and look up the objects when the task is executed. We use the send_mail() function provided by Django to send an email notification to the user who placed the order. If you don't want to set up email settings, you can tell Django to write emails to the console by adding the following setting to the settings.py file:

EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
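The reason for passing IDs rather than model instances is that task arguments are serialized into the message sent to the broker: a plain integer serializes cleanly, while a full object may not, and could also be stale by the time the worker runs. A quick standalone illustration (plain Python with a hypothetical Order stand-in, no Celery required):

```python
import json

class Order:
    """Stand-in for a model instance (hypothetical, for illustration)."""
    def __init__(self, id, email):
        self.id = id
        self.email = email

order = Order(id=42, email='customer@example.com')

# An integer ID serializes cleanly into a task message:
payload = json.dumps({'order_id': order.id})

# A full object does not -- json.dumps raises TypeError:
try:
    json.dumps({'order': order})
    serializable = True
except TypeError:
    serializable = False

print(payload)        # {"order_id": 42}
print(serializable)   # False
```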

Note

Use asynchronous tasks not only for time-consuming processes, but also for other processes that are subject to failure, which do not take so much time to be executed, but which are subject to connection failures or require a retry policy.
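As a sketch of what such a retry policy can look like (assuming the Celery 3.1 API, where bind=True gives the task access to self.retry(); the task name and the retry numbers here are illustrative, not part of the project code):

```python
from celery import task
from django.core.mail import send_mail

@task(bind=True, max_retries=3, default_retry_delay=60)
def send_notification(self, email, subject, message):
    """
    Hypothetical task: retry up to 3 times, waiting 60 seconds
    between attempts, if sending the e-mail fails.
    """
    try:
        send_mail(subject, message, 'admin@myshop.com', [email])
    except Exception as exc:
        # re-queue the task; after max_retries it fails permanently
        raise self.retry(exc=exc)
```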

Now we have to add the task to our order_create view. Open the views.py file of the orders application and import the task as follows:

from .tasks import order_created

Then, call the order_created asynchronous task after clearing the cart as follows:

# clear the cart
cart.clear() 
# launch asynchronous task
order_created.delay(order.id)

We call the delay() method of the task to execute it asynchronously. The task will be added to the queue and will be executed by a worker as soon as possible.
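delay() is a shortcut for apply_async() with default options; apply_async() exposes additional execution options. For instance (countdown is part of the Celery API; the 10-second value here is arbitrary):

```python
# equivalent to order_created.delay(order.id):
order_created.apply_async(args=[order.id])

# run the task no earlier than 10 seconds from now:
order_created.apply_async(args=[order.id], countdown=10)
```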

Open another shell and start the celery worker, using the following command:

celery -A myshop worker -l info

The Celery worker is now running and ready to process tasks. Make sure the Django development server is also running. Open http://127.0.0.1:8000/ in your browser, add some products to your shopping cart, and complete an order. In the shell where you started the Celery worker, you will see output similar to this:

[2015-09-14 19:43:47,526: INFO/MainProcess] Received task: orders.tasks.order_created[933e383c-095e-4cbd-b909-70c07e6a2ddf]
[2015-09-14 19:43:50,851: INFO/MainProcess] Task orders.tasks.order_created[933e383c-095e-4cbd-b909-70c07e6a2ddf] succeeded in 3.318835098994896s: 1

The task has been executed and you will receive an email notification for your order.

Monitoring Celery

You might want to monitor the asynchronous tasks that are executed. Flower is a web-based tool for monitoring Celery. You can install Flower using the following command:

pip install flower

Once installed, you can launch Flower running the following command from your project directory:

celery -A myshop flower

Open http://localhost:5555/dashboard in your browser. You will be able to see the active Celery workers and statistics for asynchronous tasks.


You can find documentation for Flower at http://flower.readthedocs.org/en/latest/.
