A Quick Guide To Using Celery With Django

Often when reading about Django you hear about "asynchronous tasks" - things that run in the background. You think, "ah that sounds very useful!" and then look into it.

Most results will point to "celery" as the go-to software for running async tasks, along with a myriad of configuration options that quickly become confusing. There is also a "message broker" - what the hell is that?

I'll be showing you how to quickly set up Celery to run tasks in the background, along with a "message broker".

Some quick terminology:

A "message broker" is the middleman between your Django application and the Celery workers: Django drops task messages into it, and the workers pick them up and run them. Most guides point to using "rabbitmq" as the message broker; I have not used that before, so I just use "redis".
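Conceptually, the broker is just a queue sitting between your web process and your worker. Here's a toy single-process sketch of the idea using Python's stdlib (illustrative only - redis plays this role for real, durably and across processes):

```python
from queue import Queue

# Toy model of a broker: the web process enqueues task messages,
# and a worker dequeues and runs them. Redis does this for real,
# across separate processes and machines.
broker = Queue()

# Web process side: "send" a task message
broker.put({"task": "process_data", "args": ["hello"]})

# Worker side: pick the message up and act on it
message = broker.get()
print(f"running {message['task']} with {message['args']}")
```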


I am assuming that you know enough to have a basic Django application deployed.

Install celery with redis bindings in your virtual environment (the quotes stop your shell from interpreting the square brackets):

pip install "celery[redis]"

Install redis using your distribution's package manager:

sudo dnf install redis          # Fedora / RHEL

sudo apt install redis-server   # Debian / Ubuntu

Now start and enable redis on boot:

sudo systemctl enable --now redis

We have now installed everything we need.


In your Django application, in the directory that contains settings.py, create a new file called celery.py. In that file add the following configuration, replacing "YOUR_DJANGO_APPLICATION" with the name of your Django application (obviously):

# celery.py
import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'YOUR_DJANGO_APPLICATION.settings')

app = Celery('YOUR_DJANGO_APPLICATION')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

Because this is a super quick start guide I am not going into detail on what any of the above does.
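One bit that trips people up, though: namespace='CELERY' means Celery reads any Django setting prefixed with CELERY_, strips the prefix, and lowercases the rest to get its own config key. A rough stdlib sketch of that mapping (illustrative only - Celery does this internally):

```python
# Sketch of how namespace='CELERY' maps Django settings to Celery
# config keys: the CELERY_ prefix is stripped and the rest lowercased.
settings = {
    'DEBUG': True,                              # ignored: no CELERY_ prefix
    'CELERY_BROKER_URL': 'redis://localhost:6379/0',
    'CELERY_TASK_SERIALIZER': 'json',
}

celery_config = {
    key[len('CELERY_'):].lower(): value
    for key, value in settings.items()
    if key.startswith('CELERY_')
}
print(celery_config)
# {'broker_url': 'redis://localhost:6379/0', 'task_serializer': 'json'}
```

This is why the settings in the next step must be UPPERCASE and start with CELERY_ - otherwise Celery never sees them.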

In the same directory, in the file __init__.py, add the following:

# __init__.py
from .celery import app as celery_app

__all__ = ('celery_app',)

In settings.py add the following:

# settings.py

# Celery settings
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
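The broker URL packs the host, port, and redis database number into one string. A quick stdlib look at how it breaks down (purely illustrative):

```python
from urllib.parse import urlparse

# Breaking down the broker URL: redis://<host>:<port>/<db-number>
# Database 0 is redis's default; pointing Celery at a different
# numbered database keeps its keys separate from other apps.
parts = urlparse('redis://localhost:6379/0')
print(parts.scheme)    # redis
print(parts.hostname)  # localhost
print(parts.port)      # 6379
print(parts.path)      # /0  (the redis database number)
```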

In the Django app that you want an async task to run in, create a file called tasks.py, for example analytics/tasks.py.

In that file make sure you have the following import:

from celery import shared_task

In here put your async functions. Each function must be decorated with the @shared_task decorator.

So our tasks.py file should look something like this:

from celery import shared_task

@shared_task
def process_data(data):
    print("I am processing data now!")
    # do something
    # Everything that you do here will run in the background
    processed_data = data  # placeholder - do your real processing here
    return processed_data
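One caveat worth knowing: because we configured json as the task serializer, anything you pass to a task travels through the broker as JSON and must be JSON-serializable. A quick illustration (plain stdlib, no Celery needed):

```python
import json
from datetime import datetime

# Plain dicts, lists, strings, and numbers are fine as task arguments:
payload = json.dumps({"user_id": 42, "values": [1.5, 2.5]})

# Complex objects like model instances or datetimes are not - pass an
# ID or an isoformat string instead:
try:
    json.dumps({"when": datetime.now()})
except TypeError:
    print("not JSON-serializable - pass datetime.now().isoformat() instead")
```

In practice this means passing a model's primary key to the task and re-fetching the object inside it, rather than passing the model instance itself.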

We can call this task from a view. So in our views.py file add the following import:

from .tasks import process_data

Then in the view that we want to use the celery task, add the following code:

from django.http import HttpResponse

def processing(request, data):
    process_data.delay(data)
    return HttpResponse("<h1> Your data is being processed! </h1>")

Adding .delay to your function call is important. If you don't do this then it will run as a normal, blocking function call - or "synchronously".
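To make the calling convention concrete, here is a tiny stand-in that mimics the shape of a Celery task - this is NOT real Celery, just a fake decorator to show the sync-vs-async difference at the call site (with real Celery, .delay() sends the call to the broker and returns an AsyncResult instead):

```python
# Fake stand-in for @shared_task, purely to illustrate the call styles.
def fake_shared_task(func):
    def delay(*args, **kwargs):
        # Real Celery would enqueue a message here and return immediately.
        return f"queued {func.__name__}{args}"
    func.delay = delay
    return func

@fake_shared_task
def process_data(data):
    return f"processed {data}"

print(process_data("x"))        # runs synchronously, right now, in-process
print(process_data.delay("x"))  # with real Celery: runs later, in a worker
```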

Starting celery

You can start celery like so (again, replacing YOUR_DJANGO_APPLICATION with the name of your Django application):

celery -A YOUR_DJANGO_APPLICATION worker --loglevel=INFO

You should see celery pick up the task and connect to redis:

[tasks]
  . APP_NAME.tasks.process_data

[2020-09-30 21:55:55,616: INFO/MainProcess] Connected to redis://localhost:6379/0

Now when you go to the webpage that corresponds to the view code you should see celery output as your function runs.


Right, that was a very quick introduction to celery with the most basic configuration and some examples to help you get up and running as soon as possible.

Of course, please do not run celery in a production environment without understanding it more - read the Celery documentation first.

You would also want to make sure that your redis installation is secure.

Published on Sept. 30, 2020, 11:47 p.m.

Author: Mark