Often when reading about Django you hear about "asynchronous tasks" - things that run in the background. You think, "ah that sounds very useful!" and then look into it.
Most results will point to Celery as the go-to software for running async tasks, along with a myriad of configuration options that quickly become confusing. There is also a "message broker" - what the hell is that?
I'll be showing you how to quickly set up Celery to run tasks in the background, along with a message broker.
Some quick terminology:
- Celery - the software that runs our jobs for us
- Message broker - basically a todo list for Celery. Celery looks at whatever we are using as our message broker, takes a task from the top, and runs it.
Most guides point to using RabbitMQ as a message broker. I have not used that before, so I just use Redis.
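The broker-as-todo-list idea can be sketched in plain Python. This is not Celery code - just an in-memory queue standing in for Redis, to show the flow of a producer pushing work and a worker popping it:

```python
from collections import deque

# The "message broker": a queue of pending task descriptions.
broker = deque()

# The web app pushes a description of the work instead of doing it inline.
broker.append({"task": "process_data", "args": ["hello"]})

# A worker (Celery, in the real setup) takes a task from the top and runs it.
def worker_step():
    message = broker.popleft()
    print("running", message["task"], "with", message["args"])
    return message["task"]

worker_step()
```

In the real setup the queue lives in Redis, so the web process and the worker process can be completely separate programs.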
I am assuming that you know enough to have a basic Django application deployed.
Install celery with redis bindings in your virtual environment:
pip install "celery[redis]"
(The quotes stop shells like zsh from trying to expand the square brackets.)
Install redis using your distributions package manager:
sudo dnf install redis
sudo apt install redis-server
Now start Redis and enable it on boot (on Debian/Ubuntu the service may be called redis-server):
sudo systemctl enable --now redis
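If you want to confirm that Redis is actually answering before going further, here is a quick sanity check - a sketch using only the Python standard library, assuming Redis is on its default port 6379:

```python
import socket

def redis_up(host="localhost", port=6379):
    """Return True if something answers Redis's PING command on host:port."""
    try:
        with socket.create_connection((host, port), timeout=2) as s:
            s.sendall(b"PING\r\n")
            return s.recv(64).startswith(b"+PONG")
    except OSError:  # connection refused, timeout, bad host, etc.
        return False

print("redis reachable:", redis_up())
```

You could also just run redis-cli ping from a shell and look for PONG.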
We have now installed everything we need.
In your Django application, in the directory that contains settings.py, create a new file called celery.py. In that file add the following configuration, replacing "YOUR_DJANGO_APPLICATION" with the name of your Django application:
import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'YOUR_DJANGO_APPLICATION.settings')

app = Celery('YOUR_DJANGO_APPLICATION')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
Because this is a super quick start guide, I am not going into detail on what any of the above does.
In the same directory, in the file __init__.py, add the following:
from .celery import app as celery_app
__all__ = ('celery_app',)
Then in your settings.py add the following:
# Celery settings
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
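Roughly speaking, the namespace='CELERY' option in celery.py means Celery reads every settings variable that starts with CELERY_, strips that prefix, and lowercases the rest to get its own config key. A plain-Python sketch of that mapping (not Celery's actual implementation):

```python
# A stand-in for your Django settings module.
settings = {
    "CELERY_BROKER_URL": "redis://localhost:6379/0",
    "CELERY_TASK_SERIALIZER": "json",
    "DEBUG": True,  # keys without the CELERY_ prefix are ignored
}

# Strip the CELERY_ prefix and lowercase, as namespace='CELERY' roughly does.
celery_config = {
    key[len("CELERY_"):].lower(): value
    for key, value in settings.items()
    if key.startswith("CELERY_")
}

print(celery_config)
# {'broker_url': 'redis://localhost:6379/0', 'task_serializer': 'json'}
```

This is why the prefixed names in settings.py must be in uppercase - otherwise Celery will not pick them up.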
In the Django app that you want an async task to run in, create a file called tasks.py (the name matters - Celery's autodiscover_tasks() looks for files with exactly that name).
In that file make sure you have the following import:
from celery import shared_task
In here put your async functions. Each async function must be decorated with an @shared_task decorator. Your tasks.py file should look something like this:
from celery import shared_task

@shared_task
def process_data(data):
    print("I am processing data now!")
    # do something
    # Everything that you do here will run in the background
    processed_data = data
    return processed_data
We can call this task from a view. So in our views.py file add the following imports:
from django.http import HttpResponse
from .tasks import process_data
Then in the view that we want to use the Celery task, add the following code:
def processing(request, data):
    process_data.delay(data)
    return HttpResponse("<h1> Your data is being processed! </h1>")
Adding .delay to your function call is important. If you don't do this then it will run as a normal function, or "synchronously".
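To see why .delay matters, here is a plain-Python sketch of synchronous versus handed-off work. Celery hands the call off through the broker to a separate worker process; a thread is used here only to imitate that behaviour:

```python
import threading
import time

results = []

def process_data(data):
    time.sleep(0.1)          # simulate slow work
    results.append(data)
    print("done:", data)

# Synchronous call: the caller blocks until the work finishes.
process_data("report-1")

# Handed-off call: start the work and return immediately, like .delay()
# does (except Celery uses the broker and a worker process, not a thread).
t = threading.Thread(target=process_data, args=("report-2",))
t.start()
print("response sent while work continues")
t.join()
```

In the view above, the HttpResponse goes back to the user straight away while the worker is still running the task.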
You can start Celery like so (again, replacing YOUR_DJANGO_APPLICATION with your app name):
celery -A YOUR_DJANGO_APPLICATION worker --loglevel=INFO
You should see Celery pick up the tasks and connect to Redis:
[2020-09-30 21:55:55,616: INFO/MainProcess] Connected to redis://localhost:6379/0
Now when you go to the webpage that corresponds to the view code, you should see Celery output in the worker's terminal as your function runs.
Right, that was a very quick introduction to Celery, with the most basic configuration and some examples to help you get up and running as soon as possible.
Of course, please do not run Celery in a production environment without understanding it more; read the docs here
You also want to make sure that your Redis installation is secure.