Django, Celery, Redis and Flower
A step-by-step guide to wiring up Django, Celery, Redis and Flower

Published At: 4/21/2020
Reading Time: ~ 4 min read
Here I’m assuming you already have a basic Django project set up and already know what Celery is. If not, I’d suggest getting a basic understanding of it here. So let’s jump straight into the steps.
Please follow the comments to get a basic understanding of the code. In the code, meupBackend is the project name.
Please replace it with your own project name. 🤘
Install Celery into your project. Celery also needs a broker (a solution to send and receive messages, which comes in the form of a separate service called a message broker). Check the list of available brokers: BROKERS. You can directly install the Celery bundle that includes the client library for your broker. Bundles available.
pip install "celery[redis]"pip install "celery[redis]"Once installed. Head to the project folder which contains settings.py and create a new file called celery.py and put
the following code into it.
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'meupBackend.settings')
app = Celery('meupBackend')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Now head over to __init__.py in the same folder and put the following code into it.
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
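For reference, the layout after these two steps might look roughly like this (myapp is a placeholder for one of your own Django apps); autodiscover_tasks() will pick up any tasks.py module inside your registered apps:

meupBackend/
    __init__.py      # imports celery_app (the snippet above)
    settings.py
    celery.py        # the Celery application we just created
myapp/
    tasks.py         # tasks defined here are auto-discovered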
Now head over to settings.py and insert the following code in the respective places.
CELERY_TIMEZONE = TIME_ZONE
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60
CELERY_BROKER_URL = "redis://localhost:6379/0"
CELERY_RESULT_BACKEND = "django-db"
CELERY_ACCEPT_CONTENT = ["application/json"]
CELERY_TASK_SERIALIZER = "json"
CELERY_RESULT_SERIALIZER = "json"
INSTALLED_APPS = [
    # other apps
    'celery',
    'django_celery_results',
    'django_celery_beat',
]
"""
django_celery_results:
This extension enables you to store Celery task results using the Django ORM.
"""
"""
django_celery_beat:
This extension enables you to store the periodic task schedule in the database.
The periodic tasks can be managed from the Django Admin interface, where you can create, edit and delete periodic tasks and how often they should run.
"""
Now add a basic task somewhere in your app, for example in a tasks.py file inside one of your Django apps.
from __future__ import absolute_import
from celery import shared_task
@shared_task
def test(param):
    return 'The test task executed with argument "%s" ' % param
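Once a worker is running (see the commands at the end of this post), you can try the task from python manage.py shell. This is only a sketch, and myapp is again a placeholder for the app that holds tasks.py:

from myapp.tasks import test

result = test.delay('hello')       # push the task onto the broker
print(result.id)                   # the task id stored by the result backend
print(result.get(timeout=10))      # wait for the worker and fetch the return value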
So far, we’ve done the Celery integration with Django. Now, go to your terminal and install the Redis server.
This is the actual broker server; what we installed along with Celery before was just a Python package that helps us talk 😅 to
this server.
brew install redis
redis-server
redis-cli ping
PONG

If you get the PONG response, then you’re fine to move forward; you can quit the server and close the terminals.
If you’re on Windows or Linux, please check out how you can install Redis here: https://redis.io/download. Now run:
python manage.py migrate

This will apply the migrations for django_celery_results and django_celery_beat. Now install Flower with the
following command.
pip install flower

Once installed, open 4 terminals and run one of the following commands in each:
redis-server
python manage.py runserver
celery -A meupBackend worker -l INFO  ## here `meupBackend` is the project name
celery -A meupBackend flower  ## here `meupBackend` is the project name

Now your project will be running on localhost:8000, Redis will be running on port 6379, and Flower will be running
on localhost:5555 (Flower’s default port).
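If you also want the periodic tasks stored by django_celery_beat to actually fire, you’ll need a beat process as well, in another terminal; a sketch (again assuming meupBackend as the project name):

celery -A meupBackend beat -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler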
Please make sure your Redis server is running on port 6379; Redis prints the port number in the command line
when it starts. If it’s different, put that port number into the broker URL in your Celery configuration in settings.py.
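For example (a sketch, assuming Redis came up on port 6380 instead):

CELERY_BROKER_URL = "redis://localhost:6380/0"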
🙏
Do you have any questions, or simply wish to contact me privately? Don't hesitate to shoot me a DM on Twitter.
Have a wonderful day.
Abhishek 🙏