
Django, Celery, Redis and Flower

A step-by-step guide to wiring up Django, Celery, Redis and Flower

Django
DevOps

Published At: 4/21/2020

Reading Time: ~ 4 min read

Here I’m assuming you already have your basic Django project set up, and that you already know what Celery is; if not, I’d suggest getting a basic understanding of it here first. So let’s jump straight into the steps.

Please follow the comments to get a basic understanding of the code. In the code, meupBackend is the project name; please replace it with your own project name. 🤘

Install Celery into your project. Celery also needs a broker (a solution to send and receive messages, which comes in the form of a separate service called a message broker). Check the list of available brokers: BROKERS. You can install Celery bundled together with the broker client. Bundles available.

pip install "celery[redis]"

Once installed, head to the project folder that contains settings.py, create a new file called celery.py, and put the following code into it.

celery.py
import os
from celery import Celery
 
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'meupBackend.settings')
 
app = Celery('meupBackend', backend='redis', broker='redis://localhost:6379')
 
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
 
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
 
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
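
Once the broker and a worker are running (we’ll get there further below), you can sanity-check this file from a Django shell by firing the debug_task defined above; the worker simply prints its own request info. A minimal sketch:

django shell
from meupBackend.celery import debug_task

# .delay() queues the task on the broker; a running worker picks it up
# and prints the request info from debug_task above in its log
result = debug_task.delay()
print(result.id)  # task id, tracked by the result backend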
 

and now head over to __init__.py of the same folder and put the following code.

__init__.py
from __future__ import absolute_import, unicode_literals
 
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
 
__all__ = ('celery_app',)
 

and now head over to settings.py and insert the following code in the respective places.

settings.py
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
 
INSTALLED_APPS = [
    # other apps
    'celery',
    'django_celery_results',
    'django_celery_beat',
]
 
"""
django_celery_results:
This extension enables you to store Celery task results using the Django ORM.
"""
 
"""
django_celery_beat:
This extension enables you to store the periodic task schedule in the database.
The periodic tasks can be managed from the Django Admin interface, where you can create, edit and delete periodic tasks and how often they should run.
"""
 

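As a quick illustration of django_celery_beat, periodic tasks can also be created from code, not just from the Django Admin (this needs the migrations we run further below). A minimal sketch, assuming an app called myapp exposing the test task we’ll add next (myapp is a placeholder name):

django shell
import json
from django_celery_beat.models import IntervalSchedule, PeriodicTask

# a schedule that fires every 10 seconds
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.SECONDS,
)

# register a periodic task pointing at a task by its dotted name
PeriodicTask.objects.get_or_create(
    interval=schedule,
    name='Ping the test task',       # human-readable name, shown in the admin
    task='myapp.tasks.test',         # 'myapp' is a placeholder app name
    args=json.dumps(['from beat']),  # positional args, stored as JSON
)

(Running these schedules also requires a celery beat process using django_celery_beat’s DatabaseScheduler; managing the entries from the Django Admin works exactly the same way.)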
and now, add a basic task somewhere in your app.

tasks.py
from __future__ import absolute_import
from celery import shared_task
 
@shared_task
def test(param):
    return 'The test task executed with argument "%s" ' % param
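
To see this task actually run (once the Redis server and a Celery worker from the next steps are up), you can call it asynchronously from a Django shell. A minimal sketch, assuming the tasks.py above lives in an app called myapp (a placeholder name):

django shell
from myapp.tasks import test   # 'myapp' is a placeholder app name

result = test.delay('celery')  # queue the task on the Redis broker
print(result.id)               # task id, also stored by django_celery_results
print(result.get(timeout=10))  # -> The test task executed with argument "celery"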
 

So, up until now, we’ve done the Celery integration with Django. Now, go to your terminal and install the Redis server. This is the actual broker server; what we installed alongside Celery earlier was just a Python package that helps us talk 😅 to this server.

terminal-1
brew install redis

terminal-1
redis-server

terminal-2
redis-cli ping
PONG

If you get the PONG response, you’re fine to move forward; you can quit the server and close the terminals.

If you’re on Windows or Linux, please check out how you can install Redis here: https://redis.io/download. Now run:

python manage.py migrate

This will apply the migrations for django_celery_results and django_celery_beat. Now install Flower with the following command.

pip install flower

Once installed, open 3 terminals and run:

terminal-1
redis-server

terminal-2
python manage.py runserver

terminal-3
flower -A meupBackend ## here `meupBackend` is the project name
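
Note that the three terminals above cover the broker, the Django dev server and the Flower dashboard, but not a Celery worker; without a worker, queued tasks will just sit in Redis and never execute. A minimal sketch, in a fourth terminal (again, replace meupBackend with your project name):

terminal-4
celery -A meupBackend worker -l info

In production you’d typically run the worker under a process manager, but for this guide a plain terminal is fine.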

Now your project will be running on localhost:8000, Redis will be running on port 6379, and Flower will be running on localhost:5555 (Flower’s default port).

Please make sure your Redis server is running on port 6379; if not, Redis shows the port it’s actually using in the command line when it starts. In that case, put that port number into the Redis URLs in your Celery configuration.
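
For example (a hypothetical port), if Redis reported that it started on 6380 instead, you would update the Redis URLs in settings.py (and the broker URL in celery.py) accordingly:

settings.py
# hypothetical: Redis listening on 6380 instead of the default 6379
CELERY_BROKER_URL = 'redis://localhost:6380'
CELERY_RESULT_BACKEND = 'redis://localhost:6380'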

🙏

Do you have any questions, or simply wish to contact me privately? Don't hesitate to shoot me a DM on Twitter.

Have a wonderful day.
Abhishek 🙏
