In this tutorial, I will show you how to make a Dockerized Django app that uses Celery and Redis production-ready.

That's right! I will walk you through every step required to get our website LIVE. By the end of this tutorial, our Dockerized Django app will be accessible to anyone with an internet connection.

Note: There is a full tutorial video available to set up this Django project from scratch.

Don't want to read this tutorial? Okay, fine. I have put this video together for you.

 

Also, the code for this tutorial can be found in this GitHub repository. Just clone the project from the did_django_schedule_jobs_v2 repo into your local development directory, using the docker-prod branch:

 

git clone --branch docker-prod git@github.com:bobby-didcoding/did_django_schedule_jobs_v2.git

Okay, let's jump into it...

Syllabus:

This tutorial will focus on the following:

  1. Cloning the GitHub repository
  2. Configuring Django
  3. Configuring Docker
  4. Transferring to a virtual machine
  5. Configuring the virtual machine
  6. Adding SSL
  7. Deploying

Prerequisites:

I will be using Gmail to send user confirmation emails. You will need a Google app password. Don't worry, I have you covered! Watch this video to set up a Google app password.

Also, you will need Docker and Docker Compose installed on a virtual machine. I have a good video to help you set one up on Digital Ocean.

We will also be using a managed database service. I have a good video to help you set one up on...you guessed it, Digital Ocean.

One last thing. You will need a domain! I buy most of mine via Interserve and 123 Reg. But others are available.

Setup:

First, let's get our code.

Navigate to your development directory and create a new project directory. Open Visual Studio Code in your new directory.

The easiest way to do this is to click into the directory address bar in File Explorer and type 'cmd'.

This will open a new Command Prompt (cmd). To open Visual Studio Code you can use the following command in cmd:

mkdir did_django_schedule_jobs_v2 && cd did_django_schedule_jobs_v2
code .

This will open Visual Studio Code.

Open a new git terminal in Visual Studio Code and run the following command:

git clone --branch docker git@github.com:bobby-didcoding/did_django_schedule_jobs_v2.git .

Django setup:

This won't take long. We just need to add a few files.

Now, let's create the necessary files.

cd backend/requirements
echo > Prod.txt
cd ..
mkdir media
cd ..

You will notice that we have created a new requirements file called 'Prod'. We will use this file to specify any production-specific requirements. In our case, that means Gunicorn, a Python WSGI HTTP server for UNIX.

Open backend/requirements/Prod.txt in VS Code and add the following code:

-r Base.txt

gunicorn==20.1.0
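Gunicorn's own documentation suggests sizing workers at roughly (2 x CPU cores) + 1. The compose file later in this tutorial runs Gunicorn with its default single worker; if you want to tune it, here is a quick sketch (the `--workers` flag is standard Gunicorn, the numbers are just the rule of thumb):

```shell
# Rule of thumb from the Gunicorn docs: workers = (2 * CPU cores) + 1.
CORES=$(nproc 2>/dev/null || echo 1)
WORKERS=$((2 * CORES + 1))
echo "suggested gunicorn workers: $WORKERS"
# e.g. gunicorn did_django_schedule_jobs_v2.wsgi:application \
#        --bind 0.0.0.0:8000 --workers "$WORKERS"
```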

Docker setup:

Our project is currently configured to work on our local machine. Therefore, we want to keep our development configuration as it is. Let's go ahead and create a whole bunch of files i.e. Docker and .env files for our new production environment.

Open a new terminal in Visual Studio Code and use the following commands to create the files and directories:

copy docker-compose.yml docker-compose.prod.yml
copy .env.template .env.prod
echo > .env.prod.db
mkdir nginx && cd nginx && echo > Dockerfile && echo > custom.conf
cd ..
cd backend && copy entrypoint.sh entrypoint.prod.sh && copy Dockerfile Dockerfile.prod
mkdir logs && cd logs && echo > celery.log
cd ..

Configure our production environment files (.env.prod):

As mentioned above, we are using a managed database service with Digital Ocean. If you have followed this video, you will be able to view your database details in the Digital Ocean control panel.

Open the .env.prod.db file and update the variables below with your own credentials:

POSTGRES_USER=**DigitalOcean user name**
POSTGRES_PASSWORD=**DigitalOcean password**
POSTGRES_DB=**DigitalOcean db name**

We will also need to change the SQL variables in .env.prod to match our new database. Open .env.prod and change the following variables:

...
SQL_ENGINE=django.db.backends.postgresql_psycopg2
SQL_DATABASE=**DigitalOcean db name**
SQL_USER=**DigitalOcean user name**
SQL_PASSWORD=**DigitalOcean password**
SQL_HOST=**DigitalOcean host**
SQL_PORT=**DigitalOcean port**
...

Now update the Google email settings to match your own.

DONOT_REPLY_EMAIL = **Add email**
GOOGLE_APP_PASSWORD = **Add app password**

Note: Follow this tutorial to get your app password

And lastly, add or update the following variables in .env.prod, changing them to match your own setup.

...
DEBUG=0
PRODUCTION=1
DJANGO_ALLOWED_HOSTS=**yourdomain.com**
FLOWER_USER=**a username for flower**
FLOWER_PASSWORD=**a password for flower**
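A missing variable in .env.prod tends to surface as a confusing container error much later, so it can be worth checking the file before deploying. A minimal sketch, assuming the key names used in this tutorial (the sample file below exists only so the sketch is runnable anywhere; point the loop at your real .env.prod):

```shell
# Write a deliberately incomplete sample file for the demo.
cat > /tmp/env.prod.sample <<'EOF'
DEBUG=0
PRODUCTION=1
DJANGO_ALLOWED_HOSTS=example.com
EOF

# Flag any expected key that is absent from the file.
missing=0
for key in DEBUG PRODUCTION DJANGO_ALLOWED_HOSTS FLOWER_USER FLOWER_PASSWORD; do
  grep -q "^${key}=" /tmp/env.prod.sample || { echo "missing: $key"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "env file looks complete"
```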

Docker Compose

Our current docker-compose.prod.yml file is a copy of our development file. This needs to change as we are now using different .env files and we need to add some nginx settings.

We will also be setting up a custom network. You can read more about Docker networking here.

First, change version from 3.8 to 3.

We will begin with our 'db' service.

Update the db service with the following:


  db:
    image: postgres:13.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    env_file:
      - ./.env.prod.db
    container_name: did_django_schedule_jobs_v2_db_prod
    networks:
      - main_prod

Note: we are now pointing the db service to our new .env.prod.db file

Update the app service with the following:


  app:
    build:
      context: ./backend
      dockerfile: Dockerfile.prod
    restart: always
    command: gunicorn did_django_schedule_jobs_v2.wsgi:application --bind 0.0.0.0:8000
    volumes:
      - static_volume:/home/app/web/staticfiles
      - media_volume:/home/app/web/mediafiles
    expose:
      - 8000
    env_file:
      - ./.env.prod
    depends_on:
      - db
    networks:
      - main_prod
    container_name: did_django_schedule_jobs_v2_django_app_prod

Update the redis service with the following:


  redis:
    image: redis:6-alpine
    expose:
      - 6379
    ports:
      - "6379:6379"
    networks:
      - main_prod
    container_name: did_django_schedule_jobs_v2_redis_prod

Update the celery_worker service with the following:


  celery_worker:
    restart: always
    build:
      context: ./backend
      dockerfile: Dockerfile.prod
    command: celery -A did_django_schedule_jobs_v2 worker --loglevel=info --logfile=logs/celery.log
    volumes:
      - ./backend:/home/app/web/
    networks:
      - main_prod
    env_file:
      - ./.env.prod
    depends_on:
      - db
      - redis
      - app
    container_name: did_django_schedule_jobs_v2_celery_worker_prod

Update the celery_beat service with the following:


  celery-beat:
    build: 
      context: ./backend
      dockerfile: Dockerfile.prod

    command: celery -A did_django_schedule_jobs_v2 beat -l info
    volumes:
      - ./backend:/home/app/web/
    networks:
      - main_prod
    env_file:
      - ./.env.prod
    depends_on:
      - db
      - redis
      - app
    container_name: did_django_schedule_jobs_v2_celery_beat_prod

Update the flower service with the following:


  flower:
    build:
      context: ./backend
      dockerfile: Dockerfile.prod
    command: "celery -A did_django_schedule_jobs_v2 flower  
            --broker=redis://redis:6379//
            --env-file=./.env.prod
            --basic_auth=${FLOWER_USER}:${FLOWER_PASSWORD}"
    ports:
      - 5555:5555
    networks:
      - main_prod
    env_file:
      - ./.env.prod
    depends_on:
      - db
      - app
      - redis
      - celery_worker
    container_name: did_django_schedule_jobs_v2_flower_prod

Note: We have changed container names and env files on all services.
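One subtlety with the `--basic_auth=${FLOWER_USER}:${FLOWER_PASSWORD}` flag: Compose substitutes `${...}` placeholders at parse time from your shell environment (or a top-level .env file), not from the files listed under env_file. A quick sketch of what the flag expands to, using example values rather than real credentials:

```shell
# Compose resolves ${VAR} from the shell / top-level .env file at parse
# time, not from env_file entries. Example values only:
export FLOWER_USER=admin FLOWER_PASSWORD=changeme
echo "--basic_auth=${FLOWER_USER}:${FLOWER_PASSWORD}"
# prints: --basic_auth=admin:changeme
```

If the credentials come out empty when you bring the stack up, `docker-compose -f docker-compose.prod.yml config` shows you the fully resolved file so you can see what was substituted.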

We now need to add an nginx service.

Add the following code under the flower service:


  nginx:
    container_name: did_django_schedule_jobs_v2_nginx
    restart: always
    build: ./nginx

    ports:
      - "8080:8080"
    networks:
      - main_prod
    volumes:
      - static_volume:/home/app/web/staticfiles
      - media_volume:/home/app/web/mediafiles
    depends_on:
      - app

Lastly, we need to update our volumes and add a network. Add the following below the services:


volumes:
  postgres_data:
  static_volume:
  media_volume:

networks:
  main_prod:
    driver: bridge

Nginx configuration:

You will have noticed that the compose file references the nginx directory we created earlier. Let's go ahead and configure the files inside it.

Add the following code to nginx/Dockerfile


FROM nginx:1.17.4-alpine
RUN rm /etc/nginx/conf.d/default.conf
ADD custom.conf /etc/nginx/conf.d

Add the following code to nginx/custom.conf

upstream did_django_schedule_jobs_v2 {
  server app:8000;
}

server {

    listen 8080;

    location / {
        proxy_pass http://did_django_schedule_jobs_v2; 
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host;
        proxy_redirect off;
        client_max_body_size 200M;
    }

    location /static/ {
        alias /home/app/web/staticfiles/;
    }

    location /media/ {
        alias /home/app/web/mediafiles/;
    }

}

Dockerfile & entrypoint configuration:

Add the following code to entrypoint.prod.sh

#!/bin/sh

if [ "$DATABASE" = "postgres" ]
then
    echo "Waiting for postgres..."

    while ! nc -z $SQL_HOST $SQL_PORT; do
      sleep 0.1
    done

    echo "PostgreSQL started"
fi

exec "$@"
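The script above blocks until Postgres accepts TCP connections, then hands control to whatever command the container was given. One caveat with the original loop: a typo'd host will make it poll forever. A generalised sketch with a retry cap, using bash's /dev/tcp instead of `nc` purely so it runs without extra packages (this is an illustration of the pattern, not the repo's script):

```shell
# Poll a TCP port with a retry cap so a bad host can't hang forever.
wait_for_port() {
  host=$1; port=$2; tries=${3:-5}
  for i in $(seq 1 "$tries"); do
    if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
      echo "$host:$port is up"; return 0
    fi
    sleep 0.1   # same back-off as the entrypoint script
  done
  echo "$host:$port not reachable after $tries tries"; return 1
}

wait_for_port 127.0.0.1 1 3 || true   # port 1 is almost certainly closed
```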

Add the following code to Dockerfile.prod

###########
# BUILDER #
###########

FROM python:3.9 as builder

WORKDIR /usr/src/app

ENV PYTHONUNBUFFERED 1
ENV PYTHONDONTWRITEBYTECODE 1

RUN apt-get update && apt-get -y install netcat

COPY . /usr/src/app/
RUN pip wheel --no-cache-dir --no-deps --wheel-dir /usr/src/app/wheels -r /usr/src/app/requirements/Prod.txt
#########
# FINAL #
#########

FROM python:3.9

# create directory for the user
RUN mkdir -p /home/app

# create the appuser user in appuser group
# RUN addgroup -S app && adduser -S app -G app
ARG user=app
ARG group=app
ARG uid=1000
ARG gid=1000
RUN groupadd -g ${gid} ${group} && useradd -u ${uid} -g ${group} -s /bin/sh ${user}

# create the appropriate directories
ENV HOME=/home/app
ENV APP_HOME=/home/app/web
RUN mkdir $APP_HOME
RUN mkdir $APP_HOME/static
WORKDIR $APP_HOME

# install dependencies
RUN apt-get update && apt-get -y install netcat &&  apt-get -y install gettext && apt-get -y install nano

COPY --from=builder /usr/src/app/wheels /wheels
COPY --from=builder /usr/src/app/requirements/Prod.txt .
RUN pip install --upgrade pip
RUN pip install --no-cache /wheels/*

# copy entrypoint.prod.sh
COPY ./entrypoint.prod.sh $APP_HOME

# copy project
COPY . $APP_HOME

# chown all the files to the app user
RUN chown -R ${user}:${group} $HOME

RUN ["chmod", "+x", "/home/app/web/entrypoint.prod.sh"]

# run entrypoint.prod.sh before the container command
ENTRYPOINT ["/home/app/web/entrypoint.prod.sh"]

That is all the configuration we need! That wasn't too bad, was it?

Transfer to virtual machine

We are now ready to move our project to our production server.

Note: You will need your virtual machine's IP address, and the machine should already be set up with Docker and Docker Compose. Watch this video for a step-by-step guide on setting up your virtual machine on Digital Ocean.

SSH to your virtual machine and create a project directory:

cd /home/bobby
mkdir docker

Then, from your local machine, copy the project over with scp:

scp -r $(pwd)/{backend,nginx,.env.prod,.env.prod.db,docker-compose.prod.yml} root@your_remote_ip:/path/to/project
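A common failure mode here is a typo in one of the paths, which makes scp die part-way through the transfer. A small pre-flight sketch that checks everything exists first (the file list mirrors the scp command; the scratch-directory demo at the bottom is only there so the sketch runs anywhere):

```shell
# Check each expected artefact exists before copying, so a missing file
# fails loudly up front instead of mid-transfer.
check_artefacts() {
  ok=0
  for p in "$@"; do
    [ -e "$p" ] || { echo "not found: $p"; ok=1; }
  done
  return $ok
}

# Demo in a scratch dir (normally you would run this from the project root):
cd "$(mktemp -d)"
mkdir backend nginx && touch docker-compose.prod.yml
check_artefacts backend nginx .env.prod docker-compose.prod.yml \
  || echo "create the missing files before running scp"
```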

Configure virtual machine:

This video will help you set up a virtual machine on Digital Ocean with Docker and Docker Compose. However, we also need to install and configure nginx.

SSH to your virtual machine, open a new terminal and use the following commands to install nginx:

sudo apt update
sudo apt install python3-pip python3-dev libpq-dev postgresql-client nginx
snap install certbot --classic

Now set up the firewall:

sudo ufw allow openssh
sudo ufw enable

We can now configure nginx. Use the following to create a new config file:

Note: change the domain to your own

sudo nano /etc/nginx/sites-available/didcoding.uk

Paste the following into nano, then use Ctrl + S to save and Ctrl + X to exit:

server {
	server_name didcoding.uk www.didcoding.uk;
	location / {
		proxy_pass http://localhost:8080;
		proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
		proxy_set_header Host $host;
		proxy_redirect off;
	}
}

We can now repeat this for our Flower subdomain:

sudo nano /etc/nginx/sites-available/flower.didcoding.uk

Paste the following into nano, then use Ctrl + S to save and Ctrl + X to exit:

server {
	server_name flower.didcoding.uk;
	location / {
		proxy_pass http://localhost:5555;
		proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
		proxy_set_header Host $host;
		proxy_redirect off;
	}
}

We can now enable both config files by linking them into sites-enabled:

sudo ln -s /etc/nginx/sites-available/didcoding.uk /etc/nginx/sites-enabled
sudo ln -s /etc/nginx/sites-available/flower.didcoding.uk /etc/nginx/sites-enabled

You can test the nginx config with the following:

sudo nginx -t

Finally, restart nginx and update the firewall rules:

sudo systemctl restart nginx
sudo ufw delete allow 8000
sudo ufw allow 'Nginx Full'

Deploy with http:

We are now ready to fire up our Dockerized Django project.

To do this, use the following commands:


docker-compose -f docker-compose.prod.yml up -d --build
docker exec -it did_django_schedule_jobs_v2_django_app_prod bash
python manage.py migrate
python manage.py collectstatic
exit

You will now need to restart the celery beat container as it requires a few database tables to work:


docker restart did_django_schedule_jobs_v2_celery_beat_prod

Add SSL:

Secure an SSL certificate for our domain and subdomains:

Note: make sure the DNS records are all present and correct. You should have an 'A' record for each of the domains/subdomains below.

sudo certbot --nginx -d didcoding.uk -d www.didcoding.uk -d flower.didcoding.uk

Lastly, confirm that certbot can renew our SSL certificates automatically:

sudo certbot renew --dry-run

Finished :) Let's check that our project is running correctly. Visit your domain in your Web browser. With any luck, your website will be fully functional and have a valid SSL certificate.

Oh, one last thing...

If you followed along with this video, you will have a managed PostgreSQL database service ready for production. You can create users and databases from the DigitalOcean control panel; however, we will do it from the command line.

To begin, grab the Connection Parameters for your cluster by navigating to Databases from the Cloud Control Panel, and clicking into your database. You should see a Connection Details box containing some parameters for your cluster. Note these down.
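One way to keep those details handy is to park them in shell variables and build the connect command from them. Everything below is a placeholder (the `doadmin` user and port `25060` are DigitalOcean's usual defaults for managed Postgres, but check your own Connection Details box):

```shell
# Hypothetical placeholder values; substitute your cluster's real
# connection parameters from the DigitalOcean control panel.
DB_USER="doadmin"                                   # default admin user
DB_HOST="your-cluster-host.db.ondigitalocean.com"   # made-up host
DB_PORT="25060"                                     # usual default port
DB_NAME="defaultdb"

CONNECT="psql -U $DB_USER -h $DB_HOST -p $DB_PORT -d $DB_NAME --set=sslmode=require -W"
echo "$CONNECT"
```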

Connect to our managed database:

psql -U username -h host -p port -d database --set=sslmode=require -W

Enter your password when prompted.

We can now go ahead and create a production database.

CREATE DATABASE **db_name**;
CREATE USER **user** WITH PASSWORD '**password**';
ALTER ROLE **user** SET client_encoding TO 'utf8';
ALTER ROLE **user** SET default_transaction_isolation TO 'read committed';
ALTER ROLE **user** SET timezone TO 'UTC';
GRANT ALL PRIVILEGES ON DATABASE **db_name** TO **user**;
\q

You can now use your new database and user details in the .env.prod and .env.prod.db files. If you do this, you will need to re-run the following commands:

docker-compose -f docker-compose.prod.yml up -d --build
docker exec -it did_django_schedule_jobs_v2_django_app_prod bash
python manage.py migrate
python manage.py collectstatic
exit

You will now need to restart the celery beat container as it requires a few database tables to work:


docker restart did_django_schedule_jobs_v2_celery_beat_prod

Conclusion

We have successfully Dockerized a Django project that uses Celery, Redis and Flower. We then set up HTTPS and deployed the project on a Digital Ocean droplet.
