How to Start a Professional Django Web Application (Performance & Security)

 Ouhadj ilyes

Software Engineer

Web Development


Feb. 17, 2023

Most online tutorials, and even the Django docs, teach the easiest and simplest way of starting a Django project. That makes sense if you are new to Django, but you can't rely on that project structure, configuration, and tooling if you expect your web application to scale to any number of users without a drop in performance, and to stand up to security breaches and threats.

In this article, I will cover the general approach to starting a Django project with the proper tools and configuration for high performance and security. One article is far from enough to cover everything, so I will provide some resources at the end on where to go next to learn more about building advanced Django projects.

Let's get right into it!


If you have never started a Django project before, you may not feel comfortable reading this article. You should have started at least a few Django projects and used virtual environments and the pip package manager. If you haven't, I recommend starting with the Django docs.

The Optimal Django Environment Setup

This section describes what we consider the best local environment setup for intermediate and advanced developers working with Django.

Virtual Environments

There are many areas of software development that are hotly debated, but using virtual environments for Django development is not one of them. You should use a dedicated virtual environment for each new Django project.

In this article we will use Pipenv to manage virtual environments and install Django packages. Pipenv is similar to npm and yarn from the JavaScript/Node ecosystem: it creates a Pipfile containing software dependencies and a Pipfile.lock for ensuring deterministic builds. “Determinism” means that each and every time you download the software in a new virtual environment, you will have exactly the same configuration.

Install Pipenv:

$ pip install pipenv


Docker is a way to isolate an entire operating system via Linux containers which are a type of virtualization. This technology is what makes it possible to quickly add or remove servers from a cloud provider. It's largely software behind the scenes, not actual hardware being changed.

For most applications (especially web applications) a virtual machine provides far more resources than are needed, and a container is more than sufficient. Docker is, in short, a way to implement Linux containers.

Containers vs. Virtual Environments

Virtual Environments are a way to isolate Python packages. They allow one computer to run multiple projects locally. For example, Project A might use Python 3.4 and Django 1.11 among other dependencies; whereas Project B uses Python 3.7 and Django 2.2. By configuring a dedicated virtual environment for each project we can manage these different software packages while not polluting our global environment.

The important distinction between virtual environments and Docker is that virtual environments can only isolate Python packages. They cannot isolate non-Python software like a PostgreSQL or MySQL database. And they still rely on a global, system-level installation of Python (in other words, on your computer). The virtual environment points to an existing Python installation; it does not contain Python itself.

Containers go a step further and isolate the entire operating system, not just the Python parts. In other words, we will install Python itself within Docker, as well as install and run a production-level database.

Enough theory, let's install Docker and start a Django project together!

Install Docker

The first step is to sign up for a free account on Docker Hub and then install the Docker desktop app on your local machine.

Once Docker is done installing, we can confirm the correct version is running by typing docker --version on the command line.

$ docker --version
Docker version 20.10.22, build 6a30dfc

We need an additional Docker tool that helps automate commands: Docker Compose. It is included with the Mac and Windows downloads, but on Linux you will need to add it manually. You can do this by running the following command after your Docker installation is complete:

$ sudo pip install docker-compose

Starting A Django Project

Now let's start a Django project that runs locally on our computer, see how Pipenv works, and then move the project entirely within Docker so you can see how all the pieces fit together.

Let's first navigate to the Desktop and create a root directory for all of the Django projects we'll create in this article:

$ cd ~/Desktop
$ mkdir code && cd code

Then create a hello directory for this example project and install Django using Pipenv which creates both a Pipfile and a Pipfile.lock file. Activate the virtual environment with the shell command.

$ mkdir hello && cd hello
$ pipenv install django==4.1.5
$ pipenv shell
(hello) $
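For reference, the generated Pipfile will look roughly like this. This is an illustrative sketch; the exact contents depend on your Python version and the packages you pin:

```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
django = "==4.1.5"

[dev-packages]

[requires]
python_version = "3.10"
```

The matching Pipfile.lock records the exact resolved versions and their hashes, which is what makes builds deterministic.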

Now we can use the startproject command to create a new Django project called hello_project. Adding a period, ., at the end of the command is an optional step; without the period Django adds an additional directory to the project, with the period it does not.

Finally use the migrate command to initialize the database and start the local web server with the runserver command.

(hello) $ django-admin startproject hello_project .
(hello) $ python manage.py migrate
(hello) $ python manage.py runserver

Assuming everything worked correctly you should now be able to see the Django welcome page at in your web browser.

That's all you need to do to start a Django project using Pipenv on your local computer. Now it's time to switch to Docker and run our current project in a container instead!

Stop the local server with Control+c and exit our virtual environment since we no longer need it by typing exit.

(hello) $ exit

That means we're fully out of the virtual environment and ready for Docker.

Images, Containers, and the Docker Host

A Docker image is a snapshot in time of what a project contains. It is defined by a Dockerfile, which is literally a list of instructions for building the image. A Docker container is a running instance of an image.

The third core concept is the Docker host which is the underlying OS. It's possible to have multiple containers running within a single Docker host. When we refer to code or processes running within Docker, that means they are running in the Docker host.

To see all of this theory in action, create a Dockerfile in the root directory of your project (the hello directory if you are following along with me):

$ touch Dockerfile

Within the Dockerfile add the following code which we'll walk through line-by-line below.

# Pull base image
FROM python:3.10

# Set environment variables
ENV PYTHONUNBUFFERED 1
ENV PYTHONDONTWRITEBYTECODE 1

# Set work directory
WORKDIR /code

# Install dependencies
COPY Pipfile Pipfile.lock /code/
RUN pip install pipenv && pipenv install --system

# Copy project
COPY . /code/

Dockerfiles are read from top to bottom when an image is created. The first instruction must be the FROM command, which imports a base image to use for our image, in this case Python 3.10.

Then we use the ENV command to set two environment variables:

  • PYTHONUNBUFFERED ensures our console output is not buffered by Docker, so it looks familiar
  • PYTHONDONTWRITEBYTECODE means Python will not try to write .pyc files, which we also do not want

Next we use WORKDIR to set a default work directory path within our image called code which is where we will store our code. If we didn't do this then each time we wanted to execute commands within our container we'd have to type in a long path. Instead Docker will just assume we mean to execute all commands from this directory.

For our dependencies we are using Pipenv so we copy over both the Pipfile and Pipfile.lock files into a /code/ directory in Docker.

Moving along we use the RUN command to first install Pipenv and then pipenv install to install the software packages listed in our Pipfile.lock, currently just Django. It's important to add the --system flag as well since by default Pipenv will look for a virtual environment in which to install any package, but since we're within Docker now, technically there isn't any virtual environment. In a way, the Docker container is our virtual environment and more. So we must use the --system flag to ensure our packages are available throughout all of Docker for us.

As the final step we copy over the rest of our local code into the /code/ directory within Docker. Why do we copy local code over twice, first the Pipfile and Pipfile.lock and then the rest? The reason is that images are built from instructions top-down, so we want the things that change often, like our local code, to be last.

Our image instructions are now done, so let's build the image using the command docker build . (the period, ., indicates the current directory is where to execute the command). There will be a lot of output here; I've only included the first two lines and the last three.

$ docker build .
Sending build context to Docker daemon 154.1kB
Step 1/7 : FROM python:3.10
Step 7/7 : COPY . /code/
---> a48b2acb1fcc
Successfully built a48b2acb1fcc

Moving on we now need to create a docker-compose.yml file to control how to run the container that will be built based upon our Dockerfile image.

$ touch docker-compose.yml

It will contain the following code.

version: '3.8'

services:
    web:
        build: .
        command: python /code/manage.py runserver
        volumes:
            - .:/code
        ports:
            - 8000:8000

On the top line we specify the Docker Compose file format version, currently 3.8.

Then we specify which services (or containers) we want to have running within our Docker host. It's possible to have multiple services running, but for now we just have one for web; we'll add another container later on when configuring our database (PostgreSQL).

Multiple Docker containers come in handy when you need several servers running at the same time. In most cases you need one container for your local web server and another for the database, but sometimes you may also need a caching system or background processes in your Django project. In that case you'll definitely need a separate container to run, for example, a Redis server, and with a single command you can get all of your servers (containers) up and running!

We specify how to build the container by saying: look in the current directory . for the Dockerfile. Then, within the container, run the command to start up the local server.

The volumes mount automatically syncs the Docker filesystem with our local computer's filesystem. This means that we don't have to rebuild the image each time we change a single file!

Lastly we specify the port to expose within Docker, 8000, which is the Django default.

If this is your first time using Docker, it is highly likely you are confused right now. Don't worry, you just need some practice and the flow will start to make more sense.

The final step is to start our Docker container using the command docker-compose up. This command will result in another long stream of output code on the command line.

$ docker-compose up
Creating network "hello_default" with the default driver
Building web
Step 1/7 : FROM python:3.10
Creating hello_web_1 ... done
Attaching to hello_web_1
web_1 | Performing system checks...
web_1 |
web_1 | System check identified no issues (0 silenced).
web_1 | February 17, 2023 - 17:21:57
web_1 | Django version 4.1.5, using settings 'hello_project.settings'
web_1 | Starting development server at
web_1 | Quit the server with CONTROL-C.

To confirm it actually worked, go back to in your web browser and refresh the page.

Django is now running purely within a Docker container. We are not working within a virtual environment locally. We did not execute the runserver command. All of our code now exists and our Django server is running within a self-contained Docker container.

Stop the container with Control+c (press the “Control” and “c” keys at the same time) and additionally type docker-compose down. Docker containers take up a lot of memory, so it's a good idea to stop them in this way when you're done using them. Containers are meant to be stateless, which is why we use volumes to copy our code over locally where it can be saved.

$ docker-compose down
Removing hello_web_1 ... done
Removing network hello_default

That's all there is to it. In the next section we'll add PostgreSQL in a separate container as our database.


One of the most immediate differences between working on a “toy app” in Django and a production-ready one is the database. Make sure to always use the same database engine everywhere: a common developer pitfall is using SQLite3 for local development and PostgreSQL (or MySQL) in production. This advice applies not only to the SQLite3/PostgreSQL scenario but to any scenario where you use two different databases and expect them to behave identically.

Here are some of the issues you may very likely encounter when using different database engines for development and production:

  • You Can't Examine an Exact Copy of Production Data Locally: Sure, you can generate an SQL dump from production and import it into your local database, but that doesn't mean you have an exact copy after the export and import.

  • Different Databases Have Different Field Types/Constraints: Keep in mind that different databases handle type casting of field data differently. Django’s ORM attempts to accommodate those differences, but there's only so much that it can do.
  • Fixtures Are Not a Magic Solution: Fixtures are not a reliable tool for migrating large data sets from one database to another in a database-agnostic way. They are simply not meant to be used that way. Don't mistake the ability of fixtures to create basic data (dumpdata/loaddata) with the capability to migrate production data between database tools.
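As a tiny self-contained illustration of the second point, here is a sketch using Python's standard sqlite3 module (separate from our Django project): SQLite happily stores a string in an INTEGER column, while PostgreSQL would reject the same insert with an error.

```python
import sqlite3

# SQLite uses dynamic "type affinity": a value that cannot be coerced to the
# declared column type is stored as-is. PostgreSQL would raise an error here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE demo (n INTEGER)")
conn.execute("INSERT INTO demo (n) VALUES ('not a number')")  # succeeds on SQLite

stored = conn.execute("SELECT n, typeof(n) FROM demo").fetchone()
print(stored)  # ('not a number', 'text')
```

A Django model backed by SQLite would accept this data silently during development, and the same code would then blow up in production on PostgreSQL.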

We'll be using PostgreSQL in this article as it is the most popular choice among Django developers. However, the beauty of Django's ORM is that even if we wanted to use MySQL or Oracle, the actual Django code we write would be almost identical. The Django ORM handles the translation from Python code to the database for us, which is quite amazing if you think about it.

The challenge of using any of these databases (PostgreSQL, MySQL, or Oracle) is that each must be both installed and run locally if you want to faithfully mimic a production environment on your local computer.

In this section we'll start a new Django project with a SQLite database and then switch over to both Docker and PostgreSQL.

On the command line make sure you've navigated back to the code folder on our Desktop. You can do this two ways: either type cd .. to move “up” a level, so if you are currently in Desktop/code/hello you will move to Desktop/code, or simply type cd ~/Desktop/code/ which will take you directly to the desired directory. Then create a new directory called postgresql for this section's code.

$ cd ..
$ mkdir postgresql && cd postgresql

Now install Django, create and activate the virtual environment, and create a basic Django project called config. Don't forget the period . at the end of the command!

$ pipenv install django==4.1.5
$ pipenv shell
(postgresql) $ django-admin startproject config .

So far so good. At this point developers usually run the migrate command to initialize the database. That's bad practice here: it's not recommended until after a custom user model has been configured, because otherwise Django binds the database to the built-in User model, which is difficult to modify later on in the project.

We'll configure a custom user model later on and initialize the database! Now let's start the local server:

(postgresql) $ python manage.py runserver

Navigate to to make sure everything worked! Finally, stop the local server with Control+c and then use the ls command to list all files and directories.

(postgresql) $ ls
Pipfile  Pipfile.lock  config  db.sqlite3  manage.py


To switch over to Docker first exit our virtual environment and then create Dockerfile and docker-compose.yml files which will control our Docker image and container respectively.

(postgresql) $ exit
$ touch Dockerfile
$ touch docker-compose.yml

The Dockerfile and docker-compose.yml are both the same as in the previous project, so just copy-paste their code into the respective files.

Go ahead and build the initial image from Dockerfile now using the docker build . command.

We'll start up our container now, but this time in detached mode, which requires either the -d or --detach flag (they do the same thing).

$ docker-compose up -d

Detached mode runs containers in the background, which means we can use a single command line tab without needing a separate one open as well. This saves us from switching back and forth between two command line tabs constantly.

Logs do not show up in detached mode; to take a look at them, just type docker-compose logs.

You will likely see a “Warning: Image for service web was built because it did not already exist” message at the bottom of the command output. Docker automatically created a new image for us within the container. As we'll see later in the article, adding the --build flag to force an image build is necessary when software packages are updated because, by default, Docker looks for a locally cached copy of the software and uses it, which improves performance.

To confirm things are working properly go back to in your web browser. Refresh the page to see the Django welcome page again.

So far we've been updating our database, currently represented by the db.sqlite3 file, within Docker. That means the actual db.sqlite3 file is changing each time, and thanks to the volumes mount in our docker-compose.yml config, each file change has been copied over into a db.sqlite3 file on our local computer too. You could quit Docker, activate the virtual environment shell, start the server with python manage.py runserver, and see the exact same data at this point because the underlying SQLite database is the same.

Switching to PostgreSQL

Now it's time to switch over to PostgreSQL for our project which takes three additional steps:

  • install a database adapter, psycopg2, so Python can talk to PostgreSQL
  • update the DATABASES config in our config/settings.py file
  • install and run PostgreSQL locally

Let's get started. Stop the running Docker container with docker-compose down.

$ docker-compose down
Stopping postgresql_web_1 ... done
Removing postgresql_web_1 ... done
Removing network postgresql_default

Then within our docker-compose.yml file add a new service called db. This means there will be two separate services, each a container, running within our Docker host: web for the Django local server and db for our PostgreSQL database.

The PostgreSQL version will be pinned to the latest version, 15. If we had not specified a version number and had instead used just postgres, the latest version of PostgreSQL would be downloaded, even if at a later date that is Postgres 16, which will likely have different requirements.

Then we add a depends_on line to our web service since it literally depends on the database to run. This means that db will be started up before web. Finally we add a volumes mount for our db service to persist the database information.

version: '3.8'

services:
    web:
        build: .
        command: python /code/manage.py runserver
        volumes:
            - .:/code
        ports:
            - 8000:8000
        depends_on:
            - db
    db:
        image: postgres:15
        volumes:
            - postgres_data:/var/lib/postgresql/data/

volumes:
    postgres_data:


Now run docker-compose up -d --build which will force rebuild our image and spin up two containers, one running PostgreSQL within db and the other our Django web server.

$ docker-compose up -d --build
Creating network "postgresql_default" with the default driver
Creating postgresql_db_1 ... done
Creating postgresql_web_1 ... done


With your text editor, open the config/settings.py file and scroll down to the DATABASES config. The current setting is this:

# config/settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    }
}

To switch over to PostgreSQL we will update the ENGINE configuration. PostgreSQL requires a NAME, USER, PASSWORD, HOST, and PORT.

It's insecure to hard-code database credentials in settings; the recommended way is to use environment variables. All we have to do is define our environment variables in the docker-compose.yml file like this:

version: '3.8'

services:
    web:
        build: .
        command: python /code/manage.py runserver
        volumes:
            - .:/code
        ports:
            - 8000:8000
        environment:
            - POSTGRES_NAME=postgres
            - POSTGRES_USER=postgres
            - POSTGRES_PASSWORD=postgres
        depends_on:
            - db
    db:
        image: postgres:15
        environment:
            - POSTGRES_NAME=postgres
            - POSTGRES_USER=postgres
            - POSTGRES_PASSWORD=postgres
        volumes:
            - postgres_data:/var/lib/postgresql/data/

volumes:
    postgres_data:

Then read them in our config/settings.py file like this:

# config/settings.py

# import the os module at the top of the file
import os

# then jump to the database settings section
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_NAME'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': 'db',
        'PORT': 5432,
    }
}

If you refresh the page now and run the command docker-compose logs, you will see the error ModuleNotFoundError: No module named 'psycopg2', which tells us we haven't installed the psycopg2 driver yet. Psycopg is the database adapter Python uses to connect to PostgreSQL itself.

We can install Psycopg with Pipenv. On the command line, enter the following command so it is installed within our Docker host.

$ docker-compose exec web pipenv install psycopg2-binary==2.9.5

Why install within Docker rather than locally, you may ask? The short answer is that consistently installing new software packages within Docker and then rebuilding the image from scratch saves us from potential Pipfile.lock conflicts.

The Pipfile.lock generation depends heavily on the OS being used. We've specified our entire OS within Docker, including Python 3.10. But if you install psycopg2 locally on your computer, which has a different environment, the resulting Pipfile.lock will also be different. The volumes mount in our docker-compose.yml file, which automatically syncs the local and Docker filesystems, would then cause the local Pipfile.lock to overwrite the version within Docker. Our Docker container would then be trying to use an incorrect Pipfile.lock!

To avoid that, after executing the previous command for installing new packages, we must stop the Docker containers and rebuild the image from scratch:

$ docker-compose down
$ docker-compose up -d --build

When installing new packages, remember this flow: first install the package within Docker, then stop the containers using docker-compose down, and finally rebuild the image from scratch using docker-compose up --build (or -d --build for detached mode).

If you refresh the homepage again, the Django welcome page at now works! That's because Django has successfully connected to PostgreSQL via Docker.

If you look at the current logs again by typing docker-compose logs, you'll see a database complaint like “You have 15 unapplied migration(s)”. That's because we haven't initialized the database with the migrate command yet; we first need to create a custom user model before initializing the database. Let's do that!

Custom User Model

Time to implement a custom user model. Why? Because you will need to make changes to the built-in User model at some point in your project's life.

If you have not started with a custom user model from the very first migrate command you run, then you're in for a world of hurt, because User is tightly interwoven with the rest of Django's internals. It is challenging to switch over to a custom user model mid-project. Read more here.

When it comes to customizing Django's built-in user, as with many things Django-related, there are implementation choices: either extend AbstractUser, which keeps the default User fields and permissions, or extend AbstractBaseUser, which is even more granular and flexible but requires more work.

We'll stick with the simpler AbstractUser in this article as AbstractBaseUser can be added later if needed. There are four steps for adding a custom user model to our project:

  1. Create a CustomUser model
  2. Update config/settings.py
  3. Customize UserCreationForm and UserChangeForm
  4. Add the custom user model to users/admin.py

The first step is to create a CustomUser model which will live within its own app. We could do this locally within our virtual environment shell, meaning we'd run pipenv shell and then python manage.py startapp users. However, for consistency we'll run the majority of our commands within Docker itself.

Since we're working within Docker now as opposed to locally, we must preface traditional commands with docker-compose exec [service], where we specify the name of the service, as follows:

$ docker-compose exec web python manage.py startapp users

Create a new CustomUser model which extends AbstractUser. We're not making any changes yet so include the Python pass statement which acts as a placeholder for our future code.

# users/models.py
from django.contrib.auth.models import AbstractUser
from django.db import models

class CustomUser(AbstractUser):
    pass

Now go in and update our config/settings.py file: in the INSTALLED_APPS section, tell Django about our new users app, and add an AUTH_USER_MODEL config at the bottom of the file, which will cause our project to use CustomUser instead of the default User model.

# config/settings.py
INSTALLED_APPS = [
    ...
    # Local
    'users.apps.UsersConfig', # new
]

AUTH_USER_MODEL = 'users.CustomUser' # new

Time to create a migrations file for the changes. We'll add the optional app name users to the command so that only changes to that app are included.

$ docker-compose exec web python manage.py makemigrations users
Migrations for 'users':
    users/migrations/0001_initial.py
        - Create model CustomUser

Then run migrate to initialize the database for the very first time.

$ docker-compose exec web python manage.py migrate

Custom User Forms

A user model can be both created and edited within the Django admin. So we'll need to update the built-in forms too to point to CustomUser instead of User.

Create a users/forms.py file.

$ touch users/forms.py

In your text editor type in the following code to switch over to CustomUser.

# users/forms.py
from django.contrib.auth import get_user_model
from django.contrib.auth.forms import UserCreationForm, UserChangeForm

class CustomUserCreationForm(UserCreationForm):
    class Meta:
        model = get_user_model()
        fields = ('email', 'username',)

class CustomUserChangeForm(UserChangeForm):
    class Meta:
        model = get_user_model()
        fields = ('email', 'username',)

At the very top we've imported the CustomUser model via get_user_model, which looks at the AUTH_USER_MODEL config in config/settings.py. This might feel more circular than directly importing CustomUser here, but it enforces the idea of making one single reference to the custom user model rather than referring to it directly all over our project.

Next we import UserCreationForm and UserChangeForm which will both be extended.

Then we create two new forms, CustomUserCreationForm and CustomUserChangeForm, that extend the base user forms imported above, swap in our CustomUser model, and display the fields email and username. The password field is implicitly included by default and so does not need to be named explicitly here.

Custom User Admin

Finally we have to update our users/admin.py file. The admin is a common place to manipulate user data, and there is tight coupling between the built-in User and the admin.

We'll extend the existing UserAdmin into CustomUserAdmin and tell Django to use our new forms, custom user model, and list only the email and username of a user. If we wanted to we could add more of the existing User fields to list_display such as is_staff.

# users/admin.py
from django.contrib import admin
from django.contrib.auth import get_user_model
from django.contrib.auth.admin import UserAdmin

from .forms import CustomUserCreationForm, CustomUserChangeForm

CustomUser = get_user_model()

class CustomUserAdmin(UserAdmin):
    add_form = CustomUserCreationForm
    form = CustomUserChangeForm
    model = CustomUser
    list_display = ['email', 'username',]

admin.site.register(CustomUser, CustomUserAdmin)

A bit of code upfront but this saves a ton of heartache later on.


A good way to confirm our custom user model is up and running properly is to create a superuser account so we can log into the admin. This command will access CustomUserCreationForm under the hood.

$ docker-compose exec web python manage.py createsuperuser

I've used the username zedkira47, an email address of my own, and the password testpass123. You can use your own preferred values here.

Now go to and confirm that you can log in. You should see your superuser name in the upper right corner of the post-login page.


Since we've added new functionality to our project we should test it. Whether you are a solo developer or working on a team, tests are important. In the words of Django co-founder Jacob Kaplan-Moss, “Code without tests is broken as designed.”

There are two main types of tests:

  • Unit tests are small, fast, and isolated to a specific piece of functionality
  • Integration tests are large, slow, and used for testing an entire application or a user flow, like a payment flow that covers multiple screens

You should write many unit tests and a small number of integration tests. There is no excuse for not writing a lot of tests; they will save you time. It's important to note that not everything needs to be tested. For example, built-in Django features already contain tests in the source code. If we were using the default User model in our project we would not need to test it, but since we've created a CustomUser model we should.

Unit Tests

To write unit tests in Django we use TestCase which is, itself, an extension of Python's TestCase. Our users app already contains a tests.py file that was automatically added when the startapp command was run. Let's add some tests!

Each method must be prefaced with test in order to be run by the Django test suite. It is also a good idea to be overly descriptive with your unit test names since mature projects have hundreds if not thousands of tests!

# users/tests.py
from django.contrib.auth import get_user_model
from django.test import TestCase

class CustomUserTests(TestCase):

    def setUp(self):
        # the email addresses and passwords below are placeholder values
        User = get_user_model()
        self.user = User.objects.create_user(
            username='zedkira',
            email='zedkira@email.com',
            password='testpass123',
        )
        self.admin_user = User.objects.create_superuser(
            username='superadmin',
            email='superadmin@email.com',
            password='testpass123',
        )

    def test_create_user(self):
        self.assertEqual(self.user.username, 'zedkira')
        self.assertEqual(self.user.email, 'zedkira@email.com')
        self.assertFalse(self.user.is_staff)
        self.assertFalse(self.user.is_superuser)

    def test_create_superuser(self):
        self.assertEqual(self.admin_user.username, 'superadmin')
        self.assertEqual(self.admin_user.email, 'superadmin@email.com')
        self.assertTrue(self.admin_user.is_staff)
        self.assertTrue(self.admin_user.is_superuser)

Since the tests are executed top-to-bottom, we added a setUp method that runs before every test; there we initialize objects that can be used in all the test methods we create.

To run our tests within Docker we'll prefix the traditional command python manage.py test with docker-compose exec web.

$ docker-compose exec web python manage.py test
Creating test database for alias 'default'...
System check identified no issues (0 silenced).
..
----------------------------------------------------------------------
Ran 2 tests in 0.268s

OK
Destroying test database for alias 'default'...

All the tests passed. We've accomplished quite a lot so far!

Serving Static & Media Assets

A further step would be to store media files in a dedicated CDN (Content Delivery Network) for additional security. This can also help performance: for static files it matters on very large sites, but for media files it is a good idea regardless of the size.

But if your bottleneck is CPU load on the origin server, and not bandwidth, a CDN may not be the most appropriate solution. In this case, local caching using popular caches such as NGINX or Varnish may significantly reduce load by serving assets from system memory.

Before rolling out a CDN, additional optimization steps — like minifying and compressing JavaScript and CSS files, and enabling web server HTTP request compression — can also have a significant impact on page load times and bandwidth usage.

Performance In Django

The first priority for any website is that it must work properly and contain proper tests. But if your project is fortunate enough to receive a large amount of traffic the focus quickly shifts to performance and making things as efficient as possible. This is a fun and challenging exercise for many engineers, but it can also be a trap.

In this chapter we'll focus on the broad strokes of Django-related performance and highlight areas worth further investigation at scale. Generally speaking performance comes down to four major areas: optimizing database queries, caching, indexes, and compressing front-end assets like images, JavaScript, and CSS.

Performance Benchmarking

Before we can optimize performance, we first need to know where the inefficiencies lie in our code. django-debug-toolbar is a very handy tool that provides insights into what your code is doing and how much time it spends doing it. It comes with a configurable set of panels to inspect the complete request/response cycle of any given page.

We can install it within Docker and stop our running containers.

$ docker-compose exec web pipenv install django-debug-toolbar==3.8.1
$ docker-compose down

There are three separate configurations to set in our config/settings.py file (still in the postgresql project):

  1. Add Debug Toolbar to the INSTALLED_APPS configuration. Note that the proper name is debug_toolbar not django_debug_toolbar as might be expected.

        # config/settings.py
        INSTALLED_APPS = [
            ...
            # Third-party
            'debug_toolbar', # new
            # Local
            'users',
        ]

  2. Add Debug Toolbar to the Middleware where it is primarily implemented.

    # config/settings.py
    MIDDLEWARE = [
        ...
        'debug_toolbar.middleware.DebugToolbarMiddleware', # new
    ]

  3. And set the INTERNAL_IPS as well. If we were not in Docker this could be set to '127.0.0.1'; however, since we're running our web server within Docker, an additional step is required so that it matches the machine address of Docker. Add the following lines at the bottom of config/settings.py:

    # config/settings.py
    # django-debug-toolbar
    import socket
    hostname, _, ips = socket.gethostbyname_ex(socket.gethostname())
    INTERNAL_IPS = [ip[:-1] + "1" for ip in ips]

  4. Now rebuild the base image so it contains the package and the updated settings configuration.

    $ docker-compose up -d --build

  5. There's one last step and that is to update our URLconf. We only want Debug Toolbar to appear if DEBUG is true so we'll add logic to display it only in this case.

    # config/urls.py
    from django.conf import settings # new
    from django.contrib import admin
    from django.urls import path, include # new

    urlpatterns = [
        path('admin/', admin.site.urls),
    ]

    if settings.DEBUG: # new
        import debug_toolbar
        urlpatterns = [
            path('__debug__/', include(debug_toolbar.urls)),
        ] + urlpatterns

  6. Now if you refresh the homepage you'll see a Page Not Found error, because only two URLs are configured so far: one for the admin and one for django-debug-toolbar. Let's add a home page in the users app!

    # config/urls.py
    urlpatterns = [
        path('admin/', admin.site.urls),
        # Local Apps
        path('', include('users.urls')), # new
    ]

  7. Create a urls.py file in the users app and add the following code:

    # users/urls.py
    from django.urls import path

    from .views import HomePageView

    urlpatterns = [
        path('', HomePageView.as_view(), name='home'),
    ]

  8. Now let's create the HomePageView in the users app's views.py file:

    # users/views.py
    from django.views.generic.base import TemplateView # new

    class HomePageView(TemplateView): # new
        template_name = "home.html" # new

  9. Instead of creating a templates directory in each app, a good practice is to configure Django to look for templates in a single folder in the project's root directory, which we will create and name 'templates'. Go to config/settings.py and configure that as follows:

    # config/settings.py
    TEMPLATES = [
        {
            'BACKEND': 'django.template.backends.django.DjangoTemplates',
            'DIRS': [os.path.join(BASE_DIR, 'templates')], # new
            'APP_DIRS': True,
            'OPTIONS': {
                'context_processors': [
                    ...
                ],
            },
        },
    ]

  10. Create a folder in the root directory of the current project (inside postgresql directory) and name it 'templates':

    $ mkdir templates 
    $ ls
    config  docker-compose.yml  Dockerfile  Pipfile  Pipfile.lock  templates  users

  11. Finally, create a home.html file inside the templates folder and add the following code:

    # templates/home.html
    <!DOCTYPE html>
    <html>
    <head>
      <title>Home Page</title>
    </head>
    <body>
      <h1>Django Performance</h1>
      <p>Hello {{ request.user }}.</p>
    </body>
    </html>

    Navigate to http://127.0.0.1:8000/ and you should see a dummy home page with the django-debug-toolbar on the righthand side.

Analyzing Pages

In the toolbar we can see the current version of Django being used as well as the time it took to load the page, along with the specific view that was called, which was HomePageView. This may seem obvious, but on large codebases, especially if you are jumping in as a new developer, it may not be obvious which view is serving which page. Debug Toolbar is a helpful quickstart to understanding existing sites.

Probably the most useful item, however, is SQL, which shows all the SQL queries being run and how long each takes. If you click on it, even more data appears.

Large and poorly optimized sites can have hundreds or even thousands of queries being run on a single page!

select_related and prefetch_related

Generally speaking, in a Django web application, the fewer requests that reach your database, the faster the response will be. Database queries are the slowest operations in Django, so you should try to minimize them and make them as efficient as possible.

But what should you do if you find yourself working on a Django site with far too many SQL queries per page? In general, fewer large queries will be faster than many smaller queries, though it's important to measure this in practice. Two common techniques for doing so are select_related() and prefetch_related().

select_related is used for single-value relationships through a forward one-to-many or a one-to-one relationship. It creates a SQL join and includes the fields of the related object in the SELECT statement, which results in all related objects being included in a single more complex database query. This single query is typically more performant than multiple, smaller queries.

prefetch_related is used for a set or list of objects like a many-to-many or many-to-one relationship. Under the hood a lookup is done for each relationship and the “join” occurs in Python, not SQL. This allows it to prefetch many-to-many and many-to-one objects, which cannot be done using select_related, in addition to the foreign key and one-to-one relationships that are supported by select_related.

Implementing one or both on a website is a common first pass towards reducing queries and loading time for a given page.
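As a sketch of how the two methods look in practice, consider hypothetical Author and Book models where each Book has a ForeignKey to its Author. These models are illustrative, not part of this project, and the snippet assumes a configured Django project context:

```python
# Hypothetical models, for illustration only
from django.db import models

class Author(models.Model):
    name = models.CharField(max_length=100)

class Book(models.Model):
    title = models.CharField(max_length=100)
    author = models.ForeignKey(Author, on_delete=models.CASCADE)

# The N+1 problem: one query for the books, plus one extra
# query per book each time book.author is first accessed
for book in Book.objects.all():
    print(book.author.name)

# select_related: a single JOINed query fetches every book
# together with its author in one round trip
for book in Book.objects.select_related('author'):
    print(book.author.name)  # no additional queries

# prefetch_related: two queries total, "joined" in Python;
# works for many-to-many and reverse foreign-key relations,
# which select_related cannot handle
for author in Author.objects.prefetch_related('book_set'):
    print([book.title for book in author.book_set.all()])
```

The django-debug-toolbar SQL panel is a convenient way to confirm the drop in query count after applying either method.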


Caching

A cache is an in-memory store of the result of an expensive calculation. Once executed, it doesn't need to be run again! The two most popular options are Memcached, which features native Django support, and Redis, which is commonly implemented with the django-redis third-party package.

Consider a dynamic website. Each time a user requests a page the server has to make various calculations including database queries, template rendering, and so on before servicing it. This takes time and is much slower than simply reading a file from a static site where the content does not change.

On large sites, though, this type of overhead can be quite slow and caching is one of the first solutions in a web developer's tool bag.

Django has its own cache framework which includes four different caching options in descending order of granularity:

  1. The per-site cache is the simplest to set up and caches your entire site.
  2. The per-view cache lets you cache individual views.
  3. Template fragment caching lets you specify a specific section of a template to cache.
  4. The low-level cache API lets you manually set, retrieve, and maintain specific objects in the cache.
Why not just cache everything all the time? One reason is that cache memory is expensive, as it's stored as RAM: think about the cost of going from 8GB to 16GB of RAM on your laptop vs. 256GB to 512GB of hard drive space. Another is the cache must be “warm,” that is filled with updated content, so depending upon the needs of a site, optimizing the cache so it is accurate, but not wasteful, takes quite a bit of tuning.
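As a brief sketch of option 2, the per-view cache can be applied directly in a URLconf with the cache_page decorator. The Redis address and 15-minute timeout below are illustrative, and django-redis is assumed to be installed:

```python
# config/settings.py (sketch): point Django's cache at Redis
# via the django-redis package
CACHES = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': 'redis://127.0.0.1:6379/1',  # illustrative address
    }
}

# users/urls.py (sketch): cache the rendered home page for 15 minutes
from django.urls import path
from django.views.decorators.cache import cache_page

from .views import HomePageView

urlpatterns = [
    path('', cache_page(60 * 15)(HomePageView.as_view()), name='home'),
]
```

Because the decorator wraps a single view, other pages remain fully dynamic, which is why the per-view cache is usually a safer first step than the per-site cache.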


Indexes

Indexing is a common technique for speeding up database performance. An index is a separate data structure that allows faster searches and is typically applied only to the primary key in a model. The downside is that indexes require additional space on disk, so they must be used with care.

Tempting as it is to simply add indexes to primary keys from the beginning, it is better to start without them and only add them later based on production needs. A general rule of thumb is that if a given field is being used frequently, such as 10-25% of all queries, it is a prime candidate to be indexed.

An index can be added to any model field by setting db_index=True:

from django.db import models
import uuid

class Book(models.Model):
    id = models.UUIDField(
        primary_key=True,
        db_index=True, # new
        default=uuid.uuid4,
        editable=False,
    )

You can also add them in the Meta section:

from django.db import models
import uuid

class Book(models.Model):
    id = models.UUIDField(
        primary_key=True,
        default=uuid.uuid4,
        editable=False,
    )

    class Meta:
        indexes = [
            models.Index(fields=['id'], name='id_index'),
        ]


django-extensions

Another very popular third-party package for inspecting a Django project is django-extensions, which adds a number of helpful custom extensions. One that is particularly helpful is shell_plus, which autoloads all models into the shell and makes working with the Django ORM much easier.

Front-end Assets

Another major source of bottlenecks in a website is loading front-end assets. CSS and JavaScript can become quite large and therefore tools like django-compressor can help to minimize their size. But for truly large sites it is worth investigating the use of a Content Delivery Network (CDN).

References & Resources

Here are some helpful resources for a deep dive into Django performance:

Security In Django

When it comes to security, Django has a pretty good record. This is due to security tools provided by Django, solid documentation on the subject of security, and a thoughtful team of core developers who are extremely responsive to security issues. However, it's up to individual Django developers such as ourselves to understand how to properly secure Django powered applications.

This section contains a list of things helpful for securing your Django application. This list is by no means complete. Consider it a starting point.

Social Engineering

The biggest security risk to any website is ultimately not technical: it is people. The term social engineering refers to the technique of finding individuals with access to a system who will willingly or unwillingly share their login credentials with a bad actor.

All it takes is one bad click on an email link for a malicious actor to potentially gain access to the system, or at least all the access the compromised employee has.

To mitigate this risk, implement a robust permissions scheme and grant only the exact access an employee needs, no more. Does every engineer need access to the production database? Probably not. Do non-engineers need write access? Again, probably not. These discussions are best had up front, and a good default is to add permissions only as needed, not to default to superuser status for everyone!

Harden Your Servers

Search online for instructions and checklists for server hardening. Server hardening measures include, but are not limited to, setting up firewalls, changing your SSH port, and disabling/removing unnecessary services.

Read More On Django's Security Features

Django ships with a number of security protections out of the box. They include:

  • Cross-site scripting (XSS) protection.
  • Cross-site request forgery (CSRF) protection.
  • SQL injection protection.
  • Clickjacking protection.
  • Support for TLS/HTTPS/HSTS, including secure cookies.
  • Automatic HTML escaping.
  • An expat parser hardened against XML bomb attacks.
  • Hardened JSON, YAML, and XML serialization/deserialization tools.
  • Secure password storage, using the PBKDF2 algorithm with a SHA256 hash by default.
Most of Django's security features “just work” out of the box without additional configuration, but there are certain things that you'll need to configure.

Turn Off DEBUG Mode in Production

Your production site should not be running in DEBUG mode. Attackers can find out more than they need to know about your production setup from a helpful DEBUG mode stack trace page.

Keep in mind that when you turn off DEBUG mode, you will need to set ALLOWED_HOSTS or risk raising a SuspiciousOperation error, which generates a 400 BAD REQUEST error that can be hard to debug.


ALLOWED_HOSTS

This setting controls the host/domain names our Django site can serve. It likely sits right below DEBUG in the config/settings.py file. By default in development it is set to [], an empty list. But in production, when DEBUG is False, it must be set explicitly and include values.

There are two ways we access the site locally: via either 127.0.0.1 or localhost. In production, you'll also need to add your provider's host name. For instance, if you are hosting your project on Heroku's free tier, here's what your ALLOWED_HOSTS list would look like:

# config/settings.py
ALLOWED_HOSTS = ['.herokuapp.com', 'localhost', '127.0.0.1']

To confirm, spin down the Docker host now and restart it!

Keep Your Secret Keys Secret

If the SECRET_KEY setting is not secret, depending on project setup, we risk an attacker gaining control of other people's sessions, resetting passwords, and more. Our API keys and other secrets should be carefully guarded as well. These keys should not even be kept in version control.

Now, let's take a look at how to generate a more secure Django SECRET_KEY and store it in a Docker environment variable. All of your secrets, including API keys, should be stored in environment variables rather than in code.

  1. The first thing to do is to generate a secure Django SECRET_KEY. Open a new command prompt or terminal, run the following command, and copy the random secret key it outputs:

    $ python -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"

  2. In your text editor, open the docker-compose.yml file and add the newly generated secret key to the web service's environment as follows:

    # postgresql/docker-compose.yml
    version: '3.9'
    services:
      web:
        build: .
        command: python /code/manage.py runserver 0.0.0.0:8000
        volumes:
          - .:/code
        ports:
          - 8000:8000
        environment:
          - SECRET_KEY=h0$$mzfm=818osja*7vz$$jn)2zh%2zm#23u-38o980jc3^gcok
          - POSTGRES_NAME=postgres
          - POSTGRES_USER=postgres
          - POSTGRES_PASSWORD=postgres
        depends_on:
          - db

  3. Finally, change the SECRET_KEY in the config/settings.py file:

    # config/settings.py
    import os  # if not already imported at the top of the file
    SECRET_KEY = os.environ.get('SECRET_KEY')

Web Security: SQL Injection

A SQL injection attack occurs when a malicious user can execute arbitrary SQL code on a database. Consider a login form on a site. What happens if a malicious user instead types DELETE FROM users WHERE user_id=user_id? If this is run against the database without proper protections, it could result in the deletion of all user records!

Fortunately, the Django ORM automatically sanitizes user inputs when constructing querysets, which prevents this type of attack. Where you need to be careful is that Django does provide the option to execute custom SQL or raw queries; both should be used with extreme caution since they could open up a vulnerability to SQL injection.
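To make the contrast concrete, here is a hedged sketch of a raw query written unsafely versus safely. The table name users_customuser is what Django would generate for a CustomUser model in a users app, and the find_user helper is hypothetical:

```python
from django.contrib.auth import get_user_model

User = get_user_model()

def find_user(untrusted_name):
    # UNSAFE: interpolating user input directly into SQL invites
    # injection, e.g. untrusted_name could be:
    #   x'; DELETE FROM users_customuser; --
    #
    # User.objects.raw(
    #     f"SELECT * FROM users_customuser WHERE username = '{untrusted_name}'"
    # )

    # SAFE: pass the value as a parameter so the database driver
    # escapes it before it ever reaches the SQL statement
    return User.objects.raw(
        "SELECT * FROM users_customuser WHERE username = %s",
        [untrusted_name],
    )
```

The same parameterization rule applies to cursor.execute() when using Django's low-level database connection.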

XSS (Cross Site Scripting)

XSS is another classic attack that occurs when an attacker is able to inject small bits of code onto web pages viewed by other people. This code, typically JavaScript, if stored in the database will then be retrieved and displayed to other users.

To prevent an XSS attack Django templates automatically escape specific characters that are potentially dangerous including brackets (< and >), single quotes ', double quotes ", and the ampersand &. There are some edge cases where you might want to turn autoescape off but this should be used with extreme caution.
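Django's template autoescaping performs the same substitutions as Python's built-in html.escape, which makes the effect easy to see outside a template. This standalone snippet only illustrates the escaping rules; it is not Django's own code path:

```python
from html import escape

# A classic stored-XSS payload a malicious user might submit
payload = '<script>alert("xss")</script>'

# With quote=True, the characters <, >, &, " and ' are all
# replaced with HTML entities, so a browser renders the payload
# as inert text instead of executing it
print(escape(payload, quote=True))
# &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;
```

Because the escaped string contains no live tags, the injected script never runs when the page is displayed to other users.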

One step we do want to take is to set SECURE_BROWSER_XSS_FILTER to True which will use the X-XSS-Protection Header to help guard against XSS attacks.

# config/settings.py

# production
if ENVIRONMENT == 'production':
    SECURE_BROWSER_XSS_FILTER = True # new

Cross-Site Request Forgery (CSRF)

This is the third major type of attack, though generally lesser known than SQL injection or XSS. Fundamentally, it exploits the trust a site has in a user's web browser.

When a user logs in to a website, let's call it a banking website for illustration purposes, the server sends back a session token for that user. This is included in the HTTP Headers of all future requests and authenticates the user. But what happens if a malicious actor somehow obtains access to this session token?

In practice there are multiple ways to obtain a user's credentials via a CSRF attack, not just links, but hidden forms, special image tags, and even AJAX requests.

Django provides CSRF protection by including a random secret key both as a cookie via CSRF Middleware and in a form via the csrf_token template tag. A 3rd party website will not have access to a user's cookies and therefore any discrepancy between the two keys causes an error.

As always, Django does allow customization: you can disable the CSRF middleware and apply the csrf_protect() decorator to specific views instead. However, undertake this step with extreme caution.

Clickjacking Protection

Clickjacking is yet another attack when a malicious site tricks a user into clicking on a hidden frame. An internal frame, known as an iframe, is commonly used to embed one website within another. For example, if you wanted to include a Google Map or YouTube video on your site you would include the iframe tag that puts that site within your own. This is very convenient.

But it has a security risk which is that a frame can be hidden from a user. Consider if a user is already logged into their Amazon account and then visits a malicious site that purports to be a picture of kittens. The user clicks on said malicious site to see more kittens, but in fact they click an iFrame of an Amazon item that is unknowingly purchased. This is but one example of clickjacking.

To protect against this, Django comes with default clickjacking middleware that checks whether a resource can be loaded within a frame or iframe. You can turn this protection off if desired or even set it at a per-view level. As ever, do so with a degree of caution and research.
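For reference, the protection lives in the default middleware stack and is tuned through a single setting. A sketch follows; choosing 'DENY' over 'SAMEORIGIN' is an assumption about what a project like this would want:

```python
# config/settings.py (sketch)

MIDDLEWARE = [
    # ... other middleware ...
    # Enabled by default in projects created with startproject;
    # sets the X-Frame-Options header on every outgoing response
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
]

# 'DENY' blocks all framing of the site (the default since
# Django 3.0); 'SAMEORIGIN' would allow the site to frame its
# own pages, e.g. for internal dashboards
X_FRAME_OPTIONS = 'DENY'
```

Views that legitimately need to be embedded can opt out individually with the xframe_options_exempt decorator.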

Django Updates

Keeping your project up to date with the latest version of Django is another important way to stay secure. This means not only the latest feature release (2.2, 3.0, 3.1, etc.), which comes out roughly every nine months, but also the monthly security patch updates that take the form of 2.2.1, 2.2.2, 2.2.3, and so on.

How to update? Django features deprecation warnings that can and should be checked for each new release by running python -Wa manage.py test. It is far better to update from 2.0 to 2.1 to 2.2, running the deprecation warnings each time, than to skip multiple versions.

Deployment Checklist

To assist with deployment and checking security settings, the Django docs contain a dedicated deployment checklist that further describes security settings. Even better, there is a command that automates Django's recommendations, python manage.py check --deploy, which will check whether a project is deployment ready.

It uses the Django system check framework which can be used to customize similar commands in mature projects. Since we are working in Docker we must prepend docker-compose exec web to the command though.

$ docker-compose exec web python manage.py check --deploy
System check identified some issues:
System check identified 9 issues (0 silenced).

The list could go on and on, but this is meant to be a solid starting point for building professional Django projects that scale and handle increasing traffic without any decrease in performance, and that stand up to security breaches and threats. Below you'll find a handful of resources to help you dive deeper into the topics discussed here!

Where To Go Next?

The next step is to check out these resources to learn more about performance and security in Django; they will also help you become an advanced Django developer:


We hope you enjoyed reading this article and learned a few things about starting an advanced django project, now we would like to hear from you, if you need more help or have any questions feel free to reach out to us using the contact form or by email.
