<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Mahesh</title>
    <description>The latest articles on Forem by Mahesh (@user-mahesh).</description>
    <link>https://forem.com/user-mahesh</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1486891%2F8fda1181-57bb-48cd-8e2c-94b9b4090b1c.jpg</url>
      <title>Forem: Mahesh</title>
      <link>https://forem.com/user-mahesh</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/user-mahesh"/>
    <language>en</language>
    <item>
      <title>Deploy your Django app with Postgres DB to production using Docker and Nginx</title>
      <dc:creator>Mahesh</dc:creator>
      <pubDate>Mon, 21 Apr 2025 16:13:43 +0000</pubDate>
      <link>https://forem.com/user-mahesh/deploy-your-django-app-with-postgres-db-to-production-using-docker-and-nginx-1g88</link>
      <guid>https://forem.com/user-mahesh/deploy-your-django-app-with-postgres-db-to-production-using-docker-and-nginx-1g88</guid>
      <description>&lt;p&gt;If you have came across this blog, Congrats!🥳, you are doing your best!&lt;br&gt;
Running a Django application on your local system is quite easy. It doesn't matter if you are a newbie or an skilled professional, everyone sucks in deployment.&lt;/p&gt;

&lt;p&gt;I will try to cover most of what you need to deploy your Django app to production efficiently. I'll show you how to use Docker and Nginx to create a secure, scalable deployment setup for your Django project, and later in the blog you can find why I used this process.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why Deploy Django with Docker and Nginx?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Docker simplifies development and deployment by containerising your application, ensuring consistency across environments.&lt;/li&gt;
&lt;li&gt;Nginx is a high-performance web server that serves static files and acts as a reverse proxy for your Django app, optimising performance and security.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Your project structure should look like this currently&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.venv/
my_awesome_project/
├── app/
    ├── settings.py
    ├── wsgi.py
├── module1/
    ├── ....
├── module2/
    ├── ....
├── manage.py
├── requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To dockerize your project, you need to add two files to your root directory.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Dockerfile&lt;/li&gt;
&lt;li&gt;docker-compose.yaml&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Note: You should have &lt;strong&gt;Docker Desktop&lt;/strong&gt; installed to test the Docker containers.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Step 1&lt;/strong&gt; : Add Dockerfile
&lt;/h2&gt;

&lt;p&gt;Your structure should look like this now.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.venv/
my_awesome_project/
├── app/
    ├── settings.py
    ├── wsgi.py
├── module1/
    ├── ....
├── module2/
    ├── ....
├── manage.py
├── requirements.txt
├── Dockerfile
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To add some overview,&lt;br&gt;
A &lt;code&gt;Dockerfile&lt;/code&gt; is a script that defines how to build a Docker image for your application. Think of it as the blueprint for creating your app’s environment. With your Dockerfile, you can create as many containers as you want.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Below is the Dockerfile which I personally use&lt;br&gt;
&lt;/p&gt;


&lt;/blockquote&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM python:3.13-slim AS runtime

# set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
ENV DOCKERIZED=true
ENV MICRO_SERVICE=/app

# set work directory
RUN mkdir -p $MICRO_SERVICE
RUN mkdir -p $MICRO_SERVICE/staticfiles
RUN mkdir -p $MICRO_SERVICE/mediafiles

# where the code lives
WORKDIR $MICRO_SERVICE

RUN apt-get update &amp;amp;&amp;amp; apt-get install -y --no-install-recommends \
    build-essential \
    libpq-dev \
    libjpeg-dev \
    zlib1g-dev \
    curl \
    &amp;amp;&amp;amp; rm -rf /var/lib/apt/lists/*

# copy requirements.txt for installing python libraries
COPY requirements.txt $MICRO_SERVICE

#check for pip updates and install requirements.txt
RUN pip install --upgrade pip &amp;amp;&amp;amp; pip install --no-cache-dir -r requirements.txt

# copy project
COPY . $MICRO_SERVICE

# Collect static files (optional)
RUN python manage.py collectstatic --noinput
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This creates the required directories, installs system packages along with the Python packages from requirements.txt, copies the project into the &lt;code&gt;/app&lt;/code&gt; directory, and collects static files if any are present in the project.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Step 2&lt;/strong&gt; : Add docker-compose.yml file
&lt;/h2&gt;

&lt;p&gt;Your structure should look like this now.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.venv/
my_awesome_project/
├── app/
    ├── settings.py
    ├── wsgi.py
├── module1/
    ├── ....
├── module2/
    ├── ....
├── manage.py
├── requirements.txt
├── Dockerfile
├── docker-compose.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To add some overview,&lt;br&gt;
A &lt;code&gt;docker-compose.yml&lt;/code&gt; defines and manages multi-container applications. It simplifies running complex apps with just one command.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Below is the &lt;code&gt;docker-compose.yml&lt;/code&gt; which I personally use for local testing.&lt;br&gt;
&lt;/p&gt;


&lt;/blockquote&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;services:
  db:
    image: postgres:15-alpine
    container_name: postgres_db
    volumes:
      - db_data:/var/lib/postgresql/data/
    env_file:
      - .env
    ports:
      - 5432:5432
    restart: always
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U $$POSTGRES_USER -d $$POSTGRES_DB"]  # $$ so the vars are read inside the container, not interpolated by compose
      interval: 5s
      timeout: 5s
      retries: 5

  backend:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: django_app
    command: gunicorn --config configs/gunicorn_cfg.py app.wsgi:application
    volumes:
      - .:/app:cached
      - static_volume:/app/staticfiles
      - media_volume:/app/mediafiles
    working_dir: /app
    expose:
      - 8000
    env_file:
      - .env
    restart: always
    healthcheck:
      test: ["CMD-SHELL", "python manage.py check --database default --deploy --fail-level CRITICAL"]
      interval: 10s
      timeout: 5s
      retries: 3

  nginx:
    image: nginx:stable-alpine
    container_name: nginx_server
    restart: always
    ports:
      - "80:80"
      - "443:443" # If you plan to use HTTPS
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
      - static_volume:/app/staticfiles:ro
      - media_volume:/app/mediafiles:ro
    depends_on:
      - backend

volumes:
  db_data:
  static_volume:
  media_volume:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This compose file will create three services: &lt;em&gt;django_app, postgres_db, nginx_server&lt;/em&gt;.&lt;br&gt;
Nothing else is needed at this point.&lt;/p&gt;
&lt;h2&gt;
  
  
  &lt;strong&gt;Step 3&lt;/strong&gt; : Adding .env file
&lt;/h2&gt;

&lt;p&gt;Before testing our containers, we need to create a &lt;code&gt;.env&lt;/code&gt; file in the project folder.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Below settings are generally stored in the &lt;code&gt;.env&lt;/code&gt; file.&lt;br&gt;
&lt;/p&gt;


&lt;/blockquote&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;DEBUG=True
SECRET_KEY='1234567890'
ACCESS_SECRET_KEY='0987654321'
ALLOWED_HOSTS=localhost,127.0.0.1
DB_ENGINE=django.db.backends.postgresql
POSTGRES_DB=app_db
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_HOST=db
POSTGRES_PORT=5432
DOCKER_RUNNING=true
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Please note that &lt;strong&gt;POSTGRES_HOST&lt;/strong&gt; uses the &lt;em&gt;service name&lt;/em&gt; from the &lt;code&gt;docker-compose.yml&lt;/code&gt; file, which is &lt;strong&gt;db&lt;/strong&gt;. This is required when you are creating your own db container. For a remote DB, this would instead be the host URL.&lt;/p&gt;
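&lt;p&gt;To connect the pieces, here is a minimal sketch of how &lt;code&gt;settings.py&lt;/code&gt; might read these variables. This is a hypothetical snippet, not from the original project; it assumes the &lt;code&gt;.env&lt;/code&gt; values reach the process environment via the &lt;code&gt;env_file&lt;/code&gt; entry in compose, and that the keys match the &lt;code&gt;.env&lt;/code&gt; above:&lt;/p&gt;

```python
import os

# Hypothetical snippet for app/settings.py: read DB settings from the
# environment (populated by the env_file entry in docker-compose.yml).
DATABASES = {
    "default": {
        "ENGINE": os.environ.get("DB_ENGINE", "django.db.backends.postgresql"),
        "NAME": os.environ.get("POSTGRES_DB", "app_db"),
        "USER": os.environ.get("POSTGRES_USER", "postgres"),
        "PASSWORD": os.environ.get("POSTGRES_PASSWORD", "postgres"),
        # "db" is the compose service name; outside Docker, POSTGRES_HOST
        # can point at localhost or a remote host instead.
        "HOST": os.environ.get("POSTGRES_HOST", "db"),
        "PORT": os.environ.get("POSTGRES_PORT", "5432"),
    }
}

# comma-separated list in .env becomes a Python list
ALLOWED_HOSTS = os.environ.get("ALLOWED_HOSTS", "localhost").split(",")
```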

&lt;blockquote&gt;
&lt;p&gt;When deploying to production, you would not need the postgres_db as you will be using a centralised DB. The DB credentials will be saved in the &lt;code&gt;.env&lt;/code&gt; file.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Your structure should look like this now.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.venv/
my_awesome_project/
├── app/
    ├── settings.py
    ├── wsgi.py
├── ....
├── manage.py
├── requirements.txt
├── Dockerfile
├── docker-compose.yml
├── .env
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  &lt;strong&gt;Step 4&lt;/strong&gt; : Adding gunicorn config
&lt;/h2&gt;

&lt;p&gt;To run our application efficiently in production, we need to use Gunicorn. I have already added the command in the &lt;code&gt;docker-compose&lt;/code&gt; file. Let's add the config for it.&lt;br&gt;
Your structure should look like this now.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;...
my_awesome_project/
├── app/
    ├── settings.py
    ...
├── ....
├── configs/
    ├── gunicorn_cfg.py
├── manage.py
├── requirements.txt
├── Dockerfile
├── docker-compose.yml
├── .env
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And inside &lt;code&gt;gunicorn_cfg.py&lt;/code&gt;, put the config below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import multiprocessing

# Django WSGI application path in pattern MODULE_NAME:VARIABLE_NAME
wsgi_app = "app.wsgi:application"

# The granularity of Error log outputs
loglevel = "info"
backlog = 2048

# The number of worker processes for handling requests
workers = multiprocessing.cpu_count() * 2 + 1
worker_class = "gthread"
threads = 2

# The socket to bind
bind = "0.0.0.0:8000"
timeout = 60

# Restart workers when code changes (development only!)
reload = True

keepalive = 5  # The number of seconds to wait for requests on a Keep-Alive connection
errorlog = "-"  # '-' logs to stderr
accesslog = "-"  # '-' logs to stdout

# Daemonize the Gunicorn process (detach &amp;amp; enter background)
daemon = False
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
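&lt;p&gt;As a quick sanity check on the workers formula in the config above, &lt;code&gt;cpu_count() * 2 + 1&lt;/code&gt; is Gunicorn's commonly cited rule of thumb; for example, a 4-core machine gets 9 workers:&lt;/p&gt;

```python
import multiprocessing

def recommended_workers(cpu_count: int) -> int:
    # Gunicorn's commonly cited rule of thumb: (2 x cores) + 1
    return cpu_count * 2 + 1

print(recommended_workers(4))  # a 4-core box gets 9 workers
print(recommended_workers(multiprocessing.cpu_count()))  # the value used in the config
```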



&lt;h2&gt;
  
  
  &lt;strong&gt;Step 5&lt;/strong&gt; : Adding nginx to the docker
&lt;/h2&gt;

&lt;p&gt;There are many benefits to running Nginx as a container rather than installing it on the machine. The main one: you don't have to worry about configs going missing in action or drifting between environments.&lt;br&gt;
Update your folder structure to look like the below&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;...
my_awesome_project/
├── app/
    ├── settings.py
    ...
├── ....
├── configs/
    ├── gunicorn_cfg.py
├── nginx/
    ├── nginx.conf
├── manage.py
├── requirements.txt
├── Dockerfile
├── docker-compose.yml
├── .env
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now in the &lt;code&gt;nginx.conf&lt;/code&gt; add the following lines and &lt;strong&gt;DO NOT MISS ANYTHING&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;events {}
http {
    include /etc/nginx/modules-enabled/*.conf;
    include /etc/nginx/mime.types;

    upstream backend_api {
        server backend:8000;
    }

    server {
        listen 80;
        server_name localhost;

        location /favicon.ico {
            access_log off;
            log_not_found off;
        }

        error_log /var/log/nginx/error.log;
        access_log /var/log/nginx/access.log;

        location /static/ {
            autoindex on;
            alias /app/staticfiles/;
        }

        location /media/ {
            autoindex on;
            alias /app/mediafiles/;
        }

        location / {
            proxy_pass http://backend_api;
            proxy_set_header  Host              $http_host;   # required for docker client's sake
            proxy_set_header  X-Real-IP         $remote_addr; # pass on real client's IP
            proxy_set_header  X-Forwarded-For   $proxy_add_x_forwarded_for;
            proxy_set_header  X-Forwarded-Proto $scheme;
            proxy_read_timeout                  900;
            proxy_redirect off;
            client_max_body_size 10M;
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Make sure that you change your upstream server (it should match the compose service name, i.e. &lt;code&gt;backend&lt;/code&gt; in this case). The static URL should alias to the &lt;code&gt;STATIC_ROOT&lt;/code&gt; set in your settings.py file.&lt;/p&gt;
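&lt;p&gt;For the &lt;code&gt;alias&lt;/code&gt; paths above to line up, the static and media settings in &lt;code&gt;settings.py&lt;/code&gt; should point at the same directories the containers share. A sketch, assuming the standard Django &lt;code&gt;BASE_DIR&lt;/code&gt; which resolves to &lt;code&gt;/app&lt;/code&gt; inside the container:&lt;/p&gt;

```python
from pathlib import Path

# Hypothetical snippet for app/settings.py. Inside the container, BASE_DIR
# resolves to /app, so these paths match the volumes mounted into Nginx.
BASE_DIR = Path(__file__).resolve().parent.parent

STATIC_URL = "/static/"
STATIC_ROOT = BASE_DIR / "staticfiles"   # where collectstatic writes

MEDIA_URL = "/media/"
MEDIA_ROOT = BASE_DIR / "mediafiles"     # user-uploaded files
```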

&lt;p&gt;With this, your application is now dockerized.&lt;br&gt;
Open Docker Desktop. This starts the Docker daemon, which the docker commands interact with.&lt;br&gt;
To build the containers, run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker compose -f docker-compose.yml build
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To run the application in detached mode, use the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker compose -f docker-compose.yml up -d
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will create 3 containers in your docker environment with the above configuration.&lt;br&gt;
You will see something like the below image:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwu6pa6xlf58deb3a6d9n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwu6pa6xlf58deb3a6d9n.png" alt="Image description" width="800" height="392"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can access your application on: &lt;code&gt;localhost:80&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Congratulations! Your application is now system independent.&lt;br&gt;
No need to worry about virtual environments or system updates.&lt;/p&gt;

&lt;p&gt;For small and medium applications, this setup works wonders. For large and complex applications, you might need additional virtual machines for your application, such as EC2 instances; in those cases, it is preferable to use Docker Swarm or Kubernetes.&lt;/p&gt;

&lt;p&gt;If you are building a college project or an MVP for your startup, this is the best and fastest way to deploy your application.&lt;br&gt;
It takes care of your static files, your media files, your URL safety, everything.&lt;br&gt;
With this configuration, your application should be able to handle on the order of &lt;strong&gt;1 million&lt;/strong&gt; requests per day.&lt;/p&gt;
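&lt;p&gt;That figure sounds bigger than it is; a quick back-of-the-envelope calculation shows the average rate is modest:&lt;/p&gt;

```python
# Back-of-the-envelope: 1 million requests/day is a modest average load.
requests_per_day = 1_000_000
seconds_per_day = 24 * 60 * 60  # 86,400

avg_rps = requests_per_day / seconds_per_day
print(round(avg_rps, 1))  # ~11.6 requests/second on average
```

&lt;p&gt;Real traffic is bursty, so peak load will be several times that average, but it is still well within reach of one Gunicorn + Nginx box.&lt;/p&gt;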



&lt;p&gt;&lt;strong&gt;Bonus Content&lt;/strong&gt;&lt;br&gt;
We deployed the application using the WSGI entry point of the Django project. WSGI does not support WebSockets or other real-time tasks; for those, you need ASGI.&lt;br&gt;
You can read more about it on the internet. The baseline is:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;WSGI is synchronous and handles requests sequentially, while ASGI is asynchronous and allows for concurrent handling of multiple requests.&lt;/p&gt;
&lt;/blockquote&gt;
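&lt;p&gt;The difference is easy to see with plain &lt;code&gt;asyncio&lt;/code&gt;: ten simulated I/O-bound "requests" of 0.1s each finish in roughly 0.1s when awaited concurrently, instead of about 1s sequentially. A toy illustration, not Django code:&lt;/p&gt;

```python
import asyncio
import time

async def handle_request(i: int) -> int:
    # simulate an I/O-bound step, e.g. a database query taking 0.1s
    await asyncio.sleep(0.1)
    return i

async def main():
    start = time.perf_counter()
    # an ASGI server can interleave many requests like this on one worker
    results = await asyncio.gather(*(handle_request(i) for i in range(10)))
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())
print(len(results), round(elapsed, 1))  # 10 requests in ~0.1s, not ~1.0s
```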

&lt;p&gt;Now, to incorporate it in our project we are going to install &lt;code&gt;uvicorn&lt;/code&gt;, one of the fastest ASGI servers in the Python ecosystem. We will use its worker with &lt;code&gt;gunicorn&lt;/code&gt; to make our application asynchronous.&lt;br&gt;
Install the packages using the command below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install uvicorn uvicorn-worker
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can change the following lines in &lt;code&gt;gunicorn_cfg.py&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# wsgi_app = "backend.wsgi:application"
...
# worker_class = "gthread"
worker_class = "uvicorn.workers.UvicornWorker"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and in the docker-compose file we need to change the command to use the ASGI entry point of our project&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;...
  backend:
    build:
      context: .
      dockerfile: Dockerfile
    command: gunicorn -c configs/gunicorn_cfg.py app.asgi:application
    volumes:
      - .:/app:cached
      - static_volume:/app/staticfiles
      - media_volume:/app/mediafiles
...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Build and run the containers once again. Your application should now start in async mode.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Why an Nginx container?&lt;/strong&gt;&lt;br&gt;
While it's true you can install and configure Nginx directly on your production machine, using a separate Nginx container within your Docker Compose setup offers several compelling advantages, especially in modern deployment workflows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Isolation and Consistency&lt;/strong&gt;:
The Nginx container runs in its own isolated environment, with its own dependencies and configuration. You ensure a consistent Nginx setup regardless of the underlying operating system or existing software on the server. The Nginx container image guarantees that you're deploying the exact same Nginx version and configuration across different environments (development, staging, production). This significantly reduces the "it works on my machine" problem.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalability&lt;/strong&gt;:
Easier Scaling: If your web traffic increases, you can easily scale the number of Nginx containers horizontally (add more instances) without affecting your application or database directly. Docker orchestration tools like Docker Swarm or Kubernetes make this process much smoother.
Load Balancing: When you have multiple instances of your Django application (scaling the web service), Nginx, acting as a reverse proxy, can efficiently distribute incoming traffic across these instances, improving performance and resilience.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Portability&lt;/strong&gt;:
Environment Agnostic: Your Docker Compose setup can be easily moved and deployed to different cloud providers or on-premise environments that support Docker without needing to reconfigure Nginx specifically for each environment.&lt;/li&gt;
&lt;/ol&gt;




&lt;p&gt;&lt;strong&gt;When and why we need db container?&lt;/strong&gt;&lt;br&gt;
You should use a DB container when:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;You're developing locally and want a consistent, isolated database environment.&lt;/li&gt;
&lt;li&gt;You're deploying to staging/production and want the database to be part of your infrastructure stack.&lt;/li&gt;
&lt;li&gt;You need to ensure compatibility between your app and a specific version/config of a database like PostgreSQL or MySQL.&lt;/li&gt;
&lt;li&gt;You're testing features or migrations and want a throwaway DB you can spin up/tear down easily.&lt;/li&gt;
&lt;li&gt;You want to avoid polluting your host OS with multiple database installs or versions.&lt;/li&gt;
&lt;/ol&gt;




&lt;p&gt;&lt;strong&gt;Can we create containers with just a Dockerfile?&lt;/strong&gt;&lt;br&gt;
Technically, yes. Think of it this way: while you could manage each component (Django, Nginx, PostgreSQL) as a separately run container on your machine, Docker Compose provides a way to orchestrate these services as a cohesive application stack. No need to run multiple &lt;code&gt;docker run&lt;/code&gt; commands manually.&lt;/p&gt;

&lt;p&gt;With &lt;code&gt;docker compose up&lt;/code&gt;, you can spin up your entire application stack. It handles network creation, volume mounting, and service orchestration automatically.&lt;/p&gt;

&lt;p&gt;Other benefits include:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Built-in Networking&lt;/strong&gt;:
Services can communicate with each other using service names (e.g., web, db) instead of IP addresses—making it easy to link Django to PostgreSQL or Nginx.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Centralised Configuration&lt;/strong&gt;:
Your app, database, reverse proxy, and environment variables can all be configured in one place. Easier to read, debug, and modify.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalability &amp;amp; Testing&lt;/strong&gt;:
You can scale services using &lt;code&gt;docker compose up --scale&lt;/code&gt;, and test production-like setups locally—ideal for integration testing and CI pipelines.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Volume &amp;amp; Data Management&lt;/strong&gt;:
Compose makes it easy to define and persist volumes (e.g., for databases or static files), ensuring your data isn’t lost when containers restart.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Isolation&lt;/strong&gt;:
Each environment (dev, staging, prod) can have its own docker-compose file or override settings with .env files for secure and environment-specific configurations.&lt;/li&gt;
&lt;/ol&gt;




&lt;p&gt;I hope you like this article. Thank you!&lt;/p&gt;

</description>
      <category>docker</category>
      <category>python</category>
      <category>postgres</category>
      <category>nginx</category>
    </item>
  </channel>
</rss>
