Build a Production REST API: From Zero to Deployed with FastAPI
Lab

Dockerize & Deploy TaskFlow

35 min
Intermediate
Unlimited free attempts

Instructions

Objective

Package the TaskFlow API for production deployment. You will create a multi-stage Dockerfile, a production Docker Compose configuration, a GitHub Actions CI/CD pipeline, and production-hardened application settings.

Part 1: Dockerfile (25 points)

Create a Dockerfile in the project root with a multi-stage build.

Requirements

Builder stage:

  • Use python:3.12-slim as the base image (pinned, not latest)
  • Install system dependencies needed to compile Python packages (build-essential, libpq-dev)
  • Copy requirements.txt first, then install dependencies with --no-cache-dir and --prefix=/install
  • This ordering lets Docker's layer cache work: the dependency layer is rebuilt only when requirements.txt changes, not on every code change

Production stage:

  • Use python:3.12-slim again as a clean base
  • Install only runtime system libraries (libpq5, curl)
  • Create a non-root user and group named taskflow
  • Copy installed Python packages from the builder stage using COPY --from=builder
  • Copy application code
  • Set ownership to the taskflow user and switch to it with USER taskflow
  • Add a HEALTHCHECK that curls the /health endpoint
  • Expose port 8000
  • Start the server with Gunicorn using Uvicorn workers as the container command (CMD): gunicorn app.main:app -w 4 -k uvicorn.workers.UvicornWorker --bind 0.0.0.0:8000

Example Structure

# Stage 1: Builder
FROM python:3.12-slim AS builder
WORKDIR /app
# Install build dependencies
# Copy and install Python dependencies

# Stage 2: Production
FROM python:3.12-slim AS production
# Install runtime dependencies only
# Create non-root user
# Copy from builder
# Copy application code
# Set user, healthcheck, expose, cmd
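
Filled in, the skeleton might look like the sketch below. This is one possible shape, not the only valid answer; the apt package names and the taskflow user follow the requirements above.

```dockerfile
# Stage 1: Builder -- compile and install Python dependencies in isolation
FROM python:3.12-slim AS builder
WORKDIR /app
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential libpq-dev \
    && rm -rf /var/lib/apt/lists/*
# Copy requirements.txt alone first so this layer is cached until it changes
COPY requirements.txt .
RUN pip install --no-cache-dir --prefix=/install -r requirements.txt

# Stage 2: Production -- clean base with runtime libraries only
FROM python:3.12-slim AS production
WORKDIR /app
RUN apt-get update && apt-get install -y --no-install-recommends \
        libpq5 curl \
    && rm -rf /var/lib/apt/lists/* \
    && groupadd -r taskflow && useradd -r -g taskflow taskflow
# Bring in the packages built in stage 1
COPY --from=builder /install /usr/local
COPY --chown=taskflow:taskflow . .
USER taskflow
HEALTHCHECK --interval=30s --timeout=5s --retries=3 \
    CMD curl -fsS http://localhost:8000/health || exit 1
EXPOSE 8000
CMD ["gunicorn", "app.main:app", "-w", "4", \
     "-k", "uvicorn.workers.UvicornWorker", "--bind", "0.0.0.0:8000"]
```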

Part 2: Docker Compose Production (25 points)

Create docker-compose.prod.yml with three services.

Requirements

API service:

  • Build from the Dockerfile, targeting the production stage
  • Map port 8000
  • Load environment from .env.production
  • Depend on db and redis with condition: service_healthy
  • Set resource limits: 1 CPU, 512MB memory
  • Set restart: unless-stopped
  • Connect to both external and internal networks

PostgreSQL service:

  • Use postgres:18 image
  • Persist data with a named volume pgdata mounted at /var/lib/postgresql/data
  • Configure database name, user, and password via environment variables
  • Add a healthcheck using pg_isready
  • Connect only to the internal network

Redis service:

  • Use redis:7-alpine image
  • Enable AOF persistence with --appendonly yes
  • Set max memory to 128MB with allkeys-lru eviction policy
  • Persist data with a named volume redisdata mounted at /data
  • Add a healthcheck using redis-cli ping
  • Connect only to the internal network

Networks:

  • external: default bridge network (API is reachable from outside)
  • internal: set internal: true so db and redis are isolated from the host

Volumes:

  • pgdata and redisdata as named volumes
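
Put together, the three services, two networks, and two volumes could look like this sketch. Service names, the database name/user, and the interpolated POSTGRES_PASSWORD are placeholders; recent Docker Compose (v2) honors deploy.resources.limits outside Swarm mode.

```yaml
services:
  api:
    build:
      context: .
      target: production
    ports:
      - "8000:8000"
    env_file:
      - .env.production
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_healthy
    deploy:
      resources:
        limits:
          cpus: "1.0"
          memory: 512M
    restart: unless-stopped
    networks:
      - external
      - internal

  db:
    image: postgres:18
    environment:
      POSTGRES_DB: taskflow
      POSTGRES_USER: taskflow
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U taskflow"]
      interval: 10s
      timeout: 5s
      retries: 5
    networks:
      - internal

  redis:
    image: redis:7-alpine
    command: redis-server --appendonly yes --maxmemory 128mb --maxmemory-policy allkeys-lru
    volumes:
      - redisdata:/data
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 10s
      timeout: 5s
      retries: 5
    networks:
      - internal

networks:
  external:
    driver: bridge
  internal:
    internal: true

volumes:
  pgdata:
  redisdata:
```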

Part 3: GitHub Actions CI/CD (25 points)

Create .github/workflows/ci.yml with a CI/CD pipeline.

Requirements

Trigger: on push to main branch and on pull_request to main

Test job:

  • Runs on ubuntu-latest
  • Spins up PostgreSQL 18 and Redis 7 as service containers with healthchecks
  • Steps:
    1. actions/checkout@v4
    2. actions/setup-python@v5 with Python 3.12
    3. Install dependencies from requirements.txt
    4. Run linter: ruff check .
    5. Run tests: pytest --cov=app tests/
  • Pass DATABASE_URL, REDIS_URL, and SECRET_KEY as environment variables to the test step

Build job:

  • Runs only on push to main (not on pull requests)
  • Depends on the test job passing (needs: test)
  • Steps:
    1. actions/checkout@v4
    2. Build Docker image tagged with the commit SHA
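
The requirements above could come together as in this sketch of .github/workflows/ci.yml. The test database name, credentials, and SECRET_KEY value are placeholders for CI only.

```yaml
name: CI

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:18
        env:
          POSTGRES_DB: taskflow_test
          POSTGRES_USER: taskflow
          POSTGRES_PASSWORD: test
        ports:
          - 5432:5432
        options: >-
          --health-cmd "pg_isready -U taskflow"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
      redis:
        image: redis:7
        ports:
          - 6379:6379
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt
      - run: ruff check .
      - run: pytest --cov=app tests/
        env:
          DATABASE_URL: postgresql://taskflow:test@localhost:5432/taskflow_test
          REDIS_URL: redis://localhost:6379/0
          SECRET_KEY: ci-only-secret

  build:
    runs-on: ubuntu-latest
    needs: test
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    steps:
      - uses: actions/checkout@v4
      - run: docker build -t taskflow:${{ github.sha }} .
```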

Part 4: Production Configuration (25 points)

Create or update the following application files.

4a. Gunicorn Configuration

Create gunicorn.conf.py in the project root:

import multiprocessing

workers = multiprocessing.cpu_count() * 2 + 1
worker_class = "uvicorn.workers.UvicornWorker"
bind = "0.0.0.0:8000"
accesslog = "-"
errorlog = "-"
loglevel = "info"
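
The worker formula above (two workers per CPU core, plus one) can be sanity-checked with a quick stdlib snippet, no Gunicorn required:

```python
import multiprocessing

# Same formula as gunicorn.conf.py: 2 workers per core, plus one.
workers = multiprocessing.cpu_count() * 2 + 1
print(f"{multiprocessing.cpu_count()} cores -> {workers} workers")
```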

4b. Security Headers Middleware

Create a middleware in app/middleware/security.py that adds these headers to every response, and register it in app/main.py with app.add_middleware(SecurityHeadersMiddleware):

  • X-Content-Type-Options: nosniff
  • X-Frame-Options: DENY
  • Strict-Transport-Security: max-age=31536000; includeSubDomains
  • X-XSS-Protection: 1; mode=block
  • Referrer-Policy: strict-origin-when-cross-origin

from starlette.middleware.base import BaseHTTPMiddleware
from starlette.requests import Request
from starlette.responses import Response

class SecurityHeadersMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request: Request, call_next) -> Response:
        response = await call_next(request)
        response.headers["X-Content-Type-Options"] = "nosniff"
        response.headers["X-Frame-Options"] = "DENY"
        response.headers["Strict-Transport-Security"] = (
            "max-age=31536000; includeSubDomains"
        )
        response.headers["X-XSS-Protection"] = "1; mode=block"
        response.headers["Referrer-Policy"] = "strict-origin-when-cross-origin"
        return response

4c. CORS Configuration

In app/main.py, configure CORS with explicit origins for production (never use the wildcard * as an allowed origin in production):

from fastapi.middleware.cors import CORSMiddleware

if settings.is_production:
    allowed_origins = ["https://taskflow.example.com"]
else:
    allowed_origins = ["http://localhost:3000"]

app.add_middleware(
    CORSMiddleware,
    allow_origins=allowed_origins,
    allow_credentials=True,
    allow_methods=["GET", "POST", "PUT", "DELETE"],
    allow_headers=["Authorization", "Content-Type"],
)

4d. Environment-Based Settings

Create app/config.py using Pydantic Settings to load and validate all configuration from environment variables:

from pydantic_settings import BaseSettings

class Settings(BaseSettings):
    database_url: str
    redis_url: str
    secret_key: str
    environment: str = "development"
    api_v1_prefix: str = "/api/v1"

    @property
    def is_production(self) -> bool:
        return self.environment == "production"

    model_config = {"env_file": ".env"}

settings = Settings()
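
The is_production switch is just a string comparison against the ENVIRONMENT variable. A stdlib-only stand-in (hypothetical, without pydantic) makes that behavior easy to test in isolation:

```python
import os

def is_production(environ=os.environ) -> bool:
    # Mirrors Settings.is_production: default to "development" when unset.
    return environ.get("ENVIRONMENT", "development") == "production"

print(is_production({"ENVIRONMENT": "production"}))  # True
print(is_production({}))  # False
```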

4e. API Versioning

All routers must be mounted under the /api/v1/ prefix:

from app.config import settings

app.include_router(tasks_router, prefix=settings.api_v1_prefix)
app.include_router(auth_router, prefix=settings.api_v1_prefix)

What to Submit

Your submission should contain 7 file sections in the editor below. Each section begins with a # FILE N: header.


Hints

  • Test your Dockerfile locally with docker build -t taskflow . before submitting
  • Run docker compose -f docker-compose.prod.yml up to verify all services start and healthchecks pass
  • The order of COPY instructions in the Dockerfile matters for caching -- put rarely-changing files first
  • For the CI workflow, use the services key under the job to spin up PostgreSQL and Redis

Grading Rubric

  • Dockerfile (25 points): multi-stage build with pinned python:3.12-slim base, proper COPY ordering for layer caching (requirements.txt before app code), non-root user created and activated, HEALTHCHECK instruction present, and Gunicorn with UvicornWorker as the CMD
  • Docker Compose (25 points): production file defines three services (api, db, redis) with PostgreSQL 18 and Redis 7, named volumes for persistence, healthchecks on all services, resource limits on the API, a restart policy, and network isolation (internal network for db/redis, external for the API)
  • GitHub Actions (25 points): workflow triggers on push to main and on pull requests, uses PostgreSQL and Redis service containers with healthchecks, runs checkout, Python 3.12 setup, dependency install, the ruff linter, and pytest with coverage, and includes a build job that depends on the test job and runs only on main
  • Production configuration (25 points): Gunicorn config with UvicornWorker, SecurityHeadersMiddleware adding all five security headers, CORS with explicit environment-based origins (no wildcard), Pydantic Settings with env_file loading and an is_production property, and all routers mounted under the /api/v1/ prefix
