Building a Production-Ready REST API with FastAPI
Because life's too short for Flask boilerplate and Django's kitchen sink approach
By the end of this post, you'll have a fully functional API with CRUD operations, automatic Swagger docs, a PostgreSQL database, and everything wrapped up in a Docker container.
What We're Building
We're creating an Employee Management API that can:
Create new employee records
Read employee data (all or by ID)
Update existing employees
Delete employees (sorry, downsizing happens)
And the best part? FastAPI will generate interactive API documentation automatically.
Setting Up the Project with uv
Assuming you already have uv installed on your machine, run the following commands to bootstrap a new project and add some dependencies:
# Create project directory
mkdir employee-api
cd employee-api
# Initialize a new Python project
uv init
uv add fastapi email-validator uvicorn sqlalchemy psycopg2-binary alembic python-dotenv
Quick breakdown of what we just installed:
fastapi: Our star performer, the web framework
email-validator: Enables Pydantic's EmailStr type for validating email addresses
uvicorn: ASGI (Asynchronous Server Gateway Interface) server to run our app
sqlalchemy: ORM for database operations
psycopg2-binary: PostgreSQL adapter
alembic: Database migration tool
python-dotenv: For managing environment variables
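Running `uv add` also records these dependencies in pyproject.toml. Roughly (the exact entries and version pins on your machine will differ), the relevant section ends up looking like this:

```toml
# pyproject.toml (excerpt, illustrative)
[project]
name = "employee-api"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
    "alembic",
    "email-validator",
    "fastapi",
    "psycopg2-binary",
    "python-dotenv",
    "sqlalchemy",
    "uvicorn",
]
```

This file is what the Dockerfile later installs from, so the container always matches your local dependency set.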
Project Structure
Let's organize our project like Marie Kondo would organize a closet:
employee-api/
├── app/
│ ├── __init__.py
│ ├── main.py
│ ├── models.py
│ ├── schemas.py
│ ├── database.py
│ └── crud.py
├── .env
├── docker-compose.yml
├── Dockerfile
├── pyproject.toml
└── README.md
We do this by running the following commands and creating files as required:
# create app dir
mkdir app
# create files
touch app/__init__.py app/main.py app/models.py app/schemas.py \
app/database.py app/crud.py
touch .env docker-compose.yml Dockerfile
Database Configuration
Let's set up our PostgreSQL connection. First, create a .env file and add the following configuration to it. The credentials for accessing PostgreSQL match the Docker container setup described in later sections:
# .env
DATABASE_URL=postgresql://employee_user:employee_pass@db:5432/employee_db
POSTGRES_USER=employee_user
POSTGRES_PASSWORD=employee_pass
POSTGRES_DB=employee_db
Never commit your .env file to Git, as it usually contains confidential data. Additionally, add it to .gitignore to avoid any unintentional commits.
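A minimal .gitignore for this project might look like the following (entries beyond .env are the usual Python suspects):

```
# .gitignore
.env
__pycache__/
*.py[cod]
.venv/
```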
Now, let's create our database configuration in app/database.py
# app/database.py
import os
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, declarative_base
from dotenv import load_dotenv
load_dotenv()
DATABASE_URL = os.getenv("DATABASE_URL")
# Create database engine
engine = create_engine(DATABASE_URL)
# Create session factory
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
# Base class for models
Base = declarative_base()
# Dependency to get database session
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
Define the Employee Model
Time to create our database model in app/models.py
# app/models.py
from sqlalchemy import Column, Integer, String, Float, DateTime, Boolean
from sqlalchemy.sql import func
from app.database import Base
class Employee(Base):
    __tablename__ = "employees"

    id = Column(Integer, primary_key=True, index=True)
    first_name = Column(String(50), nullable=False)
    last_name = Column(String(50), nullable=False)
    email = Column(String(100), unique=True, index=True, nullable=False)
    department = Column(String(50), nullable=False)
    position = Column(String(100), nullable=False)
    salary = Column(Float, nullable=False)
    is_active = Column(Boolean, default=True)
    created_at = Column(DateTime(timezone=True), server_default=func.now())
    updated_at = Column(DateTime(timezone=True), onupdate=func.now())
This model represents our employee table with all the essential fields. Notice the created_at and updated_at timestamps? These are like those "seen at" timestamps in messaging apps, useful for tracking changes.
Create Pydantic Schemas
Pydantic schemas are like bouncers at a club; they make sure only the right data gets in. Create app/schemas.py
# app/schemas.py
from pydantic import BaseModel, EmailStr, Field
from datetime import datetime
from typing import Optional
class EmployeeBase(BaseModel):
    first_name: str = Field(..., min_length=1, max_length=50)
    last_name: str = Field(..., min_length=1, max_length=50)
    email: EmailStr
    department: str = Field(..., min_length=1, max_length=50)
    position: str = Field(..., min_length=1, max_length=100)
    salary: float = Field(..., gt=0)
    is_active: bool = True

class EmployeeCreate(EmployeeBase):
    pass

class EmployeeUpdate(BaseModel):
    first_name: Optional[str] = Field(None, min_length=1, max_length=50)
    last_name: Optional[str] = Field(None, min_length=1, max_length=50)
    email: Optional[EmailStr] = None
    department: Optional[str] = Field(None, min_length=1, max_length=50)
    position: Optional[str] = Field(None, min_length=1, max_length=100)
    salary: Optional[float] = Field(None, gt=0)
    is_active: Optional[bool] = None

class EmployeeResponse(EmployeeBase):
    id: int
    created_at: datetime
    updated_at: Optional[datetime] = None

    class Config:
        from_attributes = True
What's happening here:
EmployeeBase: Shared fields for all schemas
EmployeeCreate: For creating new employees
EmployeeUpdate: For partial updates (all fields optional)
EmployeeResponse: What we send back to clients
CRUD Operations
Now for the fun part, let's create our CRUD operations in app/crud.py
# app/crud.py
from sqlalchemy.orm import Session
from sqlalchemy.exc import IntegrityError
from app import models, schemas
from fastapi import HTTPException, status
from typing import List, Optional
def create_employee(db: Session, employee: schemas.EmployeeCreate) -> models.Employee:
    """Create a new employee"""
    db_employee = models.Employee(**employee.model_dump())
    try:
        db.add(db_employee)
        db.commit()
        db.refresh(db_employee)
        return db_employee
    except IntegrityError:
        db.rollback()
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail="Employee with this email already exists"
        )

def get_employee(db: Session, employee_id: int) -> models.Employee:
    """Get employee by ID"""
    employee = db.query(models.Employee).filter(models.Employee.id == employee_id).first()
    if not employee:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Employee with ID {employee_id} not found"
        )
    return employee

def get_employees(
    db: Session,
    skip: int = 0,
    limit: int = 100,
    department: Optional[str] = None,
    is_active: Optional[bool] = None
) -> List[models.Employee]:
    """Get all employees with optional filtering"""
    query = db.query(models.Employee)
    if department:
        query = query.filter(models.Employee.department == department)
    if is_active is not None:
        query = query.filter(models.Employee.is_active == is_active)
    return query.offset(skip).limit(limit).all()

def update_employee(
    db: Session,
    employee_id: int,
    employee_update: schemas.EmployeeUpdate
) -> models.Employee:
    """Update an employee"""
    db_employee = get_employee(db, employee_id)
    update_data = employee_update.model_dump(exclude_unset=True)
    for field, value in update_data.items():
        setattr(db_employee, field, value)
    try:
        db.commit()
        db.refresh(db_employee)
        return db_employee
    except IntegrityError:
        db.rollback()
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail="Email already exists"
        )

def delete_employee(db: Session, employee_id: int) -> dict:
    """Delete an employee (soft delete by setting is_active to False)"""
    db_employee = get_employee(db, employee_id)
    db_employee.is_active = False
    db.commit()
    return {"message": f"Employee {employee_id} has been deactivated"}

def hard_delete_employee(db: Session, employee_id: int) -> dict:
    """Permanently delete an employee"""
    db_employee = get_employee(db, employee_id)
    db.delete(db_employee)
    db.commit()
    return {"message": f"Employee {employee_id} has been permanently deleted"}
Notice we have both soft delete (deactivation) and hard delete? It's like the difference between unfollowing someone and blocking them. Sometimes you want to keep the data for auditing purposes.
FastAPI Application
Now let's tie it all together in app/main.py
# app/main.py
from fastapi import FastAPI, Depends, Query
from sqlalchemy.orm import Session
from typing import List, Optional
from app import crud, models, schemas
from app.database import engine, get_db
# Create database tables
models.Base.metadata.create_all(bind=engine)
# Initialize FastAPI app
app = FastAPI(
    title="Employee Management API",
    description="API for managing employee data with CRUD operations",
    version="1.0.0",
    docs_url="/docs",   # Swagger UI
    redoc_url="/redoc"  # ReDoc
)

@app.get("/", tags=["Root"])
def read_root():
    """Welcome endpoint"""
    return {
        "message": "Welcome to Employee Management API",
        "docs": "/docs",
        "redoc": "/redoc"
    }

@app.post("/employees/", response_model=schemas.EmployeeResponse, status_code=201, tags=["Employees"])
def create_employee(
    employee: schemas.EmployeeCreate,
    db: Session = Depends(get_db)
):
    """
    Create a new employee with the following information:
    - **first_name**: Employee's first name
    - **last_name**: Employee's last name
    - **email**: Unique email address
    - **department**: Department name
    - **position**: Job position
    - **salary**: Annual salary (must be positive)
    """
    return crud.create_employee(db=db, employee=employee)

@app.get("/employees/", response_model=List[schemas.EmployeeResponse], tags=["Employees"])
def read_employees(
    skip: int = Query(0, ge=0, description="Number of records to skip"),
    limit: int = Query(100, ge=1, le=100, description="Maximum number of records to return"),
    department: Optional[str] = Query(None, description="Filter by department"),
    is_active: Optional[bool] = Query(None, description="Filter by active status"),
    db: Session = Depends(get_db)
):
    """
    Retrieve all employees with optional filtering:
    - Pagination support via skip and limit
    - Filter by department
    - Filter by active status
    """
    employees = crud.get_employees(
        db, skip=skip, limit=limit,
        department=department, is_active=is_active
    )
    return employees

@app.get("/employees/{employee_id}", response_model=schemas.EmployeeResponse, tags=["Employees"])
def read_employee(employee_id: int, db: Session = Depends(get_db)):
    """Get a specific employee by ID"""
    return crud.get_employee(db=db, employee_id=employee_id)

@app.put("/employees/{employee_id}", response_model=schemas.EmployeeResponse, tags=["Employees"])
def update_employee(
    employee_id: int,
    employee: schemas.EmployeeUpdate,
    db: Session = Depends(get_db)
):
    """
    Update an employee's information.
    Only provided fields will be updated (partial update supported).
    """
    return crud.update_employee(db=db, employee_id=employee_id, employee_update=employee)

@app.delete("/employees/{employee_id}", tags=["Employees"])
def delete_employee(employee_id: int, db: Session = Depends(get_db)):
    """Soft delete an employee (sets is_active to False)"""
    return crud.delete_employee(db=db, employee_id=employee_id)

@app.delete("/employees/{employee_id}/permanent", tags=["Employees"])
def hard_delete_employee(employee_id: int, db: Session = Depends(get_db)):
    """Permanently delete an employee from the database"""
    return crud.hard_delete_employee(db=db, employee_id=employee_id)

@app.get("/health", tags=["Health"])
def health_check():
    """Health check endpoint for monitoring"""
    return {"status": "healthy"}
Swagger Documentation Magic: Notice those docstrings and the tags parameter? FastAPI automatically converts these into API documentation and an interactive playground.
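Every endpoint above receives its session through Depends(get_db). Because get_db is a generator, FastAPI runs the code after yield once the response is finished, even when the handler raises, so the finally block always closes the session. Here's a stdlib-only sketch of that lifecycle, with FakeSession standing in for SessionLocal():

```python
class FakeSession:
    """Stand-in for a SQLAlchemy session."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

def get_db():
    db = FakeSession()
    try:
        yield db
    finally:
        db.close()  # always runs, even if the request handler raised

# Roughly what FastAPI does: open the dependency, run the handler,
# then finalize the generator once the response is done.
gen = get_db()
db = next(gen)
try:
    raise RuntimeError("handler blew up")  # simulate a failing endpoint
except RuntimeError:
    pass
gen.close()  # raises GeneratorExit inside get_db, triggering its finally

print(db.closed)  # True
```

This is why you never have to close sessions in the endpoint functions themselves.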
Docker Configuration
Let's containerize the application. First, create docker-compose.yml
# docker-compose.yml
version: '3.8'

services:
  db:
    image: postgres:15-alpine
    container_name: employee_db
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
      interval: 10s
      timeout: 5s
      retries: 5

  web:
    build: .
    container_name: employee_api
    command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    environment:
      DATABASE_URL: ${DATABASE_URL}
    depends_on:
      db:
        condition: service_healthy
    restart: unless-stopped

volumes:
  postgres_data:
Now let's create Dockerfile
# Dockerfile
FROM python:3.11-slim
# Set working directory
WORKDIR /app
# Install system dependencies
RUN apt-get update && apt-get install -y \
gcc \
postgresql-client \
&& rm -rf /var/lib/apt/lists/*
# Copy uv installer and install uv
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
# Copy dependency files
COPY pyproject.toml .
# Install dependencies using uv
RUN uv pip install --system -r pyproject.toml
# Copy application code
COPY . .
# Expose port
EXPOSE 8000
# Run the application
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
Running the Application
# Build and start the containers
docker-compose up --build
# Or run in detached mode
docker-compose up -d --build
Once everything is running, visit:
Swagger UI: http://localhost:8000/docs
ReDoc: http://localhost:8000/redoc
API Root: http://localhost:8000
Testing the API
Let's take our API for a test drive! Here are some example requests using Python
# test_api.py
import requests
BASE_URL = "http://localhost:8000"
# Create an employee
new_employee = {
    "first_name": "John",
    "last_name": "Doe",
    "email": "john.doe@company.com",
    "department": "Engineering",
    "position": "Senior Developer",
    "salary": 95000.00,
    "is_active": True
}
response = requests.post(f"{BASE_URL}/employees/", json=new_employee)
print(f"Created Employee: {response.json()}")
# Get all employees
response = requests.get(f"{BASE_URL}/employees/")
print(f"All Employees: {response.json()}")
# Get specific employee
employee_id = 1
response = requests.get(f"{BASE_URL}/employees/{employee_id}")
print(f"Employee {employee_id}: {response.json()}")
# Update employee
update_data = {"salary": 105000.00, "position": "Lead Developer"}
response = requests.put(f"{BASE_URL}/employees/{employee_id}", json=update_data)
print(f"Updated Employee: {response.json()}")
# Filter by department
response = requests.get(f"{BASE_URL}/employees/?department=Engineering")
print(f"Engineering Employees: {response.json()}")
# Soft delete employee
response = requests.delete(f"{BASE_URL}/employees/{employee_id}")
print(f"Delete Response: {response.json()}")
You can also use the interactive Swagger playground at /docs to test all endpoints without writing any code. Just click, fill, and submit.
What's Next?
Congratulations! You've built a production-ready REST API. Here are some ideas to level up:
Add Authentication: Implement JWT tokens for securing your endpoints
Database Migrations: Use Alembic for managing schema changes
Testing: Add pytest for unit and integration tests
Logging: Implement structured logging for better debugging
Monitoring: Add Prometheus metrics and health checks
API Versioning: Prepare for future changes with /v1/employees/
Rate Limiting: Prevent abuse with request throttling
CORS Configuration: Enable front-end applications to consume your API
A complete working example of this post is available on GitHub.

