Last updated: Aug 1, 2025, 02:00 PM UTC

🐳 Docker Development Setup Guide

Status: Complete Implementation Guide
Version: 1.0
Purpose: Step-by-step Docker containerization procedures for development environments
Applicable To: Any multi-service application development


Overview

This guide provides comprehensive Docker setup procedures for creating consistent development environments with production parity. It eliminates "works on my machine" problems and enables a roughly five-minute, single-command environment setup for every team member.

Key Benefits

  • 1-command startup: Complete environment in one command
  • Production parity: Identical containers from dev to production
  • Team consistency: Every developer has identical environment
  • Service isolation: Each component runs in optimized containers
  • Zero conflicts: No more dependency hell or version mismatches

Prerequisites Verification

Before beginning, verify your system meets Docker requirements:

# Check Docker version (requires 20.10 or later)
docker --version
# Should output: Docker version 20.10.x or higher

# Check Docker Compose version (requires 2.0 or later)
docker compose version        # Compose v2 plugin
docker-compose --version      # or the standalone binary
# Should output: Docker Compose version v2.x.x or higher

# Verify Docker is running
docker info
# Should show Docker system information

# Verify Git installation
git --version

Requirements:

  • Docker 20.10+ for optimal container performance and security
  • Docker Compose v2+ for improved service orchestration
  • Git 2.30+ for modern repository workflows
  • 8GB RAM minimum for full service stack
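The checks above can be scripted so new machines fail fast. A minimal sketch (assuming GNU sort with -V is available; the version_ge helper and the sed extraction are illustrative, not part of any Docker tooling):

```shell
#!/bin/sh
# version_ge A B - succeeds when version A >= version B
# (relies on sort -V for natural version ordering)
version_ge() {
  [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Extract the numeric version from "Docker version 20.10.x, build ..."
docker_version=$(docker --version 2>/dev/null | sed 's/^Docker version \([0-9.]*\).*/\1/')
if version_ge "$docker_version" "20.10"; then
  echo "Docker $docker_version OK"
else
  echo "Docker '$docker_version' missing or too old (need 20.10+)" >&2
fi
```

The same helper can be reused for the Compose and Git minimums by swapping the extraction command and required version.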

Docker Compose Configuration

Step 1: Create Core docker-compose.yml

Create the foundation Docker environment configuration:

# docker-compose.yml - Complete development environment
version: '3.8'

services:
  # Application Service
  app:
    build:
      context: .
      dockerfile: Dockerfile.dev
    ports:
      - "3000:3000"
    volumes:
      - .:/app
      - /app/node_modules
      - /app/.next
    environment:
      - NODE_ENV=development
      - DATABASE_URL=postgresql://postgres:postgres@database:5432/postgres
    depends_on:
      - database
      - redis
    networks:
      - app-network

  # PostgreSQL Database
  database:
    image: postgres:15-alpine
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=postgres
    volumes:
      - postgres-data:/var/lib/postgresql/data
      - ./migrations:/docker-entrypoint-initdb.d
    networks:
      - app-network

  # Redis Cache
  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis-data:/data
    networks:
      - app-network

  # Email Testing (MailDev)
  maildev:
    image: maildev/maildev:latest
    ports:
      - "1080:1080"  # Web interface
      - "1025:1025"  # SMTP server
    networks:
      - app-network

volumes:
  postgres-data:
  redis-data:

networks:
  app-network:
    driver: bridge
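Note that depends_on as written only waits for the database container to start, not for Postgres to accept connections. One way to tighten this (a sketch; adjust intervals to taste) is a healthcheck on the database plus a condition on the dependency:

```yaml
services:
  database:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      timeout: 3s
      retries: 5

  app:
    depends_on:
      database:
        condition: service_healthy
      redis:
        condition: service_started
```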

Step 2: Create Development Dockerfile

Create optimized Dockerfile for development:

# Dockerfile.dev - Development container
FROM node:20-alpine

# Set working directory
WORKDIR /app

# Copy dependency manifests first for better Docker layer caching
COPY package*.json ./

# Install all dependencies (the dev server needs devDependencies too)
RUN npm ci

# Copy source code
COPY . .

# Generate any build artifacts (adjust for your framework)
# RUN npx prisma generate  # For Prisma projects
# RUN npm run build        # For build-required projects

# Expose port
EXPOSE 3000

# Start development server with hot reload
CMD ["npm", "run", "dev"]
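To keep the build context small and avoid baking host artifacts into the image, add a .dockerignore next to the Dockerfile (a typical starting point; extend for your stack):

```
node_modules
.next
.git
*.log
.env
```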

Step 3: Environment Configuration

Create environment file for Docker secrets:

# .env - Docker environment variables (do not commit to git)
DATABASE_URL=postgresql://postgres:postgres@database:5432/postgres
REDIS_URL=redis://redis:6379
SMTP_HOST=maildev
SMTP_PORT=1025

# Add application-specific variables
JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters
API_KEY=your-api-key
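Note that docker-compose reads .env only for variable substitution inside docker-compose.yml itself; for these values to reach a container, reference the file explicitly from the service (a sketch):

```yaml
services:
  app:
    env_file:
      - .env
```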

πŸ—„οΈ Database Setup & Migrations

Step 1: Create Migration Directory Structure

# Create migrations directory
mkdir -p ./migrations

# Create initial schema file
touch ./migrations/001_initial_schema.sql

Step 2: Example Database Schema

-- migrations/001_initial_schema.sql
-- Application database schema

-- Users table
CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  email VARCHAR(255) UNIQUE NOT NULL,
  full_name VARCHAR(255),
  created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
  updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);

-- Add indexes for performance
CREATE INDEX idx_users_email ON users(email);

-- Enable any required extensions
-- CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

Step 3: Seed Data (Optional)

-- migrations/002_seed_data.sql
-- Insert test data for development

INSERT INTO users (email, full_name) VALUES
  ('admin@example.com', 'Admin User'),
  ('test@example.com', 'Test User')
ON CONFLICT (email) DO NOTHING;
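The updated_at column defined above will not advance on its own; in PostgreSQL the usual approach is a trigger. A sketch that could live in a follow-up migration (the file name 003_updated_at_trigger.sql and function name set_updated_at are hypothetical):

```sql
-- migrations/003_updated_at_trigger.sql
CREATE OR REPLACE FUNCTION set_updated_at() RETURNS trigger AS $$
BEGIN
  NEW.updated_at = NOW();
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER users_set_updated_at
  BEFORE UPDATE ON users
  FOR EACH ROW EXECUTE FUNCTION set_updated_at();
```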

Docker Development Workflow

Step 1: Package.json Scripts

Add Docker-specific npm scripts:

{
  "scripts": {
    "docker:dev": "docker-compose up -d",
    "docker:stop": "docker-compose down",
    "docker:restart": "docker-compose restart",
    "docker:logs": "docker-compose logs -f",
    "docker:clean": "docker-compose down -v --remove-orphans",
    
    "dev": "your-dev-command",
    "build": "your-build-command",
    "start": "your-start-command",
    
    "db:migrate": "docker-compose exec database psql -U postgres -d postgres -f /docker-entrypoint-initdb.d/001_initial_schema.sql",
    "db:seed": "docker-compose exec database psql -U postgres -d postgres -f /docker-entrypoint-initdb.d/002_seed_data.sql",
    "db:reset": "docker-compose rm -sf database && docker-compose up -d database",
    
    "setup": "npm run docker:dev && sleep 10 && npm run db:migrate && npm run db:seed",
    "dev:full": "npm run setup && npm run docker:logs"
  }
}
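The fixed sleep 10 in the setup script can race the database on slower machines. A small polling helper is more robust (a hypothetical script, e.g. scripts/wait-for-db.sh; the retry function is illustrative):

```shell
#!/bin/sh
# retry N CMD... - run CMD until it succeeds, at most N attempts, 1s apart
retry() {
  n=$1; shift
  i=0
  until "$@"; do
    i=$((i + 1))
    [ "$i" -ge "$n" ] && return 1
    sleep 1
  done
}

# Example usage (run once services are defined):
#   retry 30 docker-compose exec -T database pg_isready -U postgres
```

Wired into package.json, the setup script becomes something like "docker-compose up -d && sh scripts/wait-for-db.sh && npm run db:migrate".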

Step 2: Development Commands

# Complete environment setup (first time)
npm run setup           # Start all services + migrate + seed

# Daily development workflow
npm run docker:dev      # Start all Docker services
npm run docker:logs     # View all service logs
npm run docker:stop     # Stop all services

# Individual service management
docker-compose up app        # Start only main application
docker-compose up database   # Start only database
docker-compose logs app      # View app logs only

# Database operations
npm run db:migrate      # Run database migrations
npm run db:seed         # Seed with test data
npm run db:reset        # Reset database completely

# Clean slate (nuclear option)
npm run docker:clean    # Remove all containers and volumes

Service Health Monitoring

Step 1: Health Check Utility

Create service monitoring utility:

// docker-utils.ts - Docker service health monitoring
export class DockerDevTools {
  // Note: this HTTP probe only works for services that speak HTTP (app, maildev);
  // Postgres and Redis need protocol-level checks (pg_isready, redis-cli ping)
  static async checkServiceHealth(serviceName: string): Promise<boolean> {
    try {
      const response = await fetch(`http://localhost:${this.getServicePort(serviceName)}/health`)
      return response.ok
    } catch {
      return false
    }
  }

  private static getServicePort(service: string): number {
    const ports: Record<string, number> = {
      'app': 3000,
      'database': 5432,
      'redis': 6379,
      'maildev': 1080
    }
    return ports[service] ?? 3000
  }

  static async waitForServices(services: string[], timeout = 30000): Promise<boolean> {
    const start = Date.now()
    
    while (Date.now() - start < timeout) {
      const healthChecks = await Promise.all(
        services.map(service => this.checkServiceHealth(service))
      )
      
      if (healthChecks.every(healthy => healthy)) {
        return true
      }
      
      await new Promise(resolve => setTimeout(resolve, 1000))
    }
    
    return false
  }
}

Step 2: Service Verification

Verify all services are running correctly:

# Check all container status
docker-compose ps

# Test service endpoints
curl http://localhost:3000/health  # App health check
curl http://localhost:1080         # MailDev interface

# Check database connectivity
docker-compose exec database psql -U postgres -d postgres -c "SELECT version();"

# Check Redis connectivity
docker-compose exec redis redis-cli ping

Environment Verification Checklist

Your Docker environment is ready when you can:

  • Start entire stack with one command (npm run docker:dev)
  • Access all services through their respective ports
  • Make code changes with automatic container restart
  • Connect to database and create/read records
  • View service logs with docker-compose logs
  • Stop all services cleanly with npm run docker:stop
  • Reset environment completely with npm run docker:clean

Success Indicators

  1. Container Health: All services show "Up" in docker-compose ps
  2. Database Connectivity: Can connect and query database
  3. Application Access: Main application loads at configured port
  4. Hot Reload: Code changes trigger automatic restart
  5. Service Communication: Services can communicate internally
  6. Log Access: Can view logs for debugging

Troubleshooting Common Issues

Port Conflicts

# Check what's using a port
lsof -i :3000

# Stop conflicting services
sudo kill -9 $(lsof -t -i:3000)

Container Build Issues

# Force rebuild containers
docker-compose build --no-cache

# Remove all images and rebuild
docker-compose down --rmi all
docker-compose build

Volume Permission Issues

# Fix permissions on bind-mounted host directories (Linux/macOS)
# Named volumes are managed by Docker; this applies to host paths such as ./logs
sudo chown -R $(id -u):$(id -g) ./logs/

# Windows: Run Docker Desktop as administrator

Database Connection Issues

# Check database logs
docker-compose logs database

# Reset database completely (removes the data volume)
docker-compose rm -sf database
docker volume rm $(docker volume ls -q | grep postgres)
docker-compose up -d database

Advanced Configuration

Multi-Environment Support

# docker-compose.override.yml - Local overrides
version: '3.8'
services:
  app:
    environment:
      - DEBUG=true
      - LOG_LEVEL=debug
    volumes:
      - ./logs:/app/logs

Production-Like Configuration

# docker-compose.prod.yml - Production simulation
version: '3.8'
services:
  app:
    build:
      dockerfile: Dockerfile.prod
    environment:
      - NODE_ENV=production
    restart: unless-stopped

This guide provides a complete Docker development setup that ensures consistency across all team members and environments. Customize the configuration files based on your specific technology stack and requirements.