A production-ready Python configuration management system built with Pydantic that demonstrates best practices for handling application configuration from multiple sources with type validation and environment-specific overrides.
Traditional, ad-hoc configuration management often leads to untyped values, runtime surprises, and settings scattered across environment variables, files, and code. This template solves those problems by providing:
- Type Safety: Catch configuration errors at startup, not at runtime, using the Pydantic library
- Multiple Sources: Environment variables, .env files, secrets, and YAML
- Environment Flexibility: Easy overrides for development, staging, and production
- Nested Configuration: Organize complex settings hierarchically
- Security: Built-in support for secrets and sensitive data masking (using types such as SecretStr), so secret values are automatically masked when logged (see the sketch below this list)
- Developer Experience: Pythonic access patterns for config values that work well with linters and type checkers such as mypy
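As a quick illustration of the type-safety and masking points above, here is a small self-contained sketch (assuming pydantic v2 with pydantic-settings; the `DemoConfig` class and its fields are invented for this example and are not part of the template):

```python
# Minimal illustration of startup-time validation and secret masking.
# Assumes pydantic v2 + pydantic-settings; class and field names are made up.
import os

from pydantic import SecretStr, ValidationError
from pydantic_settings import BaseSettings, SettingsConfigDict


class DemoConfig(BaseSettings):
    model_config = SettingsConfigDict(env_prefix="DEMO_")

    PORT: int = 8080
    API_KEY: SecretStr = SecretStr("")


os.environ["DEMO_PORT"] = "not-a-number"
try:
    DemoConfig()  # fails here, at startup, not deep inside request handling
except ValidationError as exc:
    print(exc)  # names the offending field and explains why it failed

os.environ["DEMO_PORT"] = "9000"
os.environ["DEMO_API_KEY"] = "super-secret"
config = DemoConfig()
print(config.PORT)     # 9000, already an int
print(config.API_KEY)  # prints '**********' - the raw value never leaks into logs
```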
Production Setup (using defaults):

```python
# Your production code works with sensible defaults for env variables
# (declared on the Pydantic config classes)
import boto3

from config import get_config

config = get_config()
print(config.AWS_CONFIG.AWS_PROFILE)  # "default"

# The AWS SDK then uses the profile configured on your production server.
# Note: profile_name is a Session argument, so build a session first:
session = boto3.Session(profile_name=config.AWS_CONFIG.AWS_PROFILE)
s3_client = session.client("s3")
```

Local Development (easy overrides):

```bash
# Override for local development without changing code
export AWS_CONFIG__AWS_PROFILE="dev-profile"
python src/main.py # Now uses "dev-profile"
# Or use a .env file for team consistency
echo 'AWS_CONFIG__AWS_PROFILE="dev-profile"' > .env
# Or create a config.yaml file for readability
#AWS_CONFIG:
#   AWS_PROFILE: "dev-profile"
```

Nested configuration with intelligent defaults (each nested element is separated by two underscores):

```python
# Production uses environment variables
# DB__HOST=prod-db.company.com
# DB__PORT=5432
config = get_config()
db_url = f"postgresql://{config.DB.HOST}:{config.DB.PORT}/app"
```

Local development override:

```bash
# .env file for local development
DB__HOST=localhost
DB__PORT=5433
```

Configuration values are loaded from the following sources, in priority order (highest first):

- Docker Secrets (`/run/secrets`) - Highest priority, for production secrets
- Environment Variables - Great for CI/CD and containerized deployments
- `.env` files - Perfect for local development and team consistency
- `config.yaml` - Optional structured configuration
- Default Values - Sensible defaults defined in the models
What does this mean? If a variable is defined in both `.env` and `config.yaml`, the value from the higher-priority source wins (`.env` in this case).
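The priority order comes from how the settings sources are combined when the model is loaded. Below is a rough sketch of one way to wire this up with pydantic-settings; the `Settings` class, file paths, and field shown here are assumptions for illustration, not necessarily what this template's code does:

```python
# Illustrative only: earlier sources in the returned tuple win.
from pydantic_settings import (
    BaseSettings,
    PydanticBaseSettingsSource,
    SettingsConfigDict,
    YamlConfigSettingsSource,  # YAML support requires pyyaml to be installed
)


class Settings(BaseSettings):
    model_config = SettingsConfigDict(
        env_nested_delimiter="__",
        env_file=".env",
        secrets_dir="/run/secrets",  # Docker secrets directory
        yaml_file="config.yaml",
    )

    SERVICE_NAME: str = "MyApp"  # default value: the lowest-priority "source"

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls,
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        # Docker secrets > environment variables > .env > config.yaml > defaults
        return (
            file_secret_settings,
            env_settings,
            dotenv_settings,
            YamlConfigSettingsSource(settings_cls),
        )
```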
Getting started:

- Install the uv package manager.

- Install dependencies:

  ```bash
  uv sync
  ```

- Run with defaults:

  ```bash
  python src/main.py  # or run `make dump_config_to_yaml`
  ```

  You'll see the default configuration values for all settings.

- Try environment-specific overrides:

  ```bash
  # Copy the example environment file
  cp .env.example .env

  # Run again to see the overridden values
  python src/main.py
  ```

- Test different override methods (a pytest sketch of the same idea follows this list):

  ```bash
  # Environment variable override
  export SERVICE_NAME="MyDevService"
  python src/main.py
  ```
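To exercise overrides from the test suite as well, something like the following pytest sketch could work. It assumes `get_config` is importable from `config` (as in the usage section below) and that it caches its result, e.g. via `functools.lru_cache`; adjust to the template's actual layout.

```python
# test_config_override.py - hedged sketch; adapt names to the real project layout
import pytest

from config import get_config  # assumed import path, matching the usage example


def test_service_name_env_override(monkeypatch: pytest.MonkeyPatch) -> None:
    monkeypatch.setenv("SERVICE_NAME", "MyDevService")
    # If get_config is lru_cache-wrapped, drop the cached instance so the
    # override is actually re-read from the environment.
    get_config.cache_clear()
    config = get_config()
    assert config.SERVICE_NAME == "MyDevService"
```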
Each logical configuration group inherits from Pydantic's BaseModel. For example, this template uses the following groups:

- `AWSConfig`: AWS services (S3, CloudWatch, profiles)
- `DatabaseConfig`: Database connection settings
- `LLMConfig`: Language model configurations
- `LookupFilesConfig`: File and directory paths with validation

These groups are composed into the `AppConfig` container (in `consolidated.py`), which can be consumed across the app via the `get_config` convenience function (sketched below).
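A rough sketch of what that composition might look like (the field names mirror the usage example below; the real `consolidated.py` may organise things differently):

```python
# consolidated.py - illustrative sketch, not the template's actual code
from functools import lru_cache

from pydantic import BaseModel
from pydantic_settings import BaseSettings, SettingsConfigDict


class AWSConfig(BaseModel):
    AWS_PROFILE: str = "default"


class DatabaseConfig(BaseModel):
    HOST: str = "localhost"
    PORT: int = 5432


class AppConfig(BaseSettings):
    model_config = SettingsConfigDict(env_nested_delimiter="__", env_file=".env")

    SERVICE_NAME: str = "MyApp"
    AWS_CONFIG: AWSConfig = AWSConfig()
    DB: DatabaseConfig = DatabaseConfig()


@lru_cache(maxsize=1)
def get_config() -> AppConfig:
    # Validate once at startup and reuse the same instance everywhere
    return AppConfig()
```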
```python
from config import get_config
config = get_config()
# Access nested values
aws_profile = config.AWS_CONFIG.AWS_PROFILE
bucket_name = config.AWS_CONFIG.S3_BUCKET_NAMES.BUCKET_A
db_host = config.DB.HOST
# Configuration is cached and validated once at startup
```

Use double underscores (`__`) for nested configuration:

```bash
# Top-level
SERVICE_NAME="MyApp"
# Nested: config.DB.HOST
DB__HOST="localhost"
# Double nested: config.AWS_CONFIG.S3_BUCKET_NAMES.BUCKET_A
AWS_CONFIG__S3_BUCKET_NAMES__BUCKET_A="my-dev-bucket"
```

- Run demo: `python src/main.py`
- Export config: `make dump_config_to_yaml`
- Run tests: `make test`