159 changes: 159 additions & 0 deletions .devcontainer/.env.example
@@ -0,0 +1,159 @@
# Pydantic AI DevContainer Environment Variables
# Copy this file to .env and fill in your actual values
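#
# For example, from the repository root:
#   cp .devcontainer/.env.example .devcontainer/.env
#   # then edit .devcontainer/.env and set only the keys you need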

# ============================================================================
# MODEL PROVIDER API KEYS
# ============================================================================

# OpenAI (Required for: OpenAI models, OpenAI-compatible providers)
# Get your key at: https://platform.openai.com/api-keys
OPENAI_API_KEY=

# Anthropic (Required for: Claude models)
# Get your key at: https://console.anthropic.com/settings/keys
ANTHROPIC_API_KEY=

# Google Generative AI (Required for: Gemini models via Google AI Studio)
# Get your key at: https://aistudio.google.com/apikey
GEMINI_API_KEY=

# Google Cloud (Required for: Gemini models via Vertex AI)
# Service account JSON content (not a file path)
# Get it from: https://console.cloud.google.com/iam-admin/serviceaccounts
GOOGLE_SERVICE_ACCOUNT_CONTENT=

# Groq (Required for: Groq models)
# Get your key at: https://console.groq.com/keys
GROQ_API_KEY=

# Mistral AI (Required for: Mistral models)
# Get your key at: https://console.mistral.ai/api-keys
MISTRAL_API_KEY=

# Cohere (Required for: Cohere models)
# Get your key at: https://dashboard.cohere.com/api-keys
CO_API_KEY=

# AWS Bedrock (Required for: AWS Bedrock models)
# Configure via AWS CLI or set these:
# AWS_ACCESS_KEY_ID=
# AWS_SECRET_ACCESS_KEY=
# AWS_REGION=us-east-1

# ============================================================================
# ADDITIONAL MODEL PROVIDERS (OpenAI-compatible)
# ============================================================================

# DeepSeek (OpenAI-compatible)
# Get your key at: https://platform.deepseek.com/api_keys
DEEPSEEK_API_KEY=

# xAI Grok (OpenAI-compatible)
# Get your key at: https://console.x.ai/
GROK_API_KEY=

# OpenRouter (Aggregates multiple providers)
# Get your key at: https://openrouter.ai/settings/keys
OPENROUTER_API_KEY=

# Vercel AI Gateway
# Configure at: https://vercel.com/docs/ai-gateway
VERCEL_AI_GATEWAY_API_KEY=

# Fireworks AI (OpenAI-compatible)
# Get your key at: https://fireworks.ai/api-keys
FIREWORKS_API_KEY=

# Together AI (OpenAI-compatible)
# Get your key at: https://api.together.ai/settings/api-keys
TOGETHER_API_KEY=

# Cerebras (OpenAI-compatible)
# Get your key at: https://cloud.cerebras.ai/
CEREBRAS_API_KEY=

# Nebius AI (OpenAI-compatible)
# Get your key at: https://studio.nebius.ai/
NEBIUS_API_KEY=

# OVHcloud AI Endpoints (OpenAI-compatible)
# Get your key at: https://endpoints.ai.cloud.ovh.net/
OVHCLOUD_API_KEY=

# MoonshotAI (OpenAI-compatible)
# Get your key at: https://platform.moonshot.cn/
MOONSHOTAI_API_KEY=

# Heroku Inference (OpenAI-compatible)
# Get your key at: https://www.heroku.com/ai
HEROKU_INFERENCE_KEY=

# ============================================================================
# LOCAL MODEL PROVIDERS
# ============================================================================

# Ollama (Optional - for local models)
# If running Ollama locally or via docker-compose, set the base URL
# Default when using docker-compose ollama service:
# OLLAMA_BASE_URL=http://localhost:11434/v1/
# OLLAMA_API_KEY=placeholder # Not needed for local, but some tools require it

# ============================================================================
# OBSERVABILITY & MONITORING
# ============================================================================

# Logfire (Optional - for structured logging and tracing)
# Get your token at: https://logfire.pydantic.dev/
# LOGFIRE_TOKEN=
# LOGFIRE_SERVICE_NAME=pydantic-ai-dev

# ============================================================================
# SEARCH PROVIDERS (for tool integrations)
# ============================================================================

# Brave Search (Optional - for web search tools)
# Get your key at: https://brave.com/search/api/
# BRAVE_API_KEY=

# Tavily Search (Optional - for web search tools)
# Get your key at: https://tavily.com/
# TAVILY_API_KEY=

# ============================================================================
# MODEL CONTEXT PROTOCOL (MCP)
# ============================================================================

# GitHub Personal Access Token (Optional - for MCP GitHub server)
# Create at: https://github.com/settings/tokens
# Needs: repo, read:org scopes
# GITHUB_PERSONAL_ACCESS_TOKEN=

# ============================================================================
# DATABASE CONNECTIONS (for examples)
# ============================================================================

# PostgreSQL (Optional - for SQL/RAG examples)
# Default when using docker-compose postgres service:
# DATABASE_URL=postgresql://postgres:postgres@localhost:54320/postgres

# PostgreSQL with pgvector (Optional - for RAG examples)
# Default when using docker-compose pgvector service:
# PGVECTOR_DATABASE_URL=postgresql://postgres:postgres@localhost:54321/postgres

# ============================================================================
# TESTING FLAGS
# ============================================================================

# Enable live API testing (Optional - USE WITH CAUTION - incurs API costs!)
# Set to exact value below to enable live tests that hit real APIs
# PYDANTIC_AI_LIVE_TEST_DANGEROUS=CHARGE-ME!

# ============================================================================
# NOTES
# ============================================================================
#
# - Most API keys are OPTIONAL - only set the ones you plan to use
# - For testing, use test models or Ollama to avoid API costs
# - Never commit this file with real API keys
# - Add .env to .gitignore (already done in this project)
# - See README.md for detailed setup instructions per provider
190 changes: 190 additions & 0 deletions .devcontainer/AGENTS.md
@@ -0,0 +1,190 @@
# DevContainer Maintenance Guide

## About This Codebase

- **Pydantic AI**: Agent framework for building LLM-powered applications with Pydantic
- **Workspace structure**: uv monorepo with multiple packages
- `pydantic-ai-slim`: Core framework (minimal dependencies)
- `pydantic-evals`: Evaluation framework
- `pydantic-graph`: Graph execution engine
- `examples/`: Example applications
- `clai/`: CLI tool
- **Primary users**: Contributors, AI coding agents (Claude Code, Cursor), PR reviewers

## DevContainer Purpose

- Provides isolated, reproducible development environment
- Matches exact dependencies and tools across all developers and AI agents
- Prevents "works on my machine" issues
- Ensures AI agents have proper access to testing/building tools
- Security isolation for AI agents

## Platform Configuration

- **Default platform**: `linux/amd64` (x86_64)
- **Why not ARM64**: Some Python packages (e.g., mlx) lack Linux ARM64 wheels
- **Apple Silicon**: Uses Rosetta/QEMU emulation automatically (slightly slower but compatible)
- **Change if needed**: Edit the `platform` setting in `docker-compose.yml`

## Installation Modes

### Standard Mode (Default)
- Installs: Cloud API providers + dev tools
- Excludes: PyTorch, transformers, vLLM, outlines ML extras
- Use case: 95% of development (PR testing, features, bug fixes)
- Why: Saves significant install time and disk space
- Command: Uses explicit `--extra` flags in `install.sh`

### Full Mode
- Installs: Everything including ML frameworks
- Use case: Working on outlines integration, local model features
- Command: `--all-extras --all-packages`

### Mode Selection
- Interactive (VSCode): User prompted to choose
- Non-interactive (agents/CI): Defaults to Standard
- Override: Set the `INSTALL_MODE=standard|full` environment variable (see the sketch below)
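
A minimal sketch of the selection logic above, assuming a simple interactivity check; the real `install.sh` and its exact `uv sync` flags may differ:

```bash
# Hypothetical sketch of the mode selection in install.sh; not the actual script.
MODE="${INSTALL_MODE:-}"
if [ -z "$MODE" ] && [ -t 0 ]; then
  # Interactive terminal (e.g. VSCode): prompt the user.
  read -r -p "Install mode [standard/full] (default: standard): " MODE
fi
MODE="${MODE:-standard}"   # Non-interactive (agents/CI) falls back to Standard

if [ "$MODE" = "full" ]; then
  uv sync --all-extras --all-packages   # Full: everything, including ML frameworks
else
  # Standard: explicit extras only (illustrative subset of the real flag list)
  uv sync --all-packages --extra openai --extra anthropic --extra evals
fi
```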

## Key Files

### `.devcontainer/devcontainer.json`
- VSCode configuration for the devcontainer
- Editor settings, extensions, port forwarding
- Lifecycle commands (`postCreateCommand`, `postStartCommand`)
- Environment variables (`UV_LINK_MODE`, `UV_PROJECT_ENVIRONMENT`, etc.)
- Git identity for AI commits

### `.devcontainer/Dockerfile`
- Base image: `mcr.microsoft.com/devcontainers/base:debian-12`
- System dependencies for Python builds
- Installs: uv, deno, pre-commit, Python 3.12
- Runs as non-root user `vscode`

### `.devcontainer/docker-compose.yml`
- Service orchestration
- Platform specification (`linux/amd64`)
- Optional services (commented out): Ollama, PostgreSQL, pgvector, MCP proxy
- Volume management for persistence

### `.devcontainer/install.sh`
- Interactive installation script
- Detects interactive vs non-interactive mode
- Implements Standard vs Full installation logic
- Installs pre-commit hooks
- Called by `postCreateCommand` in devcontainer.json

## Environment Variables

### Critical Variables (devcontainer.json)
- `UV_PROJECT_ENVIRONMENT=/workspace/.venv`: Virtual environment location
- `UV_LINK_MODE=copy`: Suppress hardlink warnings in Docker volumes
- `PYTHONUNBUFFERED=1`: Ensure Python output appears immediately
- `COLUMNS=150`: Terminal width for better output formatting
- `GIT_AUTHOR_*`, `GIT_COMMITTER_*`: Git identity for AI commits

### Optional Variables
- `INSTALL_MODE=standard|full`: Override installation mode
- API keys: Should be set in `.devcontainer/.env` (not committed)

## Dependencies and Extras

### Always Installed (Both Modes)
- Core: pydantic, httpx, opentelemetry-api
- Cloud APIs: openai, anthropic, google, groq, mistral, cohere, bedrock, huggingface
- Dev tools: cli, mcp, fastmcp, logfire, retries, temporal, ui, ag-ui, evals
- Build tools: lint group, docs group (ruff, mypy, pyright, mkdocs)

### Only in Full Mode
- `outlines-transformers`: PyTorch + Transformers library
- `outlines-vllm-offline`: vLLM inference engine
- `outlines-sglang`: SGLang framework
- `outlines-mlxlm`: Apple MLX framework
- `outlines-llamacpp`: LlamaCPP bindings

## Common Maintenance Tasks

### Adding New System Dependencies
- Edit `Dockerfile`: Add to `apt-get install` command
- Rebuild container required

### Adding Python Packages
- Use `uv add package-name` (not manual pyproject.toml edits)
- For new extras: Add to `pydantic_ai_slim/pyproject.toml` optional-dependencies
- Update `install.sh` if extra should be in Standard mode
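
For example (the dependency name below is a placeholder):

```bash
# Add a runtime dependency to a specific workspace member
uv add --package pydantic-ai-slim some-new-dependency

# After hand-editing optional-dependencies in pydantic_ai_slim/pyproject.toml,
# refresh the lockfile so the devcontainer and CI stay in sync
uv lock
```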

### Adding VSCode Extensions
- Edit `devcontainer.json`: Add to `customizations.vscode.extensions` array
- Rebuild container required

### Updating Base Image/Tools
- `Dockerfile`: Change base image tag
- Update uv/deno install commands if needed
- Test with both Standard and Full modes

### Adding Optional Services
- Uncomment service in `docker-compose.yml`
- Uncomment corresponding volume if needed
- Document in README.md optional services section

## Troubleshooting

### Container Build Fails
- Check Docker daemon is running
- Check internet connectivity
- Try: `docker system prune -a` to clean cache
- Check Dockerfile for syntax errors: `docker build -f .devcontainer/Dockerfile .`
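
The same checks as commands, run from the repository root:

```bash
docker info                                  # is the daemon up and reachable?
docker system prune -a                       # clean the build cache (also removes unused images)
docker build -f .devcontainer/Dockerfile .   # surface Dockerfile errors outside the devcontainer flow
```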

### Installation Script Fails
- Check `install.sh` syntax: `bash -n .devcontainer/install.sh`
- Run manually in container to see detailed errors
- Check uv lockfile is up to date: `uv lock`

### Performance Issues
- Verify Docker resources (4+ GB RAM recommended)
- Check platform setting (amd64 vs arm64)
- Check the volume cache consistency setting in `docker-compose.yml`

### UV Warnings
- Hardlink warning: Ensure `UV_LINK_MODE=copy` is set
- Lockfile conflicts: Run `uv lock` to regenerate

## Best Practices

### When Changing install.sh
- Test both Standard and Full modes
- Test interactive and non-interactive flows
- Verify syntax with `bash -n install.sh`
- Update README.md to match
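
A quick pass over the non-interactive cases (a sketch; assumes the script can be re-run inside an existing container):

```bash
bash -n .devcontainer/install.sh                       # syntax check only, nothing is executed
INSTALL_MODE=standard bash .devcontainer/install.sh    # non-interactive Standard path
INSTALL_MODE=full bash .devcontainer/install.sh        # non-interactive Full path
```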

### When Changing Dependencies
- Keep Standard mode lean (exclude heavy ML frameworks)
- Update install.sh if adding new extras
- Document in README.md what's included/excluded
- Test install time impact

### When Updating Documentation
- Keep README.md user-facing and comprehensive
- Keep this file (AGENTS.md) maintainer-focused and concise
- Avoid time estimates (they are machine-dependent)
- Link to official docs where applicable

## Git Configuration

- Credentials forwarded automatically by VSCode (no manual setup needed)
- Git identity set via environment variables (not .gitconfig file)
- Safe directory configured in `postStartCommand`
- AI commits use identity from `GIT_AUTHOR_*` variables
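
For reference, roughly what that amounts to (the values and workspace path are placeholders; the real settings live in `devcontainer.json` and `postStartCommand`):

```bash
export GIT_AUTHOR_NAME="AI Agent"               # placeholder identity
export GIT_AUTHOR_EMAIL="agent@example.com"
export GIT_COMMITTER_NAME="$GIT_AUTHOR_NAME"
export GIT_COMMITTER_EMAIL="$GIT_AUTHOR_EMAIL"
git config --global --add safe.directory /workspace   # assumed path; postStartCommand does the equivalent
```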

## Testing the Setup

### Manual Test
1. Make changes to devcontainer files
2. Rebuild container: "Dev Containers: Rebuild Container"
3. Test Standard mode installation
4. Test Full mode: `INSTALL_MODE=full` or run `make install`
5. Verify: `uv run pytest tests/test_agent.py::test_simple_sync`
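
Steps 3 to 5 from a terminal inside the rebuilt container might look like this (a sketch):

```bash
INSTALL_MODE=standard bash .devcontainer/install.sh    # step 3: Standard mode
INSTALL_MODE=full bash .devcontainer/install.sh        # step 4 (or: make install)
uv run pytest tests/test_agent.py::test_simple_sync    # step 5: smoke test
```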

### CI Considerations
- Container should work in non-interactive mode
- Default Standard mode should cover 95% of the test suite
- Full mode needed only for outlines/ML framework tests
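
One hedged way to exercise the non-interactive path in CI, assuming the Dev Containers CLI (`@devcontainers/cli`) is available on the runner; it is not part of this setup:

```bash
npm install -g @devcontainers/cli
devcontainer up --workspace-folder .    # build + start; postCreateCommand runs install.sh non-interactively (Standard)
devcontainer exec --workspace-folder . uv run pytest tests/test_agent.py::test_simple_sync
```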