Load balancer for ChatGPT accounts. Pool multiple accounts, track usage, manage API keys, view everything in a dashboard.
| Feature | Description |
|---|---|
| Account Pooling | Load balance across multiple ChatGPT accounts |
| Usage Tracking | Per-account tokens, cost, 28-day trends |
| API Keys | Per-key rate limits by token, cost, window, model |
| Dashboard Auth | Password + optional TOTP |
| OpenAI-compatible | Codex CLI, OpenCode, any OpenAI client |
| Auto Model Sync | Available models fetched from upstream |
```sh
# Docker (recommended)
docker volume create codex-lb-data
docker run -d --name codex-lb \
  -p 2455:2455 -p 1455:1455 \
  -v codex-lb-data:/var/lib/codex-lb \
  ghcr.io/soju06/codex-lb:latest

# or uvx
uvx codex-lb
```

Open localhost:2455 → Add account → Done.
Point any OpenAI-compatible client at codex-lb. If API key auth is enabled, pass a key from the dashboard as a Bearer token.
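At the HTTP level, "OpenAI-compatible" just means the proxy accepts standard OpenAI-style requests. A minimal stdlib sketch of what any such client ultimately sends (the `/v1/chat/completions` path and the `sk-clb-...` placeholder are taken from the examples in this README; substitute a real key from the dashboard, or any string if API key auth is disabled):

```python
import json
from urllib import request

BASE_URL = "http://127.0.0.1:2455/v1"

def build_chat_request(prompt: str, api_key: str = "sk-clb-...") -> request.Request:
    """Build an OpenAI-style chat completion request aimed at codex-lb."""
    body = json.dumps({
        "model": "gpt-5.3-codex",
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",  # key from the dashboard
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("Hello!")
# request.urlopen(req) would send it once codex-lb is running
```

Any OpenAI SDK or tool does the same thing under the hood, so anything that accepts a custom base URL can be pointed at the proxy.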
### Codex CLI / IDE Extension

`~/.codex/config.toml`:

```toml
model = "gpt-5.3-codex"
model_reasoning_effort = "xhigh"
model_provider = "codex-lb"

[model_providers.codex-lb]
name = "OpenAI" # required — enables remote /responses/compact
base_url = "http://127.0.0.1:2455/backend-api/codex"
wire_api = "responses"
```

With API key auth:
```toml
[model_providers.codex-lb]
name = "OpenAI"
base_url = "http://127.0.0.1:2455/backend-api/codex"
wire_api = "responses"
env_key = "CODEX_LB_API_KEY"
```

```sh
export CODEX_LB_API_KEY="sk-clb-..." # key from dashboard
codex
```

Migrating from direct OpenAI: `codex resume` filters by `model_provider`, so old sessions won't appear until you re-tag them:
```sh
# JSONL session files (all versions)
find ~/.codex/sessions -name '*.jsonl' \
  -exec sed -i '' 's/"model_provider":"openai"/"model_provider":"codex-lb"/g' {} +

# SQLite state DB (>= v0.105.0, creates ~/.codex/state_*.sqlite)
sqlite3 ~/.codex/state_5.sqlite \
  "UPDATE threads SET model_provider = 'codex-lb' WHERE model_provider = 'openai';"
```
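The `sed -i ''` form above is BSD/macOS-specific. If that doesn't fit your platform, the JSONL re-tag can be done portably in Python; `retag_sessions` is a helper name invented here for illustration:

```python
from pathlib import Path

def retag_sessions(root: Path) -> int:
    """Rewrite the model_provider tag in every JSONL session file under root."""
    old = '"model_provider":"openai"'
    new = '"model_provider":"codex-lb"'
    changed = 0
    for f in root.rglob("*.jsonl"):
        text = f.read_text()
        if old in text:
            f.write_text(text.replace(old, new))
            changed += 1
    return changed

# retag_sessions(Path.home() / ".codex" / "sessions")
```

This relies on the tag appearing as a literal substring in the JSONL, same as the `sed` command; it does not parse the JSON.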
### OpenCode

`~/.config/opencode/opencode.json`:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "codex-lb": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "codex-lb",
      "options": {
        "baseURL": "http://127.0.0.1:2455/v1",
        "apiKey": "{env:CODEX_LB_API_KEY}"
      },
      "models": {
        "gpt-5.3-codex": {
          "name": "GPT-5.3 Codex",
          "reasoning": true,
          "interleaved": { "field": "reasoning_details" },
          "options": { "reasoningEffort": "medium" }
        }
      }
    }
  },
  "model": "codex-lb/gpt-5.3-codex"
}
```

This keeps OpenCode's default providers and connections available and adds codex-lb as an extra selectable provider. If you use `enabled_providers`, include every provider you want to keep plus `codex-lb`; providers not listed there are hidden.

The `apiKey` option reads the key from the environment; if API key auth is disabled, you can omit it.

```sh
export CODEX_LB_API_KEY="sk-clb-..." # key from dashboard
opencode
```
### OpenClaw

`~/.openclaw/openclaw.json`:

```json
{
  "agents": {
    "defaults": {
      "model": { "primary": "codex-lb/gpt-5.3-codex" }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "codex-lb": {
        "baseUrl": "http://127.0.0.1:2455/v1",
        "apiKey": "${CODEX_LB_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "gpt-5.3-codex", "name": "GPT-5.3 Codex" },
          { "id": "gpt-5.3-codex-spark", "name": "GPT-5.3 Codex Spark" }
        ]
      }
    }
  }
}
```

Set the env var or replace `${CODEX_LB_API_KEY}` with a key from the dashboard. If API key auth is disabled, any value works.
### OpenAI Python SDK

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:2455/v1",
    api_key="sk-clb-...",  # from dashboard, or any string if auth is disabled
)

response = client.chat.completions.create(
    model="gpt-5.3-codex",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

API key auth is disabled by default, leaving the proxy open to any client. Enable it in Settings → API Key Auth on the dashboard.
When enabled, clients must pass a valid API key as a Bearer token:
```
Authorization: Bearer sk-clb-...
```
Creating keys: Dashboard → API Keys → Create. The full key is shown only once at creation. Keys support optional expiration, model restrictions, and rate limits (tokens / cost per day / week / month).
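When a key exceeds one of its limits, the client should expect the request to be rejected. Assuming standard HTTP 429 semantics for over-limit keys (an assumption; this document does not specify the status code), a simple exponential-backoff wrapper is a reasonable client-side complement:

```python
import time

class RateLimited(Exception):
    """Raised by the caller when the proxy rejects a request as over-limit."""

def with_backoff(call, retries=4, base_delay=1.0):
    """Run `call`, retrying with exponential backoff whenever it raises RateLimited."""
    for attempt in range(retries):
        try:
            return call()
        except RateLimited:
            if attempt == retries - 1:
                raise  # out of retries; surface the error
            time.sleep(base_delay * 2 ** attempt)
```

Wrap any request function with it; `RateLimited` is a placeholder for however your HTTP client surfaces a 429 response.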
Configuration is read from environment variables with the `CODEX_LB_` prefix, or from `.env.local`. See `.env.example` for the available options.
Dashboard auth is configured in Settings.
SQLite is the default database backend; PostgreSQL is optional via `CODEX_LB_DATABASE_URL` (for example `postgresql+asyncpg://...`).
| Environment | Path |
|---|---|
| Local / uvx | ~/.codex-lb/ |
| Docker | /var/lib/codex-lb/ |
Back up this directory to preserve your data.
```sh
# Docker
docker compose watch

# Local
uv sync && cd frontend && bun install && cd ..
uv run fastapi run app/main.py --reload  # backend :2455
cd frontend && bun run dev               # frontend :5173
```

Thanks goes to these wonderful people (emoji key):
This project follows the all-contributors specification. Contributions of any kind welcome!






