Add OPENAI_BASE_URL environment variable support #176
Merged
Conversation
Allow users to configure custom OpenAI API endpoints via the OPENAI_BASE_URL environment variable, enabling use with internal proxies, enterprise gateways, and OpenAI-compatible providers.

- OpenAIConfig now reads base_url from the OPENAI_BASE_URL env var
- Works automatically with the MCP server and all kit LLM features
- Added tests for env var configuration
- Updated docs: README, MCP server docs, LLM configuration guide
tnm added a commit that referenced this pull request on Jan 5, 2026: Add OPENAI_BASE_URL environment variable support (#176)
Summary

Adds support for the OPENAI_BASE_URL environment variable, allowing users to configure custom OpenAI API endpoints for use with internal proxies, enterprise gateways, and OpenAI-compatible providers. This addresses the need for organizations with internal proxy URLs to access OpenAI models through Kit's MCP server.
Changes

- OpenAIConfig.base_url now reads from the OPENAI_BASE_URL env var by default

Usage
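One way to set the variables for a local run (the values are placeholders, not real credentials or endpoints):

```shell
export OPENAI_API_KEY="your-key"
export OPENAI_BASE_URL="https://your-proxy.company.com/v1"
```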
Or in MCP config:
```json
{
  "mcpServers": {
    "kit-dev-mcp": {
      "command": "uvx",
      "args": ["--from", "cased-kit", "kit-dev-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-key",
        "OPENAI_BASE_URL": "https://your-proxy.company.com/v1"
      }
    }
  }
}
```

Test plan