Adding Support for local OpenAI-compatible APIs #4
base: main
Conversation
- Add custom base URL support to the OpenAI provider via the `OPENAI_BASE_URL` env var
- Add a dedicated `localopenai` provider for a better local API experience
- Update documentation with local API setup instructions
- Support popular local servers (LM Studio, Ollama, LocalAI, etc.)
- Add `.mcp-todos.json` to gitignore
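For reviewers who want the shape of the change at a glance, here is a minimal sketch of the updated `openai` factory; the `@langchain/openai` import path, the `LLMConfig` shape, and the `configuration.baseURL` pass-through are assumptions, since the diff fragments below only show the env lookup:

```ts
import {ChatOpenAI} from '@langchain/openai';

type LLMConfig = {model: string}; // assumed shape; only cfg.model appears in the diff

const openaiFactory = (key: string, cfg: LLMConfig) => {
	// Prefer an explicit base URL from the environment; fall back to the
	// SDK default endpoint when neither variable is set.
	const baseURL =
		process.env['OPENAI_BASE_URL'] || process.env['OPENAI_API_BASE'];
	return new ChatOpenAI({
		openAIApiKey: key,
		modelName: cfg.model,
		...(baseURL ? {configuration: {baseURL}} : {}),
	});
};
```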
Pull Request Overview
Adds support for local OpenAI-compatible APIs by extending the CLI's LLM service and updating documentation.

- Introduce a new `localopenai` provider in `llm-service.ts` with a default endpoint and key fallback
- Enhance the existing `openai` provider to accept a custom `OPENAI_BASE_URL` environment variable
- Extend `readme.md` with detailed usage instructions and examples for local API servers
Reviewed Changes
Copilot reviewed 2 out of 3 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| source/services/llm-service.ts | Added `localopenai` provider and updated `openai` to support custom base URLs |
| readme.md | Documented usage for local OpenAI-compatible APIs with examples |
Comments suppressed due to low confidence (1)
source/services/llm-service.ts:45
- New `localopenai` provider logic has been added without accompanying tests; consider adding unit tests to verify factory behavior under various environment and configuration scenarios.
```diff
  localopenai: {
-   factory: (key: string, cfg: LLMConfig) =>
-     new ChatOpenAI({openAIApiKey: key, modelName: cfg.model}),
+   factory: (key: string, cfg: LLMConfig) => {
+     const baseURL = process.env['OPENAI_BASE_URL'] || process.env['OPENAI_API_BASE'];
```
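A starting point for the missing tests, as a rough sketch; the Vitest runner, the `PROVIDERS` export, and the `LLMConfig` literal are all assumptions about this repo:

```ts
import {afterEach, describe, expect, it} from 'vitest';
import {PROVIDERS} from './llm-service.js'; // assumes PROVIDERS is exported

describe('localopenai factory', () => {
	afterEach(() => {
		delete process.env['LOCAL_OPENAI_BASE_URL'];
	});

	it('constructs a model with the default local endpoint', () => {
		const model = PROVIDERS.localopenai.factory('key', {model: 'llama-3'});
		expect(model).toBeDefined();
	});

	it('respects LOCAL_OPENAI_BASE_URL when set', () => {
		process.env['LOCAL_OPENAI_BASE_URL'] = 'http://localhost:8080/v1';
		const model = PROVIDERS.localopenai.factory('key', {model: 'llama-3'});
		expect(model).toBeDefined();
	});
});
```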
[nitpick] The base URL normalization logic is duplicated between the `openai` and `localopenai` providers; consider extracting this into a shared helper to reduce duplication and improve maintainability.
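One way the shared helper could look; the name `resolveBaseURL` and the trailing-slash trim are illustrative, not taken from the PR:

```ts
// Returns the first non-empty base URL among the given env vars,
// normalized without a trailing slash, or the fallback if none is set.
function resolveBaseURL(envVars: string[], fallback?: string): string | undefined {
	for (const name of envVars) {
		const value = process.env[name];
		if (value) return value.replace(/\/+$/, '');
	}
	return fallback;
}

// openai:      resolveBaseURL(['OPENAI_BASE_URL', 'OPENAI_API_BASE'])
// localopenai: resolveBaseURL(['LOCAL_OPENAI_BASE_URL'], 'http://localhost:1234/v1')
```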
```ts
const baseURL = process.env['LOCAL_OPENAI_BASE_URL'] || 'http://localhost:1234/v1';
const config: any = {
  openAIApiKey: key || 'local-api-key',
```
[nitpick] Using a hardcoded fallback API key (`'local-api-key'`) may lead to sending misleading credentials; consider allowing an empty key or requiring an explicit local key to avoid accidental misuse.
Suggested change:

```diff
- const baseURL = process.env['LOCAL_OPENAI_BASE_URL'] || 'http://localhost:1234/v1';
- const config: any = {
-   openAIApiKey: key || 'local-api-key',
+ if (!key) {
+   throw new Error("API key for 'localopenai' provider must be explicitly provided.");
+ }
+ const baseURL = process.env['LOCAL_OPENAI_BASE_URL'] || 'http://localhost:1234/v1';
+ const config: any = {
+   openAIApiKey: key,
```
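One trade-off to weigh before taking this suggestion: LangChain's `ChatOpenAI` throws at construction time if no API key can be resolved, and many local servers accept requests with any placeholder key, which is presumably why the PR defaults to `'local-api-key'` rather than an empty string.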
This pull request introduces support for local OpenAI-compatible APIs, enabling users to integrate local models and servers into the CLI tool. The changes include updates to the documentation, new configuration options for local APIs, and enhancements to the LLM service to handle these providers.
Documentation Updates:

- `readme.md`: Added instructions for using local OpenAI-compatible APIs, including configuration options for setting base URLs and API keys for local servers like LM Studio, Ollama, LocalAI, and Text Generation WebUI.
- `readme.md`: Updated the list of supported models to include the `localopenai` provider for local APIs.

Code Enhancements:

- `source/services/llm-service.ts`: Added a new `localopenai` provider to the `PROVIDERS` object. This includes logic for handling local API endpoints, default models, and API keys, with fallback values for local use cases. Updated the `openai` provider to support custom base URLs via environment variables.
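Putting the fragments above together, the new provider entry plausibly looks something like the following; the surrounding `PROVIDERS` shape, the import path, and the `configuration.baseURL` pass-through are assumptions beyond what the diff shows:

```ts
import {ChatOpenAI} from '@langchain/openai';

type LLMConfig = {model: string}; // assumed; only cfg.model appears in the diff

const PROVIDERS = {
	localopenai: {
		factory: (key: string, cfg: LLMConfig) => {
			// Point at a local OpenAI-compatible server; LM Studio's default
			// port (1234) is the fallback per the diff.
			const baseURL =
				process.env['LOCAL_OPENAI_BASE_URL'] || 'http://localhost:1234/v1';
			return new ChatOpenAI({
				openAIApiKey: key || 'local-api-key', // placeholder for servers that ignore keys
				modelName: cfg.model,
				configuration: {baseURL},
			});
		},
	},
};
```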