Adding Support for local OpenAI-compatible APIs #4


Open · wants to merge 2 commits into main

Conversation

jasonpaulso

This pull request introduces support for local OpenAI-compatible APIs, enabling users to integrate local models and servers into the CLI tool. The changes include updates to the documentation, new configuration options for local APIs, and enhancements to the LLM service to handle these providers.

Documentation Updates:

  • readme.md: Added instructions for using local OpenAI-compatible APIs, including configuration options for setting base URLs and API keys for local servers like LM Studio, Ollama, LocalAI, and Text Generation WebUI.
  • readme.md: Updated the list of supported models to include the localopenai provider for local APIs.

Code Enhancements:

  • source/services/llm-service.ts: Added a new localopenai provider to the PROVIDERS object. This includes logic for handling local API endpoints, default models, and API keys, with fallback values for local use cases. Updated the openai provider to support custom base URLs via environment variables. (A sketch of the resulting provider entries appears below.)
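
For orientation, here is a minimal sketch of what the two provider entries could look like, reconstructed from the code fragments quoted in the review below. The PROVIDERS shape, the LLMConfig interface, and the configuration.baseURL plumbing are assumptions for illustration, not the PR's verbatim code:

```ts
import {ChatOpenAI} from '@langchain/openai';

// Assumed shape; the PR's actual LLMConfig may carry more fields.
interface LLMConfig {
	model: string;
}

const PROVIDERS = {
	openai: {
		factory: (key: string, cfg: LLMConfig) => {
			// Optional override so the hosted provider can also point at
			// proxies or OpenAI-compatible gateways.
			const baseURL =
				process.env['OPENAI_BASE_URL'] || process.env['OPENAI_API_BASE'];
			return new ChatOpenAI({
				openAIApiKey: key,
				modelName: cfg.model,
				...(baseURL ? {configuration: {baseURL}} : {}),
			});
		},
	},
	localopenai: {
		factory: (key: string, cfg: LLMConfig) => {
			// Defaults to LM Studio's usual endpoint when the env var is unset.
			const baseURL =
				process.env['LOCAL_OPENAI_BASE_URL'] || 'http://localhost:1234/v1';
			return new ChatOpenAI({
				// Many local servers accept any non-empty placeholder key.
				openAIApiKey: key || 'local-api-key',
				modelName: cfg.model,
				configuration: {baseURL},
			});
		},
	},
};
```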

Jason Schulz added 2 commits July 11, 2025 14:12
- Add custom base URL support to OpenAI provider via OPENAI_BASE_URL env var
- Add dedicated localopenai provider for better local API experience
- Update documentation with local API setup instructions
- Support popular local servers (LM Studio, Ollama, LocalAI, etc.)
- Add .mcp-todos.json to gitignore
Copilot AI review requested due to automatic review settings · July 11, 2025 18:16

Copilot AI left a comment


Pull Request Overview

Adds support for local OpenAI-compatible APIs by extending the CLI’s LLM service and updating documentation.

  • Introduce a new localopenai provider in llm-service.ts with default endpoint and key fallback
  • Enhance the existing openai provider to accept custom OPENAI_BASE_URL environment variables
  • Extend readme.md with detailed usage instructions and examples for local API servers

Reviewed Changes

Copilot reviewed 2 out of 3 changed files in this pull request and generated 2 comments.

| File | Description |
| --- | --- |
| source/services/llm-service.ts | Added localopenai provider and updated openai to support custom base URLs |
| readme.md | Documented usage for local OpenAI-compatible APIs with examples |

Comments suppressed due to low confidence (1)

source/services/llm-service.ts:45

  • New localopenai provider logic has been added without accompanying tests; consider adding unit tests to verify factory behavior under various environment and configuration scenarios (a test sketch follows below).

```ts
	localopenai: {
```
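
A hypothetical shape for such a test, assuming vitest as the test runner and that PROVIDERS is exported from llm-service.ts; neither is confirmed by this PR:

```ts
import {afterEach, expect, test} from 'vitest';
import {PROVIDERS} from '../source/services/llm-service.js';

const ORIGINAL = process.env['LOCAL_OPENAI_BASE_URL'];

afterEach(() => {
	// Restore the environment so tests stay independent of each other.
	if (ORIGINAL === undefined) delete process.env['LOCAL_OPENAI_BASE_URL'];
	else process.env['LOCAL_OPENAI_BASE_URL'] = ORIGINAL;
});

test('localopenai builds a client even without an explicit key', () => {
	process.env['LOCAL_OPENAI_BASE_URL'] = 'http://localhost:8080/v1';
	// Assumes LLMConfig only requires `model`.
	const model = PROVIDERS.localopenai.factory('', {model: 'llama-3'});
	expect(model).toBeDefined();
});
```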

```diff
-	factory: (key: string, cfg: LLMConfig) =>
-		new ChatOpenAI({openAIApiKey: key, modelName: cfg.model}),
+	factory: (key: string, cfg: LLMConfig) => {
+		const baseURL = process.env['OPENAI_BASE_URL'] || process.env['OPENAI_API_BASE'];
```

Copilot AI Jul 11, 2025


[nitpick] The base URL normalization logic is duplicated between the openai and localopenai providers; consider extracting this into a shared helper to reduce duplication and improve maintainability.
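
One possible extraction, sketched here; the helper name resolveBaseURL and its signature are illustrative, not taken from the PR:

```ts
// Hypothetical shared helper: returns the first set environment variable
// from an ordered list, else the given fallback (possibly undefined).
function resolveBaseURL(envVars: string[], fallback?: string): string | undefined {
	for (const name of envVars) {
		const value = process.env[name];
		if (value) return value;
	}
	return fallback;
}

// openai:      resolveBaseURL(['OPENAI_BASE_URL', 'OPENAI_API_BASE'])
// localopenai: resolveBaseURL(['LOCAL_OPENAI_BASE_URL'], 'http://localhost:1234/v1')
```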


Comment on lines +49 to +51

```ts
const baseURL = process.env['LOCAL_OPENAI_BASE_URL'] || 'http://localhost:1234/v1';
const config: any = {
	openAIApiKey: key || 'local-api-key',
```

Copilot AI Jul 11, 2025


[nitpick] Using a hardcoded fallback API key ('local-api-key') may lead to sending misleading credentials; consider allowing an empty key or requiring an explicit local key to avoid accidental misuse.

Suggested change

```diff
-const baseURL = process.env['LOCAL_OPENAI_BASE_URL'] || 'http://localhost:1234/v1';
-const config: any = {
-	openAIApiKey: key || 'local-api-key',
+if (!key) {
+	throw new Error("API key for 'localopenai' provider must be explicitly provided.");
+}
+const baseURL = process.env['LOCAL_OPENAI_BASE_URL'] || 'http://localhost:1234/v1';
+const config: any = {
+	openAIApiKey: key,
```

