feat: user provided api key & bug fix. #46
Conversation
Pull request overview
This pull request implements user-specific OpenAI API key support and custom LLM provider configurations, enabling users to utilize different language models with their own credentials. The changes include frontend settings UI for API key management, backend refactoring for per-user model selection and authentication, a new endpoint for listing supported models, and comprehensive protobuf updates.
Key changes:
- Added user-configurable OpenAI API keys with secure input handling in settings
- Implemented LLMProviderConfig for flexible LLM endpoint and authentication configuration (a hedged sketch follows this list)
- Refactored the chat API to support per-user model selection and custom API credentials
- Added a ListSupportedModels endpoint that returns available models based on user configuration
- Upgraded the OpenAI Go SDK to v2.7.1 and regenerated protobuf files with updated tooling
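For orientation, here is a minimal sketch of what such a provider config could look like in internal/models/llm_provider.go. The field and method names (Endpoint, APIKey, UsesCustomSettings) are assumptions for illustration, not the PR's actual code:

```go
package models

// LLMProviderConfig sketches a per-user LLM provider configuration: which
// endpoint to call and which API key to send with chat requests.
type LLMProviderConfig struct {
	// Endpoint overrides the default OpenAI base URL when non-empty.
	Endpoint string
	// APIKey is the user-supplied key; empty means the server default is used.
	APIKey string
}

// UsesCustomSettings reports whether the user supplied their own key or
// endpoint, i.e. whether a dedicated OpenAI client has to be constructed.
func (c LLMProviderConfig) UsesCustomSettings() bool {
	return c.APIKey != "" || c.Endpoint != ""
}
```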
Reviewed changes
Copilot reviewed 50 out of 51 changed files in this pull request and generated 6 comments.
Summary per file:

| File | Description |
|---|---|
| webapp/_webapp/src/views/settings/setting-text-input.tsx | New reusable text input component with password masking for settings |
| webapp/_webapp/src/views/settings/sections/api-key-settings.tsx | Settings section for OpenAI API key configuration |
| webapp/_webapp/src/hooks/useLanguageModels.ts | Refactored to dynamically load supported models from backend API |
| webapp/_webapp/src/views/chat/footer/toolbar/selection.tsx | Enhanced selection UI to support subtitle display for model slugs |
| webapp/_webapp/src/stores/conversation/handlers/handleStreamError.ts | Improved error handling with visual error messages in chat |
| internal/models/llm_provider.go | New model for LLM provider configuration |
| internal/services/toolkit/client/client.go | Added SetOpenAIClient method for dynamic client configuration (see the sketch after the table) |
| internal/api/chat/list_supported_models.go | New endpoint returning available models based on user settings |
| internal/api/chat/create_conversation_message_stream.go | Refactored to use LLMProviderConfig for per-user authentication |
| internal/api/chat/create_conversation_message.go | Updated to pass user settings to AI client methods |
| proto/user/v1/user.proto | Added openai_api_key field to Settings message |
| proto/chat/v1/chat.proto | Added ListSupportedModels RPC and SupportedModel message |
| go.mod | Updated openai-go dependency from v2.1.1 to v2.7.1 |
| Various protobuf generated files | Regenerated with updated protoc-gen-go v1.36.10 and protoc-gen-go-grpc v1.6.0 |
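The SetOpenAIClient addition in internal/services/toolkit/client/client.go presumably rebuilds the underlying OpenAI client from the caller's credentials. Below is a hedged sketch of one way to do that with openai-go v2; the wrapper type, the plain string parameters (the PR passes an LLMProviderConfig instead), and the choice of options are assumptions, not taken from the diff:

```go
package client

import (
	"github.com/openai/openai-go/v2"
	"github.com/openai/openai-go/v2/option"
)

// Client is an illustrative stand-in for the toolkit's AI client wrapper.
type Client struct {
	ai openai.Client
}

// SetOpenAIClient swaps the wrapped OpenAI client so that subsequent calls
// use the caller's API key and, optionally, a custom base URL.
func (c *Client) SetOpenAIClient(apiKey, baseURL string) {
	opts := []option.RequestOption{}
	if apiKey != "" {
		opts = append(opts, option.WithAPIKey(apiKey))
	}
	if baseURL != "" {
		opts = append(opts, option.WithBaseURL(baseURL))
	}
	// openai-go v2 returns the client by value.
	c.ai = openai.NewClient(opts...)
}
```

Whether the real method mutates a shared client or builds a request-scoped one is not visible from the file list alone; a request-scoped client would avoid credential races between concurrent users.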
Junyi-99 left a comment
self reviewed
This pull request introduces support for user-specific OpenAI API keys and custom LLM provider configurations, allowing users to select and use different language models based on their own API credentials. It refactors the chat message creation and streaming logic to leverage these settings, updates the underlying OpenAI client management, and adds an endpoint for listing supported models. Additionally, it updates protobuf-generated files and fixes minor code issues.
User settings and LLM provider support:
- Added OpenAIAPIKey to the Settings model and its mapping logic, enabling users to store and use their own OpenAI API keys (a hedged mapping sketch follows this list).
- Introduced the LLMProviderConfig struct to encapsulate endpoint and API key configuration for LLM API calls, with logic to determine whether custom settings are in use.
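As a rough illustration of the first bullet, here is a sketch of the model field plus its proto mapping. The stand-in message type and function names are assumptions, since the real mapping targets the generated user/v1 package:

```go
package models

// Settings mirrors the persisted user settings; only the field this PR adds
// is shown, everything else is elided.
type Settings struct {
	// OpenAIAPIKey is the user's own key; empty means the server default applies.
	OpenAIAPIKey string
}

// settingsMessage is a local stand-in for the generated user/v1 Settings
// proto message, which carries the new openai_api_key field.
type settingsMessage struct {
	OpenaiApiKey string
}

// toMessage and settingsFromMessage sketch the mapping logic extended for the
// new field.
func (s Settings) toMessage() settingsMessage {
	return settingsMessage{OpenaiApiKey: s.OpenAIAPIKey}
}

func settingsFromMessage(m settingsMessage) Settings {
	return Settings{OpenAIAPIKey: m.OpenaiApiKey}
}
```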
Chat API and OpenAI client refactoring:
- Refactored the chat message handlers (create_conversation_message.go, create_conversation_message_stream.go) to retrieve user settings, construct LLMProviderConfig, and pass it to AI client methods, enabling per-user model selection and authentication.
- Updated the AI client methods (ChatCompletion, ChatCompletionStream, GetConversationTitle) to accept and use LLMProviderConfig, and added logic to set the OpenAI client based on this config (a hedged wiring sketch follows these bullets).
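To make the refactor concrete, here is a hedged sketch of how a handler might wire settings into the AI client; every type and signature below is a stand-in for the real services, not code from the PR:

```go
package chat

import "context"

// Stand-ins for the real types; names and signatures are assumptions.
type llmProviderConfig struct {
	Endpoint string
	APIKey   string
}

type settings struct {
	OpenAIAPIKey string
}

type settingsStore interface {
	GetSettings(ctx context.Context, userID string) (settings, error)
}

type aiClient interface {
	// Per the PR description, ChatCompletion (and its streaming and title
	// siblings) now receive the provider config alongside the model and prompt.
	ChatCompletion(ctx context.Context, provider llmProviderConfig, model, prompt string) (string, error)
}

type handler struct {
	users settingsStore
	ai    aiClient
}

// createConversationMessage sketches the refactor: load the caller's settings,
// derive the provider config, and pass it to the AI client.
func (h *handler) createConversationMessage(ctx context.Context, userID, model, prompt string) (string, error) {
	s, err := h.users.GetSettings(ctx, userID)
	if err != nil {
		return "", err
	}
	provider := llmProviderConfig{APIKey: s.OpenAIAPIKey}
	return h.ai.ChatCompletion(ctx, provider, model, prompt)
}
```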
Supported models endpoint:
- Added the ListSupportedModels API endpoint, returning available models based on whether the user has provided an OpenAI API key (a short sketch of this behavior follows).
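And a rough sketch of the endpoint's described behavior: a default catalogue that widens once the user has supplied their own key. The slugs and display names are illustrative assumptions, not the PR's actual model list:

```go
package chat

// supportedModel stands in for the generated SupportedModel proto message.
type supportedModel struct {
	Slug        string
	DisplayName string
}

// supportedModelsFor returns the models a user may pick from, based purely on
// whether they have supplied their own OpenAI API key.
func supportedModelsFor(hasOwnAPIKey bool) []supportedModel {
	models := []supportedModel{
		{Slug: "gpt-4o-mini", DisplayName: "GPT-4o mini"},
	}
	if hasOwnAPIKey {
		models = append(models,
			supportedModel{Slug: "gpt-4o", DisplayName: "GPT-4o"},
			supportedModel{Slug: "o3-mini", DisplayName: "o3-mini"},
		)
	}
	return models
}
```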
Dependency and generated code updates:
- Updated the openai-go dependency in go.mod to v2.7.1 for compatibility with new model slugs and features.
- Regenerated protobuf files for the auth and chat APIs, updating protoc-gen-go and fixing minor typos.
Note
Adds per-user OpenAI API key and provider config, refactors chat to use it, introduces a models listing API, updates protos, and integrates Grafana Faro with sourcemaps on the frontend.
- Add Settings.OpenAIAPIKey and its mapping; introduce models.LLMProviderConfig.
- Pass llmProvider to the AI client for completions and title generation.
- Add the ListSupportedModels RPC/REST endpoint (GET /_pd/api/v1/chats/models), returning models based on API key presence.
- Support new model slugs (GPT5*, O*, Codex*); update request params handling.
- Bump github.com/openai/openai-go/v2 to v2.7.1.
- Regenerate chat, user, auth, project, comment, shared protos; add SupportedModel / ListSupportedModels*; minor grpc/status text changes.
- Add a settings input for openai_api_key; persist via existing settings APIs.
- Add useListSupportedModelsQuery; map slug↔enum; update the model selector UI.
- Add GRAFANA_API_KEY to the release workflow.
Written by Cursor Bugbot for commit b93918f. This will update automatically on new commits.
Settings.OpenAIAPIKeyand mapping; introducemodels.LLMProviderConfig.llmProviderto AI client for completions/title.ListSupportedModelsRPC/REST (GET /_pd/api/v1/chats/models) returning models based on API key presence.GPT5*,O*,Codex*); update request params handling.github.com/openai/openai-go/v2tov2.7.1.chat,user,auth,project,comment,sharedprotos; addSupportedModel/ListSupportedModels*; minor grpc/status text changes.openai_api_key; persist via existing settings APIs.useListSupportedModelsQuery; map slug↔enum; update model selector UI.GRAFANA_API_KEYin release workflow.Written by Cursor Bugbot for commit b93918f. This will update automatically on new commits. Configure here.