
Conversation


@Junyi-99 Junyi-99 commented Dec 11, 2025

This pull request adds support for user-specific OpenAI API keys and custom LLM provider configurations, letting users select and use different language models with their own API credentials. It refactors the chat message creation and streaming logic to use these settings, updates the underlying OpenAI client management, and adds an endpoint for listing supported models. It also updates protobuf-generated files and fixes minor code issues.

User settings and LLM provider support:

  • Added OpenAIAPIKey to the Settings model and its mapping logic, enabling users to store and use their own OpenAI API keys.
  • Introduced the LLMProviderConfig struct to encapsulate endpoint and API key configuration for LLM API calls, with logic to determine if custom settings are used.
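The config described above might look roughly like the following sketch. The actual field and method names in internal/models/llm_provider.go are not shown in this summary, so the names here are assumptions:

```go
package main

// LLMProviderConfig is a sketch of the struct described above: it bundles
// the endpoint and API key for one LLM call. Field names are illustrative.
type LLMProviderConfig struct {
	BaseURL string // custom endpoint; empty means use the server default
	APIKey  string // user-supplied OpenAI API key; empty means the server key
}

// UseCustomSettings reports whether the user supplied their own
// endpoint or credentials, mirroring the "logic to determine if
// custom settings are used" mentioned above.
func (c LLMProviderConfig) UseCustomSettings() bool {
	return c.BaseURL != "" || c.APIKey != ""
}
```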

Chat API and OpenAI client refactoring:

  • Refactored chat message creation and streaming (create_conversation_message.go, create_conversation_message_stream.go) to retrieve user settings, construct an LLMProviderConfig, and pass it to the AI client methods, enabling per-user model selection and authentication.
  • Updated the AI client methods (ChatCompletion, ChatCompletionStream, GetConversationTitle) to accept and use LLMProviderConfig, and added logic to set the OpenAI client based on this config.
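A minimal sketch of the per-request resolution this refactor implies, using a stand-in client type rather than the real openai-go client; the type and helper names here are hypothetical:

```go
package main

// aiClient stands in for the real AI client; it holds the server-wide
// defaults that apply when a user has not configured anything.
type aiClient struct {
	defaultBaseURL string
	defaultAPIKey  string
}

// llmProvider mirrors the per-user config passed in by the chat handlers.
type llmProvider struct {
	BaseURL string
	APIKey  string
}

// resolve picks the endpoint and key for one request: user-supplied
// values win, otherwise the server-wide defaults apply. The real code
// would then build the OpenAI client from these values per request.
func (c *aiClient) resolve(p llmProvider) (baseURL, apiKey string) {
	baseURL, apiKey = c.defaultBaseURL, c.defaultAPIKey
	if p.BaseURL != "" {
		baseURL = p.BaseURL
	}
	if p.APIKey != "" {
		apiKey = p.APIKey
	}
	return baseURL, apiKey
}
```

Resolving once per request, as the PR's SetOpenAIClient change suggests, avoids keeping a single fixed client that cannot honor per-user credentials.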

Supported models endpoint:

  • Added ListSupportedModels API endpoint, returning available models based on whether the user has provided an OpenAI API key.
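The gating behavior can be sketched as follows; the function name and model slugs are illustrative, not the endpoint's actual catalogue:

```go
package main

// listSupportedModels sketches the endpoint's logic: with a user API key
// present, the wider catalogue is offered; otherwise only the models the
// server funds by default. Slugs here are placeholders.
func listSupportedModels(hasUserAPIKey bool) []string {
	base := []string{"gpt-4o-mini"}
	if !hasUserAPIKey {
		return base
	}
	return append(base, "gpt-5", "o3")
}
```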

Dependency and generated code updates:

  • Upgraded openai-go dependency in go.mod to v2.7.1 for compatibility with new model slugs and features.
  • Regenerated protobuf files for the auth and chat APIs, updating protoc-gen-go and fixing minor typos.

Note

Adds per-user OpenAI API key and provider config, refactors chat to use it, introduces a models listing API, updates protos, and integrates Grafana Faro with sourcemaps on the frontend.

  • Backend/Chat:
    • Add Settings.OpenAIAPIKey and mapping; introduce models.LLMProviderConfig.
    • Refactor chat flow to fetch user settings and pass llmProvider to AI client for completions/title.
    • AI client builds OpenAI client per-request from config; remove fixed client usage.
    • New ListSupportedModels RPC/REST (GET /_pd/api/v1/chats/models) returning models based on API key presence.
    • Extend language model enums/mappings (e.g., GPT5*, O*, Codex*); update request params handling.
    • Bump github.com/openai/openai-go/v2 to v2.7.1.
  • Protobuf/Generated:
    • Regenerate chat, user, auth, project, comment, shared protos; add SupportedModel/ListSupportedModels*; minor grpc/status text changes.
  • Frontend:
    • Add Settings UI for openai_api_key; persist via existing settings APIs.
    • Fetch models via new useListSupportedModelsQuery; map slug↔enum; update model selector UI.
    • Improve stream error handling by appending assistant error message; minor UI spacing/labels tweaks.
    • Integrate Grafana Faro (web SDK + tracing) and Vite plugin for sourcemap upload; enable sourcemaps; export GRAFANA_API_KEY in release workflow.
  • Misc:
    • Small fixes in query functions, styling, and keys.

Written by Cursor Bugbot for commit b93918f.

Copilot AI review requested due to automatic review settings December 11, 2025 12:21

Copilot AI left a comment

Pull request overview

This pull request implements user-specific OpenAI API key support and custom LLM provider configurations, enabling users to utilize different language models with their own credentials. The changes include frontend settings UI for API key management, backend refactoring for per-user model selection and authentication, a new endpoint for listing supported models, and comprehensive protobuf updates.

Key changes:

  • Added user-configurable OpenAI API keys with secure input handling in settings
  • Implemented LLMProviderConfig for flexible LLM endpoint and authentication configuration
  • Refactored chat API to support per-user model selection and custom API credentials
  • Added ListSupportedModels endpoint that returns available models based on user configuration
  • Upgraded OpenAI Go SDK to v2.7.1 and regenerated protobuf files with updated tooling

Reviewed changes

Copilot reviewed 50 out of 51 changed files in this pull request and generated 6 comments.

Summary per file:

  • webapp/_webapp/src/views/settings/setting-text-input.tsx: new reusable text input component with password masking for settings
  • webapp/_webapp/src/views/settings/sections/api-key-settings.tsx: settings section for OpenAI API key configuration
  • webapp/_webapp/src/hooks/useLanguageModels.ts: refactored to load supported models dynamically from the backend API
  • webapp/_webapp/src/views/chat/footer/toolbar/selection.tsx: selection UI extended to display a subtitle for model slugs
  • webapp/_webapp/src/stores/conversation/handlers/handleStreamError.ts: improved error handling with visible error messages in chat
  • internal/models/llm_provider.go: new model for LLM provider configuration
  • internal/services/toolkit/client/client.go: added SetOpenAIClient method for dynamic client configuration
  • internal/api/chat/list_supported_models.go: new endpoint returning available models based on user settings
  • internal/api/chat/create_conversation_message_stream.go: refactored to use LLMProviderConfig for per-user authentication
  • internal/api/chat/create_conversation_message.go: updated to pass user settings to AI client methods
  • proto/user/v1/user.proto: added openai_api_key field to the Settings message
  • proto/chat/v1/chat.proto: added ListSupportedModels RPC and SupportedModel message
  • go.mod: updated the openai-go dependency from v2.1.1 to v2.7.1
  • various generated protobuf files: regenerated with protoc-gen-go v1.36.10 and protoc-gen-go-grpc v1.6.0


@Junyi-99 Junyi-99 requested a review from imwithye December 11, 2025 12:54
@Junyi-99 Junyi-99 self-assigned this Dec 11, 2025
@Junyi-99 Junyi-99 moved this from Backlog to In review in Project PaperDebugger Dec 11, 2025
@Junyi-99 Junyi-99 linked an issue Dec 11, 2025 that may be closed by this pull request

@Junyi-99 Junyi-99 left a comment

self reviewed

@Junyi-99 Junyi-99 merged commit 2fba738 into main Dec 11, 2025
3 checks passed
@Junyi-99 Junyi-99 deleted the fix-no-document-bug branch December 11, 2025 14:27
@github-project-automation github-project-automation bot moved this from In review to Done in Project PaperDebugger Dec 11, 2025

Development

Successfully merging this pull request may close these issues.

Will this project support more advanced LLMs (Claude-4.5, GPT-5.1, etc.) in the future? How to solve "Chat Error"?
