
Conversation

zerob13 (Collaborator) commented Oct 22, 2025

Summary

  • add ZenMux to the default provider catalog with website references
  • implement a ZenMux provider based on the OpenAI-compatible presenter and register it in the presenter factory
  • include the ZenMux logo in the renderer icon mapping for display

Testing

  • pnpm run typecheck

https://chatgpt.com/codex/tasks/task_e_68f8fb0d2be0832c81511a782b766bc3

Summary by CodeRabbit

Release Notes

  • New Features
    • Added ZenMux as a new supported LLM provider
    • ZenMux models are now available for selection and use
    • Integrated ZenMux visual branding and model labeling

coderabbitai bot (Contributor) commented Oct 22, 2025

Walkthrough

This pull request adds support for a new LLM provider, ZenMux, by registering it in the default provider configuration, implementing a provider class that extends OpenAICompatibleProvider to fetch and annotate models, wiring the provider in the routing layer, and registering an associated icon for the UI.

Changes

Cohort / File(s) — Summary

  • Provider Configuration — src/main/presenter/configPresenter/providers.ts
    Added a ZenMux provider entry to DEFAULT_PROVIDERS with full configuration, including id, name, apiType, baseUrl, websites mapping, and related metadata.
  • Provider Routing & Implementation — src/main/presenter/llmProviderPresenter/index.ts, src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
    Imported ZenmuxProvider and added routing cases for the 'zenmux' provider type. Created a new ZenmuxProvider class extending OpenAICompatibleProvider that overrides fetchOpenAIModels to annotate returned models with a 'ZenMux' group label (see the sketch after this list).
  • UI Icon Registration — src/renderer/src/components/icons/ModelIcon.vue
    Imported zenmux-color.svg and registered it in the icons mapping under the 'zenmux' key for model display in the UI.
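For orientation, here is a minimal sketch of what the new provider class likely looks like, based on the summaries and snippets in this review. The class name, constructor signature, and the 'ZenMux' group annotation come from this PR; the import paths, the MODEL_META type, and the exact fetchOpenAIModels signature are assumptions:

// zenmuxProvider.ts (sketch) — an OpenAI-compatible provider whose only
// customization is labeling fetched models for the UI.
import { OpenAICompatibleProvider } from './openAICompatibleProvider'
import type { LLM_PROVIDER, MODEL_META, IConfigPresenter } from '@shared/presenter'

export class ZenmuxProvider extends OpenAICompatibleProvider {
  constructor(provider: LLM_PROVIDER, configPresenter: IConfigPresenter) {
    super(provider, configPresenter)
  }

  // Reuse the base implementation to call the OpenAI-compatible /models
  // endpoint, then tag every returned model with a 'ZenMux' group so the
  // renderer groups them under the provider name.
  protected async fetchOpenAIModels(options?: { timeout: number }): Promise<MODEL_META[]> {
    const models = await super.fetchOpenAIModels(options)
    return models.map((model) => ({ ...model, group: 'ZenMux' }))
  }
}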

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Changes follow an established provider integration pattern across all modified files. Modifications are straightforward configuration additions and minimal logic overrides with no complex control flow alterations.

Possibly related PRs

  • feat: add Vercel AI Gateway provider support #743: Adds a new OpenAI-compatible LLM provider (Vercel) by updating DEFAULT_PROVIDERS, wiring provider routing, creating an OpenAICompatibleProvider subclass, and registering icons—directly parallel to this ZenMux integration.
  • feat: add jiekou.ai as LLM provider #1041: Implements a new OpenAI-compatible LLM provider using the same code-level pattern of updating DEFAULT_PROVIDERS, adding llmProviderPresenter routing, extending OpenAICompatibleProvider with fetchOpenAIModels override, and registering ModelIcon assets.
  • Feature/302 provider model api #582: Adds new LLM provider implementations with custom model-fetching behavior by extending providers and overriding fetchOpenAIModels, following the same architectural approach.

Poem

🐰 A new friend joins the warren today,
ZenMux hops down the LLM way,
With models grouped and icons bright,
The config chains feel oh-so-right,
Another provider, neat and tidy—
The system grows both deep and wide-y!

Pre-merge checks and finishing touches

✅ Passed checks (3 passed)

  • Description Check — ✅ Passed. Check skipped: CodeRabbit's high-level summary is enabled.
  • Title Check — ✅ Passed. The pull request title "feat: add zenmux provider" matches the main change in the changeset: the entire PR pursues the single, cohesive objective of adding ZenMux provider support across configuration, provider implementation, routing registration, and UI icon integration. The title captures this concisely, and a developer scanning the git history would immediately understand the scope of the change.
  • Docstring Coverage — ✅ Passed. No functions found in the changes; docstring coverage check skipped.
✨ Finishing touches
  • 📝 Generate docstrings
  • 🧪 Generate unit tests (beta)
    • Create PR with unit tests
    • Post copyable unit tests in a comment
    • Commit unit tests in branch codex/add-zenmux-as-new-provider


@zerob13 zerob13 merged commit 9fe8dad into dev Oct 22, 2025
1 of 2 checks passed
@coderabbitai coderabbitai bot (Contributor) left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts (1)

4-7: Consider adding a documentation comment.

The implementation is correct and follows the established pattern for OpenAI-compatible providers. However, adding a brief JSDoc comment explaining what ZenMux is would improve code maintainability.

Apply this diff to add documentation:

+/**
+ * ZenMux provider implementation
+ * ZenMux is an OpenAI-compatible LLM API provider
+ * @see https://zenmux.ai/
+ */
 export class ZenmuxProvider extends OpenAICompatibleProvider {
   constructor(provider: LLM_PROVIDER, configPresenter: IConfigPresenter) {
     super(provider, configPresenter)
   }
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 0e8fa30 and cfef5ad.

⛔ Files ignored due to path filters (1)
  • src/renderer/src/assets/llm-icons/zenmux-color.svg is excluded by !**/*.svg
📒 Files selected for processing (4)
  • src/main/presenter/configPresenter/providers.ts (1 hunks)
  • src/main/presenter/llmProviderPresenter/index.ts (3 hunks)
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts (1 hunks)
  • src/renderer/src/components/icons/ModelIcon.vue (2 hunks)
🧰 Additional context used
📓 Path-based instructions (25)
**/*.{js,jsx,ts,tsx}

📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)

**/*.{js,jsx,ts,tsx}: Use OxLint for code linting
Write logs and comments in English

Files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
src/{main,renderer}/**/*.ts

📄 CodeRabbit inference engine (.cursor/rules/electron-best-practices.mdc)

src/{main,renderer}/**/*.ts: Use context isolation for improved security
Implement proper inter-process communication (IPC) patterns
Optimize application startup time with lazy loading
Implement proper error handling and logging for debugging

Files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
src/main/**/*.ts

📄 CodeRabbit inference engine (.cursor/rules/electron-best-practices.mdc)

Use Electron's built-in APIs for file system and native dialogs

Files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
**/*.{ts,tsx}

📄 CodeRabbit inference engine (.cursor/rules/error-logging.mdc)

**/*.{ts,tsx}: Always use try-catch to handle potential errors
Provide meaningful error messages
Record detailed error logs
Degrade gracefully
Logs should include a timestamp, log level, error code, error description, stack trace (if applicable), and relevant context
Log levels should include ERROR, WARN, INFO, and DEBUG
Do not swallow errors
Provide user-friendly error messages
Implement error retry mechanisms
Avoid logging sensitive information
Use structured logging
Set appropriate log levels

Files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
src/main/**/*.{ts,js,tsx,jsx}

📄 CodeRabbit inference engine (.cursor/rules/project-structure.mdc)

Main-process code goes in src/main

Files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
**/*.{ts,tsx,js,vue}

📄 CodeRabbit inference engine (CLAUDE.md)

Use English for all logs and comments

Files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
  • src/renderer/src/components/icons/ModelIcon.vue
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
**/*.{ts,tsx,vue}

📄 CodeRabbit inference engine (CLAUDE.md)

Enable and adhere to strict TypeScript typing (avoid implicit any, prefer precise types)

Use PascalCase for TypeScript types and classes

Files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
  • src/renderer/src/components/icons/ModelIcon.vue
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
src/main/presenter/configPresenter/providers.ts

📄 CodeRabbit inference engine (CLAUDE.md)

Add provider configuration entries in src/main/presenter/configPresenter/providers.ts

Files:

  • src/main/presenter/configPresenter/providers.ts
src/main/presenter/**/*.ts

📄 CodeRabbit inference engine (AGENTS.md)

Place Electron main-process presenters under src/main/presenter/ (Window, Tab, Thread, Mcp, Config, LLMProvider)

Files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
**/*.{ts,tsx,js,jsx,vue,css,scss,md,json,yml,yaml}

📄 CodeRabbit inference engine (AGENTS.md)

Prettier style: single quotes, no semicolons, print width 100; run pnpm run format

Files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
  • src/renderer/src/components/icons/ModelIcon.vue
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
**/*.{ts,tsx,js,jsx,vue}

📄 CodeRabbit inference engine (AGENTS.md)

**/*.{ts,tsx,js,jsx,vue}: Use OxLint for JS/TS code; keep lint clean
Use camelCase for variables and functions
Use SCREAMING_SNAKE_CASE for constants

Files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
  • src/renderer/src/components/icons/ModelIcon.vue
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
src/main/presenter/llmProviderPresenter/index.ts

📄 CodeRabbit inference engine (.cursor/rules/llm-agent-loop.mdc)

src/main/presenter/llmProviderPresenter/index.ts: src/main/presenter/llmProviderPresenter/index.ts should manage the overall Agent loop, conversation history, tool execution via McpPresenter, and frontend communication via eventBus.
The main Agent loop in llmProviderPresenter/index.ts should handle multi-round LLM calls and tool usage, maintaining conversation state and controlling the loop with needContinueConversation and toolCallCount.
The main Agent loop should send standardized STREAM_EVENTS (RESPONSE, END, ERROR) to the frontend via eventBus.
The main Agent loop should buffer text content, handle tool call events, format tool results for the next LLM call, and manage conversation continuation logic.

Files:

  • src/main/presenter/llmProviderPresenter/index.ts
src/renderer/src/**/*

📄 CodeRabbit inference engine (.cursor/rules/i18n.mdc)

src/renderer/src/**/*: All user-facing strings must use i18n keys (avoid hardcoded user-visible text in code)
Use the 'vue-i18n' framework for all internationalization in the renderer
Ensure all user-visible text in the renderer uses the translation system

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/**/*.{vue,ts,js,tsx,jsx}

📄 CodeRabbit inference engine (.cursor/rules/project-structure.mdc)

Renderer-process code goes in src/renderer

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/src/**/*.{vue,ts,tsx,js,jsx}

📄 CodeRabbit inference engine (.cursor/rules/vue-best-practices.mdc)

src/renderer/src/**/*.{vue,ts,tsx,js,jsx}: Use the Composition API for better code organization and reusability
Implement proper state management with Pinia
Utilize Vue Router for navigation and route management
Leverage Vue's built-in reactivity system for efficient data handling

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/src/**/*.vue

📄 CodeRabbit inference engine (.cursor/rules/vue-best-practices.mdc)

Use scoped styles to prevent CSS conflicts between components

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/**/*.{ts,tsx,vue}

📄 CodeRabbit inference engine (.cursor/rules/vue-shadcn.mdc)

src/renderer/**/*.{ts,tsx,vue}: Use descriptive variable names with auxiliary verbs (e.g., isLoading, hasError).
Use TypeScript for all code; prefer types over interfaces.
Avoid enums; use const objects instead.
Use arrow functions for methods and computed properties.
Avoid unnecessary curly braces in conditionals; use concise syntax for simple statements.

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/**/*.{vue,ts}

📄 CodeRabbit inference engine (.cursor/rules/vue-shadcn.mdc)

Implement lazy loading for routes and components.

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/**/*.{ts,vue}

📄 CodeRabbit inference engine (.cursor/rules/vue-shadcn.mdc)

src/renderer/**/*.{ts,vue}: Use useFetch and useAsyncData for data fetching.
Implement SEO best practices using Nuxt's useHead and useSeoMeta.

Use Pinia for frontend state management (do not introduce alternative state libraries)

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/{src,shell,floating}/**/*.vue

📄 CodeRabbit inference engine (CLAUDE.md)

src/renderer/{src,shell,floating}/**/*.vue: Use Vue 3 Composition API for all components
All user-facing strings must use i18n keys via vue-i18n (no hard-coded UI strings)
Use Tailwind CSS utilities and ensure styles are scoped in Vue components

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/src/components/**/*

📄 CodeRabbit inference engine (CLAUDE.md)

Organize UI components by feature within src/renderer/src/

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/src/**

📄 CodeRabbit inference engine (AGENTS.md)

Place Vue 3 app source under src/renderer/src (components, stores, views, i18n, lib)

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/src/**/*.{vue,ts}

📄 CodeRabbit inference engine (AGENTS.md)

All user-facing strings must use vue-i18n ($t/keys) rather than hardcoded literals

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/renderer/**/*.vue

📄 CodeRabbit inference engine (AGENTS.md)

Name Vue component files in PascalCase (e.g., ChatInput.vue)

Files:

  • src/renderer/src/components/icons/ModelIcon.vue
src/main/presenter/llmProviderPresenter/providers/*.ts

📄 CodeRabbit inference engine (.cursor/rules/llm-agent-loop.mdc)

src/main/presenter/llmProviderPresenter/providers/*.ts: Each file in src/main/presenter/llmProviderPresenter/providers/*.ts should handle interaction with a specific LLM API, including request/response formatting, tool definition conversion, native/non-native tool call management, and standardizing output streams to a common event format.
Provider implementations must use a coreStream method that yields standardized stream events to decouple the main loop from provider-specific details.
The coreStream method in each Provider must perform a single streaming API request per conversation round and must not contain multi-round tool call loop logic.
Provider files should implement helper methods such as formatMessages, convertToProviderTools, parseFunctionCalls, and prepareFunctionCallPrompt as needed for provider-specific logic.
All provider implementations must parse provider-specific data chunks and yield standardized events for text, reasoning, tool calls, usage, errors, stop reasons, and image data.
When a provider does not support native function calling, it must prepare messages using prompt wrapping (e.g., prepareFunctionCallPrompt) before making the API call.
When a provider supports native function calling, MCP tools must be converted to the provider's format (e.g., using convertToProviderTools) and included in the API request.
Provider implementations should aggregate and yield usage events as part of the standardized stream.
Provider implementations should yield image data events in the standardized format when applicable.
Provider implementations should yield reasoning events in the standardized format when applicable.
Provider implementations should yield tool call events (tool_call_start, tool_call_chunk, tool_call_end) in the standardized format.
Provider implementations should yield stop events with appropriate stop_reason in the standardized format.
Provider implementations should yield error events in the standardized format...

Files:

  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
🧠 Learnings (9)
📚 Learning: 2025-09-06T03:07:23.817Z
Learnt from: CR
PR: ThinkInAIXYZ/deepchat#0
File: CLAUDE.md:0-0
Timestamp: 2025-09-06T03:07:23.817Z
Learning: Applies to src/main/presenter/configPresenter/providers.ts : Add provider configuration entries in src/main/presenter/configPresenter/providers.ts

Applied to files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
📚 Learning: 2025-09-06T03:07:23.817Z
Learnt from: CR
PR: ThinkInAIXYZ/deepchat#0
File: CLAUDE.md:0-0
Timestamp: 2025-09-06T03:07:23.817Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : New LLM providers must be added under src/main/presenter/llmProviderPresenter/providers/ as separate files

Applied to files:

  • src/main/presenter/configPresenter/providers.ts
  • src/main/presenter/llmProviderPresenter/index.ts
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
📚 Learning: 2025-10-14T08:02:59.495Z
Learnt from: CR
PR: ThinkInAIXYZ/deepchat#0
File: AGENTS.md:0-0
Timestamp: 2025-10-14T08:02:59.495Z
Learning: Applies to src/main/presenter/LLMProvider/**/*.ts : Implement the two-layer LLM provider (Agent Loop + Provider) under src/main/presenter/LLMProvider

Applied to files:

  • src/main/presenter/llmProviderPresenter/index.ts
  • src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts
📚 Learning: 2025-09-04T11:03:30.184Z
Learnt from: CR
PR: ThinkInAIXYZ/deepchat#0
File: .cursor/rules/provider-guidelines.mdc:0-0
Timestamp: 2025-09-04T11:03:30.184Z
Learning: Integrate via the llmProviderPresenter entry point (src/main/presenter/llmProviderPresenter/index.ts) as the related implementation entry

Applied to files:

  • src/main/presenter/llmProviderPresenter/index.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
PR: ThinkInAIXYZ/deepchat#0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/index.ts : `src/main/presenter/llmProviderPresenter/index.ts` should manage the overall Agent loop, conversation history, tool execution via `McpPresenter`, and frontend communication via `eventBus`.

Applied to files:

  • src/main/presenter/llmProviderPresenter/index.ts
📚 Learning: 2025-07-21T01:46:52.880Z
Learnt from: CR
PR: ThinkInAIXYZ/deepchat#0
File: .cursor/rules/llm-agent-loop.mdc:0-0
Timestamp: 2025-07-21T01:46:52.880Z
Learning: Applies to src/main/presenter/llmProviderPresenter/providers/*.ts : Each file in `src/main/presenter/llmProviderPresenter/providers/*.ts` should handle interaction with a specific LLM API, including request/response formatting, tool definition conversion, native/non-native tool call management, and standardizing output streams to a common event format.

Applied to files:

  • src/main/presenter/llmProviderPresenter/index.ts
📚 Learning: 2025-07-23T00:45:57.322Z
Learnt from: CR
PR: ThinkInAIXYZ/deepchat#0
File: .cursor/rules/vue-shadcn.mdc:0-0
Timestamp: 2025-07-23T00:45:57.322Z
Learning: Applies to src/renderer/**/*.{vue} : Import Icon component from Iconify/Vue.

Applied to files:

  • src/renderer/src/components/icons/ModelIcon.vue
📚 Learning: 2025-07-23T00:45:57.322Z
Learnt from: CR
PR: ThinkInAIXYZ/deepchat#0
File: .cursor/rules/vue-shadcn.mdc:0-0
Timestamp: 2025-07-23T00:45:57.322Z
Learning: Applies to src/renderer/**/*.{vue} : Use Iconify/Vue for icon implementation.

Applied to files:

  • src/renderer/src/components/icons/ModelIcon.vue
📚 Learning: 2025-07-23T00:45:57.322Z
Learnt from: CR
PR: ThinkInAIXYZ/deepchat#0
File: .cursor/rules/vue-shadcn.mdc:0-0
Timestamp: 2025-07-23T00:45:57.322Z
Learning: Applies to src/renderer/**/*.{vue} : Use the Icon component with lucide icons.

Applied to files:

  • src/renderer/src/components/icons/ModelIcon.vue
🧬 Code graph analysis (1)
src/main/presenter/llmProviderPresenter/index.ts (1)
src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts (1)
  • ZenmuxProvider (4-16)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build-check (x64)
🔇 Additional comments (3)
src/main/presenter/llmProviderPresenter/index.ts (1)

52-52: LGTM! Provider registration follows standard pattern.

The ZenMux provider is correctly registered in both the id-based and apiType-based instantiation paths, following the same pattern as other providers like JiekouProvider.

Also applies to: 224-225, 282-283
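
For readers unfamiliar with the factory, the registration described here boils down to two switch cases, roughly as follows. Only the 'zenmux' cases and the ZenmuxProvider constructor signature come from this PR; the function names, import paths, and fallback behavior are illustrative assumptions:

// llmProviderPresenter/index.ts (sketch) — both instantiation paths return
// the same OpenAI-compatible subclass for ZenMux.
import { ZenmuxProvider } from './providers/zenmuxProvider'
import { OpenAICompatibleProvider } from './providers/openAICompatibleProvider'
import type { LLM_PROVIDER, IConfigPresenter } from '@shared/presenter'

// Id-based path: built-in providers are matched by their catalog id.
function createProviderById(provider: LLM_PROVIDER, config: IConfigPresenter) {
  switch (provider.id) {
    case 'zenmux':
      return new ZenmuxProvider(provider, config)
    // ...other built-in provider ids elided...
    default:
      return createProviderByApiType(provider, config)
  }
}

// apiType-based path: custom providers fall back to their declared apiType.
function createProviderByApiType(provider: LLM_PROVIDER, config: IConfigPresenter) {
  switch (provider.apiType) {
    case 'zenmux':
      return new ZenmuxProvider(provider, config)
    // ...other apiType cases elided...
    default:
      return new OpenAICompatibleProvider(provider, config)
  }
}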

src/main/presenter/llmProviderPresenter/providers/zenmuxProvider.ts (1)

9-15: LGTM! Model grouping implementation is correct.

The override of fetchOpenAIModels to annotate models with the 'ZenMux' group follows the standard pattern and will properly group ZenMux models in the UI.

src/renderer/src/components/icons/ModelIcon.vue (1)

66-66: Icon file verified.

The ZenMux icon file exists at the expected location: src/renderer/src/assets/llm-icons/zenmux-color.svg. The import statement and mapping follow the correct pattern. No issues found.
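
Concretely, the registration in ModelIcon.vue presumably amounts to one import and one map entry, along these lines (the map name and its exact type are assumptions; the asset path and 'zenmux' key are from this review):

// ModelIcon.vue, <script setup lang="ts"> excerpt (sketch)
import zenmuxIcon from '@/assets/llm-icons/zenmux-color.svg'

// Lowercase provider key -> bundled SVG asset used by the model icon lookup
const icons: Record<string, string> = {
  // ...existing provider entries...
  zenmux: zenmuxIcon
}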

Comment on lines +112 to +126
  {
    id: 'zenmux',
    name: 'ZenMux',
    apiType: 'zenmux',
    apiKey: '',
    baseUrl: 'https://zenmux.ai/api/v1/',
    enable: false,
    websites: {
      official: 'https://zenmux.ai/',
      apiKey: 'https://zenmux.ai/settings/keys',
      docs: 'https://docs.zenmux.ai/api/openai/create-chat-completion.html',
      models: 'https://docs.zenmux.ai/api/openai/create-chat-completion.html',
      defaultBaseUrl: 'https://zenmux.ai/api/v1/'
    }
  },

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Verify ZenMux URLs are accessible and correct.

The provider configuration looks well-structured. However, please verify that all URLs (official, apiKey, docs, models, defaultBaseUrl) are accessible and point to the correct resources.

Run the following script to check the URL availability:


🏁 Script executed:

#!/bin/bash
# Description: Verify ZenMux URLs are accessible

echo "Checking ZenMux URLs..."
curl -s -o /dev/null -w "Official: %{http_code}\n" https://zenmux.ai/
curl -s -o /dev/null -w "API Keys: %{http_code}\n" https://zenmux.ai/settings/keys
curl -s -o /dev/null -w "API Docs: %{http_code}\n" https://docs.zenmux.ai/api/openai/create-chat-completion.html
curl -s -o /dev/null -w "Base URL: %{http_code}\n" https://zenmux.ai/api/v1/

Length of output: 79


Fix the baseUrl endpoint, which returns a 404 error.

The API endpoint https://zenmux.ai/api/v1/ (used for both baseUrl and defaultBaseUrl) responds with a 404. The official website and documentation pages are accessible, so the configured API endpoint path appears to be incorrect. Verify the correct ZenMux API root in their official documentation and update the configuration accordingly.

🤖 Prompt for AI Agents
In src/main/presenter/configPresenter/providers.ts around lines 112 to 126, the
configured ZenMux API paths are incorrect and return 404; verify the official
ZenMux API root in their documentation and update both the provider's baseUrl
and websites.defaultBaseUrl to that documented API root (replace the current
'https://zenmux.ai/api/v1/' value with the correct endpoint), ensuring the
provider's baseUrl ends with a trailing slash and keeping websites.apiKey, docs,
and models URLs unchanged.

@zerob13 zerob13 deleted the codex/add-zenmux-as-new-provider branch November 6, 2025 10:52
