Summary
Codingbuddy serves as a multi-AI rules MCP server for all AI assistants (Cursor, Claude Code, GitHub Copilot/Codex, Antigravity, Amazon Q, Kiro), but the model configuration system is currently hard-coded for Anthropic Claude only. This creates a poor experience for users of other AI providers.
Problem
1. model.constants.ts — Claude-only constants
File: apps/mcp-server/src/model/model.constants.ts
Currently only defines Claude models:
```ts
export const CLAUDE_OPUS_4 = 'claude-opus-4-20250514';
export const CLAUDE_SONNET_4 = 'claude-sonnet-4-20250514';
export const CLAUDE_HAIKU_35 = 'claude-haiku-3-5-20241022';
export const DEFAULT_MODEL = CLAUDE_SONNET_4;
```
No constants exist for OpenAI (`gpt-4o`, `o3-mini`, `o1-mini`) or Google (`gemini-2.5-pro`, `gemini-2.5-flash`) models.
2. model.resolver.ts — False "Unknown model" warnings for GPT/Gemini
File: apps/mcp-server/src/model/model.resolver.ts
```ts
export const KNOWN_MODEL_PREFIXES = [
'claude-opus-4',
'claude-sonnet-4',
'claude-sonnet-3',
'claude-haiku-3',
] as const;
```
When a user sets `ai.defaultModel: "gpt-4o"` in `codingbuddy.config.json`, `resolveModel()` returns a warning:

> Unknown model ID: "gpt-4o". Known prefixes: claude-opus-4, claude-sonnet-4, ...

The model still works, but the warning is misleading for valid OpenAI/Google models.
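The warning follows from a simple prefix match. A minimal sketch of the check (a hypothetical reconstruction — the actual `resolveModel()` internals may differ) shows why every non-Claude ID trips it:

```typescript
// Hypothetical reconstruction of the prefix check behind the warning;
// the real model.resolver.ts may differ, but the effect is the same.
const KNOWN_MODEL_PREFIXES = [
  'claude-opus-4',
  'claude-sonnet-4',
  'claude-sonnet-3',
  'claude-haiku-3',
] as const;

function isKnownModel(modelId: string): boolean {
  // Every non-Claude ID fails this check, so valid OpenAI/Google
  // models like 'gpt-4o' produce an "Unknown model ID" warning.
  return KNOWN_MODEL_PREFIXES.some((prefix) => modelId.startsWith(prefix));
}

console.log(isKnownModel('claude-sonnet-4-20250514')); // true
console.log(isKnownModel('gpt-4o')); // false → triggers the misleading warning
```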
3. model-prompt.ts — CLI init only offers Claude choices
File: apps/mcp-server/src/cli/init/prompts/model-prompt.ts
```ts
export function getModelChoices(): ModelChoice[] {
return [
{ name: 'Claude Sonnet 4 (Recommended)', value: CLAUDE_SONNET_4, ... },
{ name: 'Claude Opus 4', value: CLAUDE_OPUS_4, ... },
{ name: 'Claude Haiku 3.5 (Not recommended)', value: CLAUDE_HAIKU_35, ... },
];
}
```
During `codingbuddy init`, Codex/GPT/Gemini users see only Claude options with no way to select their preferred model interactively.
Proposed Solution
Step 1 — Expand model.constants.ts with multi-provider constants
Add constants for OpenAI and Google models alongside the existing Claude constants:
```ts
// Existing Claude constants (keep as-is)
export const CLAUDE_OPUS_4 = 'claude-opus-4-20250514';
export const CLAUDE_SONNET_4 = 'claude-sonnet-4-20250514';
export const CLAUDE_HAIKU_35 = 'claude-haiku-3-5-20241022';
// OpenAI models
export const GPT_4O = 'gpt-4o';
export const GPT_4O_MINI = 'gpt-4o-mini';
export const O3_MINI = 'o3-mini';
export const O1_MINI = 'o1-mini';
// Google models
export const GEMINI_25_PRO = 'gemini-2.5-pro';
export const GEMINI_25_FLASH = 'gemini-2.5-flash';
// Default remains Claude Sonnet (backward compatible)
export const DEFAULT_MODEL = CLAUDE_SONNET_4;
```
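Once the constants span three providers, a small helper (hypothetical — not part of the proposed file changes) could derive the provider from a model ID, which may be useful for grouping CLI choices or provider-specific logic later:

```typescript
// Constants from the proposed model.constants.ts (subset).
const CLAUDE_SONNET_4 = 'claude-sonnet-4-20250514';
const GPT_4O = 'gpt-4o';
const GEMINI_25_PRO = 'gemini-2.5-pro';

type Provider = 'anthropic' | 'openai' | 'google' | 'unknown';

// Hypothetical helper: maps a model ID to its provider by prefix.
function providerOf(modelId: string): Provider {
  if (modelId.startsWith('claude-')) return 'anthropic';
  if (modelId.startsWith('gpt-') || /^o\d-/.test(modelId)) return 'openai'; // o1-, o3-, ...
  if (modelId.startsWith('gemini-')) return 'google';
  return 'unknown';
}

console.log(providerOf(CLAUDE_SONNET_4)); // 'anthropic'
console.log(providerOf(GPT_4O)); // 'openai'
console.log(providerOf(GEMINI_25_PRO)); // 'google'
```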
Step 2 — Expand KNOWN_MODEL_PREFIXES in model.resolver.ts
```ts
export const KNOWN_MODEL_PREFIXES = [
// Anthropic Claude
'claude-opus-4',
'claude-sonnet-4',
'claude-sonnet-3',
'claude-haiku-3',
// OpenAI
'gpt-4',
'gpt-3.5',
'o1-',
'o3-',
// Google
'gemini-',
] as const;
```
Also remove the Haiku-specific deprecation logic (currently tied to Claude), or make it provider-agnostic.
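Assuming the resolver keeps its prefix-matching approach, the expanded list makes the new IDs pass the check while genuinely unknown IDs still warn (a sketch, not the actual implementation):

```typescript
// Expanded prefix list from Step 2; isKnownModel is a sketch of the
// presumed prefix check in model.resolver.ts.
const KNOWN_MODEL_PREFIXES = [
  // Anthropic Claude
  'claude-opus-4', 'claude-sonnet-4', 'claude-sonnet-3', 'claude-haiku-3',
  // OpenAI
  'gpt-4', 'gpt-3.5', 'o1-', 'o3-',
  // Google
  'gemini-',
] as const;

function isKnownModel(modelId: string): boolean {
  return KNOWN_MODEL_PREFIXES.some((prefix) => modelId.startsWith(prefix));
}

console.log(isKnownModel('gpt-4o')); // true
console.log(isKnownModel('gemini-2.5-flash')); // true
console.log(isKnownModel('truly-unknown-model')); // false — still warns
```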
Step 3 — Update model-prompt.ts CLI with multi-provider choices
Group choices by provider so Codex/GPT users can select their model during `codingbuddy init`:
```ts
export function getModelChoices(): ModelChoice[] {
return [
// Anthropic (default)
{ name: 'Claude Sonnet 4 (Recommended)', value: CLAUDE_SONNET_4, description: 'Balanced performance and cost · Anthropic' },
{ name: 'Claude Opus 4', value: CLAUDE_OPUS_4, description: 'Most capable · Anthropic' },
// OpenAI
{ name: 'GPT-4o', value: GPT_4O, description: 'Flagship multimodal model · OpenAI' },
{ name: 'GPT-4o mini', value: GPT_4O_MINI, description: 'Fast and affordable · OpenAI' },
{ name: 'o3-mini', value: O3_MINI, description: 'Reasoning model · OpenAI' },
// Google
{ name: 'Gemini 2.5 Pro', value: GEMINI_25_PRO, description: 'Advanced reasoning · Google' },
{ name: 'Gemini 2.5 Flash', value: GEMINI_25_FLASH, description: 'Fast and efficient · Google' },
// Escape hatch
    { name: 'Other (enter manually)', value: '__custom__', description: 'Any model ID not listed above' },
];
}
```
When `__custom__` is selected, follow up with an `input` prompt to collect a free-text model ID.
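One testable way to structure the `__custom__` escape hatch is to inject the prompt functions, so that the real `select`/`input` from `@inquirer/prompts` can be swapped for stubs in tests. The names and shapes below are illustrative, not the actual `model-prompt.ts` API:

```typescript
// Illustrative sketch of the __custom__ escape hatch. The prompt
// functions are injected so tests can stub them; in production they
// would wrap `select` and `input` from @inquirer/prompts.
interface PromptDeps {
  select: (message: string) => Promise<string>; // resolves to a choice value
  input: (message: string) => Promise<string>;  // resolves to free text
}

async function promptModelSelection(deps: PromptDeps): Promise<string> {
  const value = await deps.select('Which model should be the default?');
  if (value === '__custom__') {
    // Escape hatch: collect any model ID not in the predefined list.
    return deps.input('Enter a model ID:');
  }
  return value;
}

// Example: a user picks "Other" and types a custom ID.
promptModelSelection({
  select: async () => '__custom__',
  input: async () => 'my-org/fine-tuned-model',
}).then((model) => console.log(model));
```

Keeping the selection logic pure like this makes the ≥ 90% coverage target straightforward to hit without mocking the terminal.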
Files to Change
| File | Change |
|---|---|
| `apps/mcp-server/src/model/model.constants.ts` | Add OpenAI and Google model constants |
| `apps/mcp-server/src/model/model.constants.spec.ts` | Add tests for new constants |
| `apps/mcp-server/src/model/model.resolver.ts` | Expand `KNOWN_MODEL_PREFIXES` |
| `apps/mcp-server/src/model/model.resolver.spec.ts` | Add tests for GPT/Gemini prefix recognition |
| `apps/mcp-server/src/cli/init/prompts/model-prompt.ts` | Multi-provider choices + `__custom__` handler |
| `apps/mcp-server/src/cli/init/prompts/model-prompt.spec.ts` | Update/add tests for new choices |
Acceptance Criteria
model.constants.ts
- [ ] `GPT_4O`, `GPT_4O_MINI`, `O3_MINI`, `O1_MINI` constants are exported
- [ ] `GEMINI_25_PRO`, `GEMINI_25_FLASH` constants are exported
- [ ] `DEFAULT_MODEL` remains `claude-sonnet-4-20250514` (backward compatible)

model.resolver.ts
- [ ] `isKnownModel('gpt-4o')` returns `true`
- [ ] `isKnownModel('gpt-4o-mini')` returns `true`
- [ ] `isKnownModel('o3-mini')` returns `true`
- [ ] `isKnownModel('o1-mini')` returns `true`
- [ ] `isKnownModel('gemini-2.5-pro')` returns `true`
- [ ] `isKnownModel('gemini-2.5-flash')` returns `true`
- [ ] `resolveModel({ globalDefaultModel: 'gpt-4o' })` returns `{ model: 'gpt-4o', source: 'global' }` with no warning
- [ ] `resolveModel({ globalDefaultModel: 'truly-unknown-model' })` still returns a warning

model-prompt.ts
- [ ] `getModelChoices()` returns at least 7 choices (3 Anthropic + 2 OpenAI minimum + custom)
- [ ] At least one choice has a `value` starting with `gpt-`
- [ ] At least one choice has a `value` starting with `gemini-`
- [ ] A `__custom__` escape-hatch choice exists
- [ ] `promptModelSelection()` handles `__custom__` by triggering a free-text `input` prompt
- [ ] `DEFAULT_MODEL_CHOICE` still defaults to `claude-sonnet-4-20250514`
Constraints
- No breaking changes: `DEFAULT_MODEL` stays as Claude Sonnet 4. Existing Claude users are unaffected.
- No new dependencies: use the existing `@inquirer/prompts` (`select` + `input`) for the custom model prompt.
- TDD: write failing tests first (RED), then implement (GREEN), then refactor.
- Test coverage: maintain ≥ 90% coverage on all changed files.
Context
This is a pure model-system change. It does not affect:
- The MCP protocol handlers
- Agent definitions in `.ai-rules/agents/`
- The config schema (users already set `ai.defaultModel` freely; this just removes false warnings and improves CLI UX)
- `crush.json` / `opencode.json` (personal dev config files, out of scope)

The `resolveModel()` function already handles unknown models gracefully (it passes them through with a warning). This issue simply expands the "known" list and improves the CLI selection experience.