Update groq.ts #1776

Merged
merged 2 commits on Aug 6, 2025
5 changes: 5 additions & 0 deletions .changeset/moody-dodos-rhyme.md
@@ -0,0 +1,5 @@
---
"kilo-code": patch
---

add GPT-OSS 120b and 20b models to Groq provider
20 changes: 20 additions & 0 deletions packages/types/src/providers/groq.ts
@@ -11,6 +11,8 @@ export type GroqModelId =
| "qwen/qwen3-32b"
| "deepseek-r1-distill-llama-70b"
| "moonshotai/kimi-k2-instruct"
| "openai/gpt-oss-120b"
| "openai/gpt-oss-20b"

export const groqDefaultModelId: GroqModelId = "llama-3.3-70b-versatile" // Defaulting to Llama 3.3 70B Versatile

@@ -97,4 +99,22 @@ export const groqModels = {
outputPrice: 3.0,
description: "Moonshot AI Kimi K2 Instruct 1T model, 128K context.",
},
"openai/gpt-oss-120b": {
maxTokens: 32768,
contextWindow: 131072,
supportsImages: false,
supportsPromptCache: false,
inputPrice: 0.15,
outputPrice: 0.75,
description: "GPT-OSS 120B is OpenAI's flagship open source model, built on a Mixture-of-Experts (MoE) architecture with 20 billion parameters and 128 experts.",
},
"openai/gpt-oss-20b": {
maxTokens: 32768,
contextWindow: 131072,
supportsImages: false,
supportsPromptCache: false,
inputPrice: 0.10,
outputPrice: 0.50,
description: "GPT-OSS 20B is OpenAI's flagship open source model, built on a Mixture-of-Experts (MoE) architecture with 20 billion parameters and 32 experts.",
}
} as const satisfies Record<string, ModelInfo>
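
A minimal sketch of how a consumer could resolve one of the newly added model IDs against this table, assuming the relative import path `./groq` and the exports (`GroqModelId`, `groqDefaultModelId`, `groqModels`) shown in the diff; the `resolveGroqModel` helper is hypothetical, not part of this PR:

```ts
import { groqModels, groqDefaultModelId, type GroqModelId } from "./groq"

// Return the requested model if it exists in the Groq table, otherwise fall back
// to the provider default (llama-3.3-70b-versatile).
function resolveGroqModel(requested?: string): GroqModelId {
	if (requested && requested in groqModels) {
		return requested as GroqModelId
	}
	return groqDefaultModelId
}

const modelId = resolveGroqModel("openai/gpt-oss-120b")
const info = groqModels[modelId]

console.log(info.contextWindow) // 131072
console.log(info.maxTokens) // 32768
console.log(info.inputPrice, info.outputPrice) // 0.15 0.75 (per the values in the diff)
```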