opencode-provider-config

What it is

A quick tool to generate an OpenCode provider configuration file by querying an OpenAI-compatible API endpoint.

  • Queries the /models endpoint of any OpenAI-compatible API (e.g., LiteLLM, vLLM, OpenRouter)
  • Discovers all available models automatically
  • Generates a valid OpenCode opencode.json provider config

What it isn't

At this stage, it isn't much more than what it is.

Installation

Download the latest binary from Releases for macOS (Apple Silicon or Intel), Windows, or Linux. For example:

# Linux
curl -LO https://github.com/torquemad/opencode-provider-config/releases/latest/download/opencode-provider-config-linux-amd64
chmod +x opencode-provider-config-linux-amd64

Usage

export LITELLM_API_KEY="your-api-key"

# Basic usage
./opencode-provider-config --base-url https://your-llm-proxy.com/v1 output.json

# With custom provider name and key, writing to OpenCode config location
./opencode-provider-config \
  --base-url https://your-llm-proxy.com/v1 \
  --provider-name "My LLM Proxy" \
  --provider-key my-proxy \
  ~/.config/opencode/opencode.json

Or clone this repo and run from source:

go build -o opencode-provider-config .
./opencode-provider-config \
  --base-url https://your-llm-proxy.com/v1 \
  --provider-name "My LLM Proxy" \
  --provider-key my-proxy \
  ~/.config/opencode/opencode.json

Output

The output JSON file can be placed in any OpenCode config location, such as ~/.config/opencode/opencode.json for the global user config.

Example output:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "litellm": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LiteLLM",
      "options": {
        "baseURL": "https://your-llm-proxy.com/v1"
      },
      "models": {
        "gpt-4o": { "name": "gpt-4o" },
        "claude-3-5-sonnet": { "name": "claude-3-5-sonnet" }
      }
    }
  }
}

Contributing + Releases

Open a PR; once it's merged, tag a release:

make release VERSION=1.0.0  # explicit version
make release-major          # bump major (1.0.0 → 2.0.0)
make release-minor          # bump minor (1.0.0 → 1.1.0)
make release-patch          # bump patch (1.0.0 → 1.0.1)

The release target builds binaries for linux/darwin/windows, generates checksums, tags the commit, and creates a GitHub Release.

License

Apache 2.0
