ollama-openai-proxy is a proxy server that emulates the REST API of ollama/ollama and forwards incoming requests to OpenAI. The connection to OpenAI is made via the third-party sashabaranov/go-openai library. This allows you to use OpenAI models from the JetBrains AI Assistant through the proxy.
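The core idea can be sketched as follows. This is a minimal illustration under stated assumptions, not the project's actual code: it handles only Ollama's non-streaming `/api/chat` endpoint, mirrors a subset of the request/response fields, and calls OpenAI through go-openai.

```go
package main

import (
	"context"
	"encoding/json"
	"log"
	"net/http"
	"os"

	openai "github.com/sashabaranov/go-openai"
)

// ollamaChatRequest mirrors the subset of Ollama's /api/chat request body
// used in this sketch; the real API has more fields (stream, options, ...).
type ollamaChatRequest struct {
	Model    string `json:"model"`
	Messages []struct {
		Role    string `json:"role"`
		Content string `json:"content"`
	} `json:"messages"`
}

func main() {
	client := openai.NewClient(os.Getenv("OPENAI_API_KEY"))

	http.HandleFunc("/api/chat", func(w http.ResponseWriter, r *http.Request) {
		var req ollamaChatRequest
		if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}

		// Translate the Ollama-style messages into go-openai's request type.
		msgs := make([]openai.ChatCompletionMessage, 0, len(req.Messages))
		for _, m := range req.Messages {
			msgs = append(msgs, openai.ChatCompletionMessage{Role: m.Role, Content: m.Content})
		}

		// Forward the request to OpenAI via sashabaranov/go-openai.
		resp, err := client.CreateChatCompletion(context.Background(), openai.ChatCompletionRequest{
			Model:    req.Model,
			Messages: msgs,
		})
		if err != nil || len(resp.Choices) == 0 {
			http.Error(w, "upstream request failed", http.StatusBadGateway)
			return
		}

		// Shape the reply the way an Ollama client expects (non-streaming).
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(map[string]any{
			"model":   req.Model,
			"message": map[string]string{"role": "assistant", "content": resp.Choices[0].Message.Content},
			"done":    true,
		})
	})

	log.Fatal(http.ListenAndServe(":11434", nil)) // Ollama's default port
}
```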
```sh
# Create the .env file and fill in the OPENAI_API_KEY variable.
cp -p .env.local .env
docker compose up --build -d
```

- Open Settings.
- Go to Settings > Tools > AI Assistant > Models.
- Check `Enable Ollama` and verify that `Test Connection` succeeds (you can also query the proxy directly, as shown below).
- Select the model you want to use under `Core features` and `Instant helpers`.
- Check `Offline mode`.
- Press `Apply`, then `OK`.
- When you open `AI Chat`, you will be able to select a model from `Ollama`.
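Since the proxy speaks Ollama's REST API, you can also check it from the command line before involving the IDE. A minimal check, assuming the proxy is published on Ollama's default port 11434 (adjust host and port to match your docker-compose configuration):

```sh
# List the models the proxy exposes, as an Ollama client would.
curl http://localhost:11434/api/tags
```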