The following LLM providers are currently supported in addition to OpenAI:
- Ollama <https://github.com/ollama/ollama> for local/offline open-source models. The plugin assumes you have the Ollama service up and running with configured models available (the default Ollama agent uses Llama3).
- GitHub Copilot <https://github.com/settings/copilot> with a Copilot license (zbirenbaum/copilot.lua <https://github.com/zbirenbaum/copilot.lua> or github/copilot.vim <https://github.com/github/copilot.vim> for autocomplete). You can access the underlying GPT-4 model without paying anything extra (essentially unlimited GPT-4 access).
- Perplexity.ai <https://www.perplexity.ai/pro> Pro users have $5/month free API credits available (the default PPLX agent uses Mixtral-8x7b).
- Anthropic <https://www.anthropic.com/api> to access Claude models, which currently outperform GPT-4 in some benchmarks.
- Google Gemini <https://ai.google.dev/> with a quite generous free tier, but with some geo-restrictions (e.g. the EU).
- Any other "OpenAI chat/completions" compatible endpoint (Azure, LM Studio, etc.)
Below is an example of the relevant part of the configuration enabling some of these providers.
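
The following Lua sketch is illustrative only: the module name (`gp`), the `providers` table, and the `endpoint`/`secret` keys are assumptions based on the provider list above, so check the plugin's documentation for the exact field names and endpoints.

```lua
-- Illustrative sketch: module name and table keys are assumptions,
-- not guaranteed to match the plugin's actual API.
require("gp").setup({
  providers = {
    openai = {
      endpoint = "https://api.openai.com/v1/chat/completions",
      secret = os.getenv("OPENAI_API_KEY"),
    },
    -- local/offline models served by a running Ollama instance
    ollama = {
      endpoint = "http://localhost:11434/v1/chat/completions",
    },
    -- Perplexity.ai (Pro users get monthly API credits)
    pplx = {
      endpoint = "https://api.perplexity.ai/chat/completions",
      secret = os.getenv("PPLX_API_KEY"),
    },
    -- Anthropic Claude models
    anthropic = {
      endpoint = "https://api.anthropic.com/v1/messages",
      secret = os.getenv("ANTHROPIC_API_KEY"),
    },
    -- any other "OpenAI chat/completions" compatible endpoint,
    -- e.g. an LM Studio server running locally
    lmstudio = {
      endpoint = "http://localhost:1234/v1/chat/completions",
    },
  },
})
```
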
The `secret` field has the same capabilities as `openai_api_key` (which is