docs: ollama as self-hosted llm
Use ollama/ollama as an example of a self-hosted LLM.
Realiserad committed Feb 10, 2024
1 parent af603b1 commit 87a37ae
Showing 1 changed file with 3 additions and 3 deletions: README.md
````diff
@@ -14,16 +14,16 @@ If you like it, please add a ⭐.
 
 1. Create a configuration file called `.config/fish-ai.ini`.
 
-   If you use [a self-hosted LLM](https://github.com/getumbrel/llama-gpt), e.g. `code-llama-13b-chat.gguf`:
+   If you use [a self-hosted LLM](https://github.com/ollama/ollama), e.g. [`codellama`](https://ollama.com/library/codellama):
 
    ```ini
    [fish-ai]
    configuraton = self-hosted
 
    [self-hosted]
    provider = self-hosted
-   server = https://your-server/v1
-   model = code-llama-13b-chat.gguf
+   server = http://localhost:11434/v1
+   model = codellama
    ```
 
    If you use [OpenAI](https://platform.openai.com/login):
````
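The new configuration can be sanity-checked with Python's standard `configparser` before pointing `fish-ai` at a local Ollama server. This is a minimal sketch, not part of the commit; the key names (including the `configuraton` spelling) are reproduced exactly as they appear in the diff:

```python
import configparser

# The self-hosted configuration from the diff above, as it would
# appear in .config/fish-ai.ini after this commit.
CONFIG = """\
[fish-ai]
configuraton = self-hosted

[self-hosted]
provider = self-hosted
server = http://localhost:11434/v1
model = codellama
"""

parser = configparser.ConfigParser()
parser.read_string(CONFIG)

# The [fish-ai] section names the active configuration section;
# look up the server and model from that section.
active = parser.get("fish-ai", "configuraton")
print(parser.get(active, "server"))  # http://localhost:11434/v1
print(parser.get(active, "model"))   # codellama
```

Port 11434 is Ollama's default listen port, and the `/v1` path is its OpenAI-compatible API endpoint, which is why only the `server` and `model` values change when switching from llama-gpt to Ollama.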
