
Commit 8892cee

minor #607 [Examples] Consistency about Ollama model config via env, and wrap up docs (chr-hertel)
This PR was merged into the main branch.

Discussion
----------

[Examples] Consistency about Ollama model config via env, and wrap up docs

| Q             | A
| ------------- | ---
| Bug fix?      | no
| New feature?  | no
| Docs?         | yes
| Issues        |
| License       | MIT

Following #562 and #563

Commits
-------

5292d15 Consistency about Ollama model config via env, and wrap up docs.
2 parents 4e81587 + 5292d15 commit 8892cee

File tree

9 files changed, +23 -16 lines changed


examples/.env

Lines changed: 3 additions & 2 deletions
@@ -16,8 +16,9 @@ VOYAGE_API_KEY=
 REPLICATE_API_KEY=

 # For using Ollama
-OLLAMA_HOST_URL=
-OLLAMA_MODEL=
+OLLAMA_HOST_URL=http://localhost:11434
+OLLAMA_LLM=llama3.2
+OLLAMA_EMBEDDINGS=nomic-embed-text

 # For using GPT on Azure
 AZURE_OPENAI_BASEURL=
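These are the defaults the example scripts now read. Taken together with the PHP hunks further down, the variables are consumed roughly like this (a sketch assembled from those hunks; it assumes `bootstrap.php` provides autoloading plus the `env()` and `http_client()` helpers, as in the changed example scripts):

```php
<?php

// Sketch assembled from the hunks below; bootstrap.php is assumed to set up
// autoloading and the env()/http_client() helpers used by the examples.
require_once dirname(__DIR__).'/bootstrap.php';

// OLLAMA_HOST_URL points at the running Ollama server (default http://localhost:11434).
$platform = PlatformFactory::create(env('OLLAMA_HOST_URL'), http_client());

// OLLAMA_LLM selects the chat model, OLLAMA_EMBEDDINGS the embedding model.
$llm = new Ollama(env('OLLAMA_LLM'));               // e.g. llama3.2
$embeddings = new Ollama(env('OLLAMA_EMBEDDINGS')); // e.g. nomic-embed-text
```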

examples/ollama/Ollama.md renamed to examples/ollama/README.md

Lines changed: 12 additions & 6 deletions
@@ -11,20 +11,26 @@ To get started with Ollama please check their [Quickstart guide](https://github.
 To run the examples you will need to download [Llama 3.2](https://ollama.com/library/llama3.2)
 and [nomic-embed-text](https://ollama.com/library/nomic-embed-text) models.

-Once models are downloaded you can run them with
+You can do this by running the following commands:
 ```bash
-ollama run <model-name>
+ollama pull llama3.2
+ollama pull nomic-embed-text
 ```
-for example

+Then you can start the Ollama server by running:
 ```bash
-ollama run llama3.2
+ollama serve
 ```

-#### Configuration
-To run Ollama examples you will need to provide a OLLAMA_HOST_URL key in your env.local file.
+### Configuration
+By default, the examples expect Ollama to be run on `localhost:11434`, but you can customize this in your `.env.local`
+file - as well as the models to be used:

 For example
 ```bash
 OLLAMA_HOST_URL=http://localhost:11434
+OLLAMA_LLM=llama3.2
+OLLAMA_EMBEDDINGS=nomic-embed-text
 ```
+
+You can find more models in the [Ollama model library](https://ollama.com/library).

examples/ollama/chat-llama.php

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@
 require_once dirname(__DIR__).'/bootstrap.php';

 $platform = PlatformFactory::create(env('OLLAMA_HOST_URL'), http_client());
-$model = new Ollama($_SERVER['OLLAMA_MODEL'] ?? '');
+$model = new Ollama(env('OLLAMA_LLM'));

 $messages = new MessageBag(
     Message::forSystem('You are a helpful assistant.'),
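The changed line drops the raw `$_SERVER['OLLAMA_MODEL'] ?? ''` lookup in favour of the same `env()` helper already used for `OLLAMA_HOST_URL`, matching the other Ollama examples. That helper lives in the shared `bootstrap.php`, which is not part of this diff; a minimal sketch of what such a helper could look like (a hypothetical stand-in for illustration, not the project's actual implementation):

```php
<?php

// Hypothetical env() helper for illustration only; the real one ships in
// examples/bootstrap.php and is not shown in this diff.
function env(string $name): string
{
    $value = $_SERVER[$name] ?? $_ENV[$name] ?? getenv($name);

    if (false === $value || null === $value || '' === $value) {
        throw new \RuntimeException(sprintf('Environment variable "%s" is not set, check your .env.local file.', $name));
    }

    return (string) $value;
}
```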

examples/ollama/embeddings.php

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@

 $platform = PlatformFactory::create(env('OLLAMA_HOST_URL'), http_client());

-$response = $platform->invoke(new Ollama(Ollama::NOMIC_EMBED_TEXT), <<<TEXT
+$response = $platform->invoke(new Ollama(env('OLLAMA_EMBEDDINGS')), <<<TEXT
     Once upon a time, there was a country called Japan. It was a beautiful country with a lot of mountains and rivers.
     The people of Japan were very kind and hardworking. They loved their country very much and took care of it. The
     country was very peaceful and prosperous. The people lived happily ever after.

examples/ollama/indexer.php

Lines changed: 1 addition & 1 deletion
@@ -22,7 +22,7 @@

 $platform = PlatformFactory::create(env('OLLAMA_HOST_URL'), http_client());
 $store = new InMemoryStore();
-$vectorizer = new Vectorizer($platform, $embeddings = new Ollama(Ollama::NOMIC_EMBED_TEXT), logger());;
+$vectorizer = new Vectorizer($platform, $embeddings = new Ollama(env('OLLAMA_EMBEDDINGS')), logger());;
 $indexer = new Indexer(
     loader: new TextFileLoader(),
     vectorizer: $vectorizer,

examples/ollama/rag.php

Lines changed: 2 additions & 2 deletions
@@ -43,11 +43,11 @@

 // create embeddings for documents
 $platform = PlatformFactory::create(env('OLLAMA_HOST_URL'), http_client());
-$vectorizer = new Vectorizer($platform, $embeddings = new Ollama(Ollama::NOMIC_EMBED_TEXT), logger());
+$vectorizer = new Vectorizer($platform, $embeddings = new Ollama(env('OLLAMA_EMBEDDINGS')), logger());
 $indexer = new Indexer(new InMemoryLoader($documents), $vectorizer, $store, logger: logger());
 $indexer->index($documents);

-$model = new Ollama();
+$model = new Ollama(env('OLLAMA_LLM'));

 $similaritySearch = new SimilaritySearch($vectorizer, $store);
 $toolbox = new Toolbox([$similaritySearch], logger: logger());

examples/ollama/stream.php

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@
 require_once dirname(__DIR__).'/bootstrap.php';

 $platform = PlatformFactory::create(env('OLLAMA_HOST_URL'), http_client());
-$model = new Ollama(Ollama::LLAMA_3_2);
+$model = new Ollama(env('OLLAMA_LLM'));

 $messages = new MessageBag(
     Message::forSystem('You are a helpful assistant.'),

examples/ollama/structured-output-math.php

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@
 require_once dirname(__DIR__).'/bootstrap.php';

 $platform = PlatformFactory::create(env('OLLAMA_HOST_URL'), http_client());
-$model = new Ollama(Ollama::LLAMA_3_2);
+$model = new Ollama(env('OLLAMA_LLM'));

 $processor = new AgentProcessor();
 $agent = new Agent($platform, $model, [$processor], [$processor], logger: logger());

examples/ollama/toolcall.php

Lines changed: 1 addition & 1 deletion
@@ -21,7 +21,7 @@
 require_once dirname(__DIR__).'/bootstrap.php';

 $platform = PlatformFactory::create(env('OLLAMA_HOST_URL'), http_client());
-$model = new Ollama(Ollama::LLAMA_3_2);
+$model = new Ollama(env('OLLAMA_LLM'));

 $toolbox = new Toolbox([new Clock()], logger: logger());
 $processor = new AgentProcessor($toolbox);
