12 changes: 6 additions & 6 deletions src/CommunityToolkit.Aspire.OllamaSharp/README.md
@@ -1,6 +1,6 @@
# CommunityToolkit.Aspire.OllamaSharp library

-Registers `IOllamaClientApi` in the DI container to interact with the [Ollama](https://ollama.com) API and optionally supports registering an `IChatClient` or `IEmbeddingGenerator` from [Microsoft.Extensions.AI](https://devblogs.microsoft.com/dotnet/introducing-microsoft-extensions-ai-preview/).
+Registers `IOllamaApiClient` in the DI container to interact with the [Ollama](https://ollama.com) API and optionally supports registering an `IChatClient` or `IEmbeddingGenerator` from [Microsoft.Extensions.AI](https://devblogs.microsoft.com/dotnet/introducing-microsoft-extensions-ai-preview/).

## Getting Started

@@ -18,24 +18,24 @@ dotnet add package CommunityToolkit.Aspire.OllamaSharp

### Example usage

-In the _Program.cs_ file of your project, call the `AddOllamaClientApi` extension method to register the `IOllamaClientApi` in the DI container. This method takes the connection name as a parameter:
+In the _Program.cs_ file of your project, call the `AddOllamaApiClient` extension method to register the `IOllamaApiClient` in the DI container. This method takes the connection name as a parameter:

```csharp
-builder.AddOllamaClientApi("ollama");
+builder.AddOllamaApiClient("ollama");
```

-Then, in your service, inject `IOllamaClientApi` and use it to interact with the Ollama API:
+Then, in your service, inject `IOllamaApiClient` and use it to interact with the Ollama API:

```csharp
-public class MyService(IOllamaClientApi ollamaClientApi)
+public class MyService(IOllamaApiClient ollamaApiClient)
{
// ...
}
```
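
For context beyond the diff, here is a minimal usage sketch of the injected client. It is illustrative only and not part of this PR; it assumes OllamaSharp's `ListLocalModelsAsync` method and the `Name` property on its `Model` type:

```csharp
using OllamaSharp;

// Hypothetical service showing one way the injected IOllamaApiClient might be used.
public class MyService(IOllamaApiClient ollamaApiClient)
{
    // Lists the names of the models available on the connected Ollama instance.
    public async Task<IEnumerable<string>> GetLocalModelNamesAsync()
    {
        var models = await ollamaApiClient.ListLocalModelsAsync();
        return models.Select(m => m.Name);
    }
}
```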

#### Integration with Microsoft.Extensions.AI

-To use the integration with Microsoft.Extensions.AI, call the `AddOllamaSharpChatClient` or `AddOllamaSharpEmbeddingGenerator` extension method in the _Program.cs_ file of your project. These methods take the connection name as a parameter, just as `AddOllamaClientApi` does, and will register the `IOllamaApiClient`, as well as the `IChatClient` or `IEmbeddingGenerator` in the DI container. The `IEmbeddingsGenerator` is registered with the generic arguments of `<string, Embedding<float>>`.
+To use the integration with Microsoft.Extensions.AI, call the `AddOllamaSharpChatClient` or `AddOllamaSharpEmbeddingGenerator` extension method in the _Program.cs_ file of your project. These methods take the connection name as a parameter, just as `AddOllamaApiClient` does, and will register the `IOllamaApiClient`, as well as the `IChatClient` or `IEmbeddingGenerator` in the DI container. The `IEmbeddingGenerator` is registered with the generic arguments of `<string, Embedding<float>>`.
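
As added context (not part of the diff), a sketch of what the registration described above might look like, using the extension method names from the paragraph:

```csharp
// Program.cs — registers IOllamaApiClient plus an IChatClient / IEmbeddingGenerator,
// per the README text above.
builder.AddOllamaSharpChatClient("ollama");
builder.AddOllamaSharpEmbeddingGenerator("ollama");
```

The registered abstractions can then be injected alongside, or instead of, `IOllamaApiClient`:

```csharp
using Microsoft.Extensions.AI;

// Hypothetical consumer of the registered Microsoft.Extensions.AI services.
public class MyAiService(
    IChatClient chatClient,
    IEmbeddingGenerator<string, Embedding<float>> embeddingGenerator)
{
    // ...
}
```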

## Additional documentation
