Commit 2ee9e9c

Copilot and aaronpowell committed
Update documentation for OpenTelemetry configuration support
Co-authored-by: aaronpowell <434140+aaronpowell@users.noreply.github.com>
1 parent 68da4b9 commit 2ee9e9c

File tree

1 file changed, +11 -0 lines changed
  • src/CommunityToolkit.Aspire.OllamaSharp


src/CommunityToolkit.Aspire.OllamaSharp/README.md

Lines changed: 11 additions & 0 deletions
```diff
@@ -37,6 +37,17 @@ public class MyService(IOllamaApiClient ollamaApiClient)
 
 To use the integration with Microsoft.Extensions.AI, call the `AddOllamaSharpChatClient` or `AddOllamaSharpEmbeddingGenerator` extension method in the _Program.cs_ file of your project. These methods take the connection name as a parameter, just as `AddOllamaApiClient` does, and will register the `IOllamaApiClient`, as well as the `IChatClient` or `IEmbeddingGenerator`, in the DI container. The `IEmbeddingGenerator` is registered with the generic arguments of `<string, Embedding<float>>`.
 
+#### Configuring OpenTelemetry
+
+When using the chat client integration, you can optionally configure the OpenTelemetry chat client to control telemetry behavior, such as enabling sensitive data:
+
+```csharp
+builder.AddOllamaApiClient("ollama")
+    .AddChatClient(otel => otel.EnableSensitiveData = true);
+```
+
+The integration automatically registers the Microsoft.Extensions.AI telemetry source (`Experimental.Microsoft.Extensions.AI`) with OpenTelemetry for distributed tracing.
+
 ## Additional documentation
 
 - https://github.com/awaescher/OllamaSharp
```
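
As context for the `AddOllamaSharpChatClient` / `AddOllamaSharpEmbeddingGenerator` registration mentioned in the unchanged paragraph above, the following is a minimal sketch of how it might be consumed. The connection names, the `MyAiService` consumer, and the `GetResponseAsync` call are illustrative assumptions based on recent Microsoft.Extensions.AI releases, not part of this commit.

```csharp
// Program.cs sketch: registering the OllamaSharp-backed Microsoft.Extensions.AI
// abstractions described in the diff above. Connection names and the consumer
// class are illustrative assumptions.
using System.Threading.Tasks;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Hosting;

var builder = Host.CreateApplicationBuilder(args);

// Registers IOllamaApiClient plus IChatClient / IEmbeddingGenerator<string, Embedding<float>>.
builder.AddOllamaSharpChatClient("ollama");
builder.AddOllamaSharpEmbeddingGenerator("ollama-embeddings");

builder.Build().Run();

// A consumer resolves the registered abstractions from DI.
public class MyAiService(
    IChatClient chatClient,
    IEmbeddingGenerator<string, Embedding<float>> embeddingGenerator)
{
    // GetResponseAsync follows recent Microsoft.Extensions.AI releases and may differ by version.
    public async Task<string> AskAsync(string prompt) =>
        (await chatClient.GetResponseAsync(prompt)).Text;
}
```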

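The final added line states that the integration registers the `Experimental.Microsoft.Extensions.AI` telemetry source automatically. Purely for illustration, a hand-rolled equivalent with the OpenTelemetry SDK might look like the sketch below; the host builder and the OTLP exporter are assumptions, since the commit only documents the automatic registration.

```csharp
// Manual equivalent of the automatic registration mentioned above, shown only to
// clarify which ActivitySource is involved. Assumes the OpenTelemetry.Extensions.Hosting
// and OTLP exporter packages; the integration normally does this wiring for you.
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using OpenTelemetry.Trace;

var builder = Host.CreateApplicationBuilder(args);

builder.Services.AddOpenTelemetry()
    .WithTracing(tracing => tracing
        .AddSource("Experimental.Microsoft.Extensions.AI") // chat client telemetry source
        .AddOtlpExporter());

builder.Build().Run();
```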