add section under Datadog logging for LLM Observability #11822

Open: wants to merge 1 commit into base `main`
32 changes: 32 additions & 0 deletions docs/my-website/docs/proxy/logging.md
@@ -1404,6 +1404,32 @@ curl --location 'http://0.0.0.0:4000/chat/completions' \

## DataDog

### LLM Observability

Datadog's [LLM Observability](https://www.datadoghq.com/product/llm-observability/) Python SDK provides auto-instrumentation for LiteLLM. Refer to the LLM Observability [documentation](https://docs.datadoghq.com/llm_observability/setup/auto_instrumentation?tab=python#litellm) for setup instructions.

```python
import os
import litellm
from litellm import completion
from ddtrace.llmobs import LLMObs

LLMObs.enable(
    ml_app="my-test-app",
)

litellm.api_key = os.environ["ANTHROPIC_API_KEY"]

messages = [{"content": "What color is the sky?", "role": "user"}]
response = completion(model="claude-3-5-sonnet-20240620", messages=messages, stream=False)
```

> **Review comment (Contributor):** What is the benefit of this approach over the existing callback? This flow also adds a new dependency on the user.

Run the example script above with `ddtrace-run python example.py`. You should then see traces appear in Datadog's LLM Observability product.
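As a rough sketch of the surrounding environment for an agentless run, the setup might look like the following. The `DD_*` variable names follow ddtrace's documented conventions and the key values are placeholders, not part of this PR:

```shell
# Assumed environment setup for an agentless ddtrace run; the DD_* names
# follow ddtrace conventions and the key values are placeholders.
export DD_API_KEY="<your-datadog-api-key>"
export DD_SITE="datadoghq.com"            # or your Datadog site, e.g. datadoghq.eu
export DD_LLMOBS_ENABLED=1
export DD_LLMOBS_ML_APP="my-test-app"     # matches ml_app in LLMObs.enable()
export DD_LLMOBS_AGENTLESS_ENABLED=1      # submit data without a local Datadog Agent
export ANTHROPIC_API_KEY="<your-anthropic-api-key>"

# Then run the script under ddtrace:
# ddtrace-run python example.py
```

When these variables are set, `LLMObs.enable()` can also pick up `ml_app` and the API key from the environment instead of hard-coding them.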

<Image img={require('../../img/dd_llm_observability.png')} />

### Other Integrations

LiteLLM supports logging to the following Datadog integrations:
- `datadog` [Datadog Logs](https://docs.datadoghq.com/logs/)
- `datadog_llm_observability` [Datadog LLM Observability](https://www.datadoghq.com/product/llm-observability/)
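For comparison, the existing callback-based route is configured through `litellm_settings`. A minimal sketch, assuming the standard LiteLLM proxy config layout used elsewhere in this doc:

```yaml
# Minimal sketch of the callback approach; assumes the standard
# LiteLLM proxy config layout.
litellm_settings:
  success_callback: ["datadog_llm_observability"]  # or ["datadog"] for Datadog Logs
```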
@@ -1429,6 +1455,12 @@ litellm_settings:
</TabItem>
<TabItem value="datadog_llm_observability" label="Datadog LLM Observability">

:::warning

It is recommended to use Datadog's in-house instrumentation for LLM Observability. See [here](#llm-observability) for more details.

:::

> **Review comment (Contributor):** Why is there a warning here?

```yaml
model_list:
- model_name: gpt-3.5-turbo
```
Binary file added docs/my-website/img/dd_llm_observability.png