Update llm monitoring documentation for JavaScript #13971


Merged
152 changes: 136 additions & 16 deletions docs/product/insights/ai/llm-monitoring/getting-started/index.mdx
sidebar_order: 0
description: "Learn how to set up Sentry LLM Monitoring"
---

Sentry LLM Monitoring helps you track and debug AI-powered applications using our supported SDKs and integrations.

![LLM Monitoring User Interface](../img/pipelines-view.png)


To start sending LLM data to Sentry, make sure you've created a Sentry project for your AI-enabled repository and follow one of the guides below:

## Supported SDKs

### Python

The Sentry Python SDK supports LLM monitoring with integrations for OpenAI, Langchain, Anthropic, Huggingface Hub, and Cohere.

#### Official AI Integrations

- <LinkWithPlatformIcon
platform="openai"
label="OpenAI"
url="/platforms/python/integrations/openai/"
/>
- <LinkWithPlatformIcon
platform="langchain"
label="Langchain"
url="/platforms/python/integrations/langchain/"
/>
- <LinkWithPlatformIcon
platform="anthropic"
label="Anthropic"
url="/platforms/python/integrations/anthropic/"
/>
- <LinkWithPlatformIcon
platform="huggingface"
label="Huggingface Hub"
url="/platforms/python/integrations/huggingface_hub/"
/>
- <LinkWithPlatformIcon
platform="python"
label="Cohere"
url="/platforms/python/integrations/cohere/"
/>
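As a sketch of what setup looks like, the integrations above are enabled when you initialize the Python SDK alongside the corresponding client library. The DSN below is a placeholder, and passing `OpenAIIntegration` explicitly with `include_prompts` is an assumption based on the Python SDK's integration options; check the OpenAI integration guide for the authoritative configuration.

```python
import sentry_sdk
from sentry_sdk.integrations.openai import OpenAIIntegration

sentry_sdk.init(
    # Placeholder DSN; replace with your project's DSN from sentry.io.
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    # Capture performance data so LLM spans appear in LLM Monitoring.
    traces_sample_rate=1.0,
    # Explicitly enable the integration and opt in to recording
    # prompts and responses (assumed option name).
    integrations=[OpenAIIntegration(include_prompts=True)],
)
```

After this, calls made through the `openai` client in the same process should be captured automatically, without wrapping each request yourself.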

### JavaScript

The JavaScript SDK supports LLM monitoring through the Vercel AI integration for Node.js and Bun runtimes.

#### Supported Platforms

- <LinkWithPlatformIcon
platform="javascript.node"
label="Node.js"
url="/platforms/javascript/guides/node/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.nextjs"
label="Next.js"
url="/platforms/javascript/guides/nextjs/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.sveltekit"
label="SvelteKit"
url="/platforms/javascript/guides/sveltekit/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.nuxt"
label="Nuxt"
url="/platforms/javascript/guides/nuxt/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.astro"
label="Astro"
url="/platforms/javascript/guides/astro/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.remix"
label="Remix"
url="/platforms/javascript/guides/remix/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.solidstart"
label="SolidStart"
url="/platforms/javascript/guides/solidstart/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.express"
label="Express"
url="/platforms/javascript/guides/express/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.fastify"
label="Fastify"
url="/platforms/javascript/guides/fastify/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.nestjs"
label="Nest.js"
url="/platforms/javascript/guides/nestjs/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.hapi"
label="Hapi"
url="/platforms/javascript/guides/hapi/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.koa"
label="Koa"
url="/platforms/javascript/guides/koa/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.connect"
label="Connect"
url="/platforms/javascript/guides/connect/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.hono"
label="Hono"
url="/platforms/javascript/guides/hono/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.bun"
label="Bun"
url="/platforms/javascript/guides/bun/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.aws-lambda"
label="AWS Lambda"
url="/platforms/javascript/guides/aws-lambda/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.azure-functions"
label="Azure Functions"
url="/platforms/javascript/guides/azure-functions/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.gcp-functions"
label="Google Cloud Functions"
url="/platforms/javascript/guides/gcp-functions/configuration/integrations/vercelai/"
/>
- <LinkWithPlatformIcon
platform="javascript.electron"
label="Electron"
url="/platforms/javascript/guides/electron/configuration/integrations/vercelai/"
/>

<Alert title="Don't see your platform?">

We're continuously adding new AI integrations. You can also instrument your AI workload manually with the Sentry Python SDK.

</Alert>


## Pipelines and LLMs

Sentry LLM Monitoring assumes you have an orchestrator (like LangChain) creating pipelines of one or more LLMs (such as GPT-4). The LLM Monitoring dashboard shows a table of your AI pipelines and pulls token usage from your LLMs.
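If your pipeline isn't built with a supported orchestrator, the Python SDK also lets you mark a function as a pipeline yourself. The sketch below assumes the SDK's `ai_track` decorator and a hypothetical `answer_question` function; the pipeline name and function body are illustrative only.

```python
import sentry_sdk
from sentry_sdk.ai.monitoring import ai_track

sentry_sdk.init(
    # Placeholder DSN; replace with your project's DSN.
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    traces_sample_rate=1.0,
)

# Marks this function as an AI pipeline, so LLM calls made inside it
# (and their token usage) are grouped under one row in the
# LLM Monitoring dashboard.
@ai_track("My AI pipeline")
def answer_question(prompt):
    # Hypothetical body: call your LLM client or orchestrator here.
    ...
```

This is useful for home-grown pipelines where no official integration exists to create the pipeline span for you.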