Update LLM monitoring documentation for JavaScript (#13971)
The `docs/product/insights/ai/llm-monitoring/getting-started/index.mdx`
page was updated to document JavaScript SDK support for LLM monitoring
via the Vercel AI integration.
Key changes include:
* The previous alert stating "This feature is currently only available
in the Python SDK" was removed.
* Content was restructured with new "Supported SDKs" sections for Python
and JavaScript.
* A dedicated JavaScript section was added, detailing support for the
Vercel AI integration.
* A comprehensive list of supported JavaScript platforms (Node.js,
Next.js, SvelteKit, Nuxt, and various server/serverless frameworks) was
included, rendered with `LinkWithPlatformIcon` components.
* An alert was added to clarify that the Vercel AI integration is
limited to Node.js and Bun runtimes and requires SDK version `8.43.0` or
higher.
These updates provide clear guidance for JavaScript users and address
previous confusion regarding LLM monitoring capabilities.
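As a sketch of what the documented setup might look like, the snippet below enables the Vercel AI integration in the Sentry Node SDK. The DSN is a placeholder, and the exact integration name and options should be verified against the current Sentry JavaScript docs; this assumes SDK version `8.43.0` or higher on a Node.js or Bun runtime, per the alert described above.

```javascript
// Sketch: enabling the Vercel AI integration in the Sentry Node SDK.
// Requires SDK >= 8.43.0 and a Node.js or Bun runtime.
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: "https://examplePublicKey@o0.ingest.sentry.io/0", // placeholder DSN
  tracesSampleRate: 1.0, // tracing must be enabled for LLM spans to be sent
  integrations: [Sentry.vercelAIIntegration()],
});
```

With this in place, calls made through the Vercel AI SDK are captured as spans and surfaced in the LLM Monitoring dashboard.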
---------
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
We'll be adding AI integrations continuously. You can also instrument AI manually with the Sentry Python SDK.
</Alert>

## Pipelines and LLMs

The Sentry LLM Monitoring feature relies on your having an orchestrator (like LangChain) that creates pipelines of one or more LLMs (such as gpt-4). In the LLM Monitoring dashboard, we show you a table of the AI pipelines and pull the token usage from your LLMs.
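For cases without an auto-instrumented orchestrator, a pipeline can also be represented manually as a span. The sketch below is a hypothetical example: the `"ai.pipeline"` op, the token-usage attribute name, and the `callModel` helper are assumptions for illustration, not confirmed SDK API.

```javascript
// Hypothetical sketch: manually wrapping an LLM pipeline in a span so it
// shows up alongside auto-instrumented pipelines. Op and attribute names
// are assumptions; check the Sentry docs for the canonical conventions.
import * as Sentry from "@sentry/node";

async function runPipeline(prompt) {
  return Sentry.startSpan(
    { op: "ai.pipeline", name: "summarize-document" },
    async (span) => {
      const result = await callModel(prompt); // hypothetical LLM call (e.g. gpt-4)
      // Record token usage so the dashboard can aggregate it per pipeline.
      span.setAttribute("ai.total_tokens.used", result.usage.totalTokens);
      return result.text;
    }
  );
}
```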