OpenTelemetry span attributes still truncated despite environment variables #10328

@lukasugar

Description

When using the AI SDK's experimental_telemetry feature with OpenTelemetry, span attributes containing LLM messages and responses are being truncated despite setting all recommended OpenTelemetry environment variables to very high values (1MB+).

The truncation appears to be happening somewhere in the AI SDK's telemetry recording or the underlying OpenTelemetry integration, ignoring the configured attribute value length limits.

The truncation happens at 1024 characters. Our prompts are longer than that, so we lose a lot of information in the logs.

Environment Variables Set

OTEL_ATTRIBUTE_VALUE_LENGTH_LIMIT=1048576
TRIGGER_OTEL_SPAN_ATTRIBUTE_VALUE_LENGTH_LIMIT=1048576
TRIGGER_OTEL_LOG_ATTRIBUTE_VALUE_LENGTH_LIMIT=1048576

All three environment variables are confirmed to be set in the runtime environment (verified via process.env logging).
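The verification is nothing more than a plain process.env read at runtime, roughly like this:

console.log(process.env.OTEL_ATTRIBUTE_VALUE_LENGTH_LIMIT); // "1048576"
console.log(process.env.TRIGGER_OTEL_SPAN_ATTRIBUTE_VALUE_LENGTH_LIMIT); // "1048576"
console.log(process.env.TRIGGER_OTEL_LOG_ATTRIBUTE_VALUE_LENGTH_LIMIT); // "1048576"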

Code Example

import { generateText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";

const result = await generateText({
  model: anthropic("claude-sonnet-4-5"),
  prompt: "Very long prompt with lots of context...",
  experimental_telemetry: {
    isEnabled: true,
    functionId: "my-function",
    recordInputs: true,
    recordOutputs: true,
    metadata: {
      component: "my-component",
    },
  },
});

Expected Behavior

With OTEL_ATTRIBUTE_VALUE_LENGTH_LIMIT=1048576 (1MB), span attributes should contain the full:

  • Input prompt (up to 1MB)
  • Output text (up to 1MB)
  • Tool call arguments and results (up to 1MB)

Actual Behavior

Span attributes are still being truncated at what appears to be a much lower limit (possibly still the default 1024 characters or similar). This makes debugging LLM conversations extremely difficult, as we can only see partial messages.
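If it helps triage, a small diagnostic span processor could confirm whether the values are already truncated when the span ends (i.e., before export). A minimal sketch, assuming the standard SpanProcessor interface from @opentelemetry/sdk-trace-base:

import { Context } from "@opentelemetry/api";
import { ReadableSpan, Span, SpanProcessor } from "@opentelemetry/sdk-trace-base";

// Diagnostic sketch: log the length of every string attribute when a span ends,
// to tell whether values are truncated before export or later in the backend.
class AttributeLengthLogger implements SpanProcessor {
  onStart(_span: Span, _parentContext: Context): void {}
  onEnd(span: ReadableSpan): void {
    for (const [key, value] of Object.entries(span.attributes)) {
      if (typeof value === "string") {
        console.log(`${span.name} ${key}: ${value.length} chars`);
      }
    }
  }
  shutdown(): Promise<void> {
    return Promise.resolve();
  }
  forceFlush(): Promise<void> {
    return Promise.resolve();
  }
}

// Registration depends on the SDK version: addSpanProcessor() on older
// providers, or the spanProcessors constructor option on newer ones.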

What we tried

We tried changing values for OpenTelemetry environment variables:

OTEL_ATTRIBUTE_VALUE_LENGTH_LIMIT=0

# and bigger:
OTEL_ATTRIBUTE_VALUE_LENGTH_LIMIT=1048576

But that didn't help; the attributes were still truncated.

We also tried a custom tracer, but that didn't help either. We created a NodeTracerProvider with programmatic span limits and passed it via the tracer option in experimental_telemetry:

import { generateText } from "ai";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";

const provider = new NodeTracerProvider({
  spanLimits: {
    attributeValueLengthLimit: 0, // unlimited
  },
});
provider.register();

const result = await generateText({
  model,
  prompt,
  experimental_telemetry: {
    isEnabled: true,
    tracer: provider.getTracer("my-tracer"),
  },
});
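For completeness, a variant of the above that we have not verified: passing Infinity instead of 0, on the assumption that the Node SDK treats 0 as a literal zero-character limit rather than "no limit":

import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";

// Sketch only (assumption): Infinity is the programmatic "no limit" value,
// whereas 0 may be interpreted as truncating values to zero length.
const provider = new NodeTracerProvider({
  spanLimits: {
    attributeValueLengthLimit: Infinity,
  },
});
provider.register();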

Are there any working examples of logging more than 1024 characters per attribute with the AI SDK? Thanks!

AI SDK Version

  • AI SDK Version: ai@6.0.0-beta.99
  • Runtime: Node.js (via Trigger.dev v4)
  • Telemetry Backend: Trigger.dev (OpenTelemetry)
  • Model Provider: Anthropic Claude (@ai-sdk/anthropic@3.0.0-beta.53)

Code of Conduct

  • I agree to follow this project's Code of Conduct
