
Conversation

@andreiborza (Member)

Currently, when using Sentry alongside a custom OpenTelemetry setup, any spans started via our Sentry.startSpanX APIs leak into the OpenTelemetry setup, even if tracing is disabled.

This fix suppresses tracing for span creation via our startSpanX APIs, but ensures tracing is not suppressed within the callback, so that, for example, custom OTel spans created within Sentry.startSpanX calls are not suppressed.

I updated the node-otel-without-tracing e2e tests to verify that no Sentry spans leak into the OTLP endpoint, and also tried this out locally with an Express app and Jaeger.
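
Roughly, the idea looks like this (a minimal sketch, not the actual SDK code; `hasSpansEnabled` is a stand-in for Sentry's helper and the tracer name is made up):

```ts
import { context, trace, type Span } from '@opentelemetry/api';
import { isTracingSuppressed, suppressTracing } from '@opentelemetry/core';

// Stand-in for Sentry's hasSpansEnabled() helper – assume it reflects whether
// tracing is enabled in the SDK options.
const hasSpansEnabled = (): boolean => false;

function startSketchSpan<T>(name: string, callback: (span: Span) => T): T {
  const tracer = trace.getTracer('sketch');
  const ctx = context.active();

  // When spans are disabled, start the span under a suppressed context so the
  // OTel SDK hands back a non-recording span instead of exporting it.
  const spanCtx =
    !hasSpansEnabled() && !isTracingSuppressed(ctx) ? suppressTracing(ctx) : ctx;
  const span = tracer.startSpan(name, {}, spanCtx);

  try {
    // The callback runs with the original, un-suppressed context, so custom
    // OTel spans created inside it are still recorded and exported.
    return context.with(trace.setSpan(ctx, span), () => callback(span));
  } finally {
    span.end();
  }
}
```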

Before the fix, Sentry spans leak:
Screenshot 2025-11-24 at 20 35 21
Screenshot 2025-11-24 at 20 34 53


After the fix, no Sentry spans leak:
Screenshot 2025-11-24 at 20 28 22
Screenshot 2025-11-24 at 20 28 32

Closes: #17826

@github-actions (Contributor) commented Nov 26, 2025

size-limit report 📦

| Path | Size | % Change | Change |
| --- | --- | --- | --- |
| @sentry/browser | 24.8 kB | - | - |
| @sentry/browser - with treeshaking flags | 23.31 kB | - | - |
| @sentry/browser (incl. Tracing) | 41.54 kB | - | - |
| @sentry/browser (incl. Tracing, Profiling) | 46.13 kB | - | - |
| @sentry/browser (incl. Tracing, Replay) | 79.96 kB | - | - |
| @sentry/browser (incl. Tracing, Replay) - with treeshaking flags | 69.69 kB | - | - |
| @sentry/browser (incl. Tracing, Replay with Canvas) | 84.64 kB | - | - |
| @sentry/browser (incl. Tracing, Replay, Feedback) | 96.88 kB | - | - |
| @sentry/browser (incl. Feedback) | 41.48 kB | - | - |
| @sentry/browser (incl. sendFeedback) | 29.49 kB | - | - |
| @sentry/browser (incl. FeedbackAsync) | 34.47 kB | - | - |
| @sentry/react | 26.52 kB | - | - |
| @sentry/react (incl. Tracing) | 43.74 kB | - | - |
| @sentry/vue | 29.25 kB | - | - |
| @sentry/vue (incl. Tracing) | 43.34 kB | - | - |
| @sentry/svelte | 24.82 kB | - | - |
| CDN Bundle | 27.21 kB | - | - |
| CDN Bundle (incl. Tracing) | 42.21 kB | - | - |
| CDN Bundle (incl. Tracing, Replay) | 78.75 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Feedback) | 84.2 kB | - | - |
| CDN Bundle - uncompressed | 79.96 kB | - | - |
| CDN Bundle (incl. Tracing) - uncompressed | 125.34 kB | - | - |
| CDN Bundle (incl. Tracing, Replay) - uncompressed | 241.37 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Feedback) - uncompressed | 254.13 kB | - | - |
| @sentry/nextjs (client) | 45.96 kB | - | - |
| @sentry/sveltekit (client) | 41.9 kB | - | - |
| @sentry/node-core | 51.27 kB | +0.16% | +79 B 🔺 |
| @sentry/node | 159.44 kB | +0.06% | +82 B 🔺 |
| @sentry/node - without tracing | 92.85 kB | +0.04% | +28 B 🔺 |
| @sentry/aws-serverless | 108.14 kB | +0.06% | +63 B 🔺 |

View base workflow run

@github-actions (Contributor) commented Nov 26, 2025

node-overhead report 🧳

Note: This is a synthetic benchmark with a minimal express app and does not necessarily reflect the real-world performance impact in an application.

| Scenario | Requests/s | % of Baseline | Prev. Requests/s | Change % |
| --- | --- | --- | --- | --- |
| GET Baseline | 8,818 | - | 11,412 | -23% |
| GET With Sentry | 1,687 | 19% | 2,119 | -20% |
| GET With Sentry (error only) | 6,029 | 68% | 7,784 | -23% |
| POST Baseline | 1,164 | - | 1,228 | -5% |
| POST With Sentry | 559 | 48% | 630 | -11% |
| POST With Sentry (error only) | 1,047 | 90% | 1,090 | -4% |
| MYSQL Baseline | 3,335 | - | 4,127 | -19% |
| MYSQL With Sentry | 431 | 13% | 590 | -27% |
| MYSQL With Sentry (error only) | 2,702 | 81% | 3,404 | -21% |

View base workflow run


const spanOptions = getSpanOptions(options);

if (!hasSpansEnabled()) {
Member

h: This looks like almost the same code as above - even the comments. Could this be moved into a reusable method?

The only difference I've seen is that the one above has () => span.end() in the handleCallbackErrors, not sure if this was intended.

Member Author

I opted not to de-duplicate here. In general startSpan and startSpanManual are basically identical other than how they end spans (auto-end vs manual-end).

Splitting this out into a common helper makes it harder to understand because of how the callback and the success callback are handled differently. I think it's a complicated abstraction for little gain.
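
For reference, the usage difference between the two looks roughly like this (an illustrative sketch of the public APIs, not the internal implementation):

```ts
import * as Sentry from '@sentry/node';

// startSpan ends the span automatically when the callback returns or throws.
Sentry.startSpan({ name: 'auto-ended' }, span => {
  span.setAttribute('example', true);
});

// startSpanManual leaves ending the span to the caller.
Sentry.startSpanManual({ name: 'manually-ended' }, span => {
  span.setAttribute('example', true);
  span.end();
});
```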

Tbh I would rather not deduplicate here; the code is already easy to get lost in without an extra abstraction in it.

What do you think?

Member Author

I ended up updating this. I guess we'll hardly ever see that much divergence between the two APIs, so it probably doesn't hurt to keep this one DRY.

const spanOptions = getSpanOptions(options);

const span = tracer.startSpan(name, spanOptions, ctx);
if (!hasSpansEnabled()) {
Member

l: What about the following, to keep a single return (so we don't have to touch two parts of the code in case the return changes), or a nested ternary:

let context = ctx;

if (!hasSpansEnabled()) {
  context = isTracingSuppressed(ctx) ? ctx : suppressTracing(ctx);
}

return tracer.startSpan(name, spanOptions, context);

Member Author

Updated.

@logaretm (Collaborator) left a comment

Nice one, I was going to raise the duplication thing, but your argument makes sense.

@andreiborza andreiborza marked this pull request as draft November 26, 2025 16:59
@andreiborza (Member Author)

Drafting this for now, will investigate the failing log tests next week.

@andreiborza andreiborza marked this pull request as ready for review December 1, 2025 13:17
@andreiborza (Member Author)

The integration tests were previously failing because, with suppressed spans, the trace id lookup fell back to the scope. Scope.clone() doesn't generate a new trace id, so we would always get the same trace id regardless of withIsolationScope calls.

I updated the integration tests to explicitly enable tracing and added an e2e test to showcase different trace ids in a custom OTel setup that has an http instrumentation, so we get different traces via the spans created from that instrumentation.

We were briefly discussing how scopes should handle trace ids, but landed on not changing behavior for now.
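
To illustrate the clone behavior described above (a toy model, not the actual Scope implementation):

```ts
import { randomUUID } from 'node:crypto';

// Toy model: clone() carries the existing traceId over instead of generating
// a new one, so every scope derived from the same root reports the same id.
class ToyScope {
  constructor(public readonly traceId: string = randomUUID()) {}

  clone(): ToyScope {
    return new ToyScope(this.traceId);
  }
}

const root = new ToyScope();
console.log(root.clone().traceId === root.clone().traceId); // true – same trace id
```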

@andreiborza andreiborza merged commit db32055 into develop Dec 1, 2025
203 checks passed
@andreiborza andreiborza deleted the ab/noop-span branch December 1, 2025 16:48


Development

Successfully merging this pull request may close these issues.

Ensure we noop (NonRecordingSpan) in Sentry.startSpanXXX APIs when hasSpansEnabled() === false
