fix(aws-serverless): Only start root span in Sentry wrapper if Otel didn't wrap handler #12407
Conversation
```typescript
const event = {};
const context = {
  invokedFunctionArn: 'arn:aws:lambda:us-east-1:123453789012:function:my-lambda',
  functionName: 'my-lambda',
};
```
Minimal event and context objects, like those AWS usually passes to the Lambda function handler.
```diff
@@ -2,6 +2,6 @@ import { startEventProxyServer } from '@sentry-internal/test-utils';

 startEventProxyServer({
   port: 3031,
-  proxyServerName: 'aws-serverless-lambda-layer',
+  proxyServerName: 'aws-serverless-lambda-layer-cjs',
```
Just a small renaming of the test application to show more clearly what it tests.
```typescript
data: {
  'sentry.sample_rate': 1,
  'sentry.source': 'custom',
  'sentry.origin': 'auto.otel.aws-lambda',
  'cloud.account.id': '123453789012',
  'faas.id': 'arn:aws:lambda:us-east-1:123453789012:function:my-lambda',
  'otel.kind': 'SERVER',
},
origin: 'auto.otel.aws-lambda',
span_id: expect.any(String),
status: 'ok',
trace_id: expect.any(String),
```
As we can see here, the root span is missing `sentry.op` data. I'll open a follow-up PR to add it to the root span, similarly to what we do in our other integrations.
```typescript
/**
 * Checks if Otel's AWSLambda instrumentation successfully wrapped the handler.
 * Check taken from @opentelemetry/core
 */
function isWrappedByOtel(
```
Figured I'd just pull in this function rather than declaring `@opentelemetry/core` a dependency of `aws-serverless`. Happy to change if reviewers prefer that!
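The check being described here can be sketched roughly as follows. This is a hedged reconstruction, not the PR's exact code: Otel's instrumentation patches functions via shimmer-style wrapping, which (to my understanding) attaches `__original`, `__unwrap`, and `__wrapped` properties to the wrapped function, so a duck-typing check on those properties tells us whether the handler was already patched.

```typescript
// Sketch of an isWrapped-style check (assumed shape of shimmer-wrapped functions:
// __original, __unwrap, and a __wrapped flag attached to the patched function).
type MaybeWrapped = ((...args: unknown[]) => unknown) & {
  __original?: unknown;
  __unwrap?: unknown;
  __wrapped?: boolean;
};

function isWrappedByOtel(handler: unknown): boolean {
  const fn = handler as MaybeWrapped;
  return (
    typeof fn === 'function' &&
    typeof fn.__original === 'function' &&
    typeof fn.__unwrap === 'function' &&
    fn.__wrapped === true
  );
}

// A plain handler is not wrapped.
const plain = (): string => 'ok';

// Simulate what an instrumentation's patch leaves behind on the handler.
const wrapped = ((): string => plain()) as MaybeWrapped;
wrapped.__original = plain;
wrapped.__unwrap = () => plain;
wrapped.__wrapped = true;

console.log(isWrappedByOtel(plain)); // false
console.log(isWrappedByOtel(wrapped)); // true
```

Because the check is purely structural, it works without importing anything from `@opentelemetry/core`, which is the motivation stated above for inlining it.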
```typescript
// Only start a trace and root span if the handler is not already wrapped by Otel instrumentation.
// Otherwise, we create two root spans (one from Otel, one from our wrapper).
// If Otel instrumentation didn't work or was filtered out by users, we still want to trace the handler.
if (options.startTrace && !isWrappedByOtel(handler)) {
```
One thing to note here: checking for `getActiveSpan` doesn't work because our patch is apparently applied after Otel's patch, meaning it's executed first, resulting in no active span. So I opted to check for the Otel patch instead. I still think we should prefer Otel's span, though, because it contains more data than ours.
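The branching described above can be illustrated with a minimal sketch. All names here (`wrapHandler`, `startRootSpan`, `spansStarted`) are hypothetical stand-ins for illustration, not the SDK's actual API:

```typescript
// Hypothetical sketch of the fix: start our own root span only when Otel's
// instrumentation has not already wrapped the handler.
type Handler = (event: unknown, context: unknown) => unknown;

// Simplified stand-in for the isWrappedByOtel check introduced in this PR.
function isWrappedByOtel(fn: unknown): boolean {
  return typeof fn === 'function' && (fn as { __wrapped?: boolean }).__wrapped === true;
}

// Records started root spans so we can observe that only one is created.
const spansStarted: string[] = [];
function startRootSpan<T>(name: string, cb: () => T): T {
  spansStarted.push(name); // stand-in for Sentry's span-starting logic
  return cb();
}

function wrapHandler(handler: Handler, options: { startTrace: boolean }): Handler {
  return (event, context) => {
    const run = () => handler(event, context);
    // Only start our own root span if Otel didn't already wrap the handler;
    // otherwise two root spans (two transactions) would be produced.
    if (options.startTrace && !isWrappedByOtel(handler)) {
      return startRootSpan('aws.lambda.handler', run);
    }
    return run();
  };
}

// A plain handler gets our root span...
const plainHandler: Handler = () => 'ok';
wrapHandler(plainHandler, { startTrace: true })({}, {});

// ...while an Otel-wrapped handler does not (Otel already created one).
const otelWrapped = Object.assign(((): string => 'ok') as Handler, { __wrapped: true });
wrapHandler(otelWrapped, { startTrace: true })({}, {});

console.log(spansStarted.length); // 1
```

Note that in both branches the handler itself still runs through the wrapper, which matches the PR description: flushing and error capturing stay active regardless of who starts the span.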
nice work!
Force-pushed from bd4b257 to 03b29f2
…idn't wrap handler (#12407)

Fix span collision by checking if the AWS Lambda handler was already wrapped by Otel instrumentation, and starting our own root span only if it wasn't. The rest of our handler (i.e. the flushing logic, error capturing, etc.) is still used regardless of Otel wrapping.

Also adjusted E2E tests to:
- more closely resemble the AWS environment
- enable auto-patching of the handler with the Sentry SDK handler
- better check for Otel and manual spans

Ref #12409
As discovered yesterday, we have a bit of a collision between our Lambda function wrapper and Otel's wrapper. Specifically, both started a new root span when the Lambda function handler was invoked, causing two transactions to be sent instead of one.
This PR fixes this collision by checking if the handler was already wrapped by Otel instrumentation, and starting our own root span only if it wasn't. The rest of our handler (i.e. the flushing logic, error capturing, etc.) is still used regardless of Otel wrapping.
Adjusted E2E tests to:
- more closely resemble the AWS environment
- enable auto-patching of the handler with the Sentry SDK handler
- better check for Otel and manual spans

Ref #12409