
feat(nextjs): Use Vercel's waitUntil to defer freezing of Vercel Lambdas #12133


Merged · 5 commits · May 27, 2024

Conversation

@lforst (Member) commented May 21, 2024

Fixes #12045
Fixes #11708

We have quite a bit of await flush() and flush() in the Next.js SDK, for the very simple reason that Vercel (which hosts a large portion of our user base) runs on AWS Lambdas, which freeze as soon as res.end() is called, essentially dropping any telemetry that we are still attempting to flush.
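
For illustration, the call-site pattern this forces looks roughly like this (a sketch, not an exact call site from the SDK):

```ts
import { flush } from '@sentry/nextjs';
import type { NextApiRequest, NextApiResponse } from 'next';

export default async function handler(_req: NextApiRequest, res: NextApiResponse): Promise<void> {
  // Flush before ending the response: the lambda can be frozen as
  // soon as res.end() is called, cutting off in-flight telemetry.
  await flush(2000); // wait up to 2s for queued events to go out
  res.status(200).json({ ok: true });
}
```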

Luckily, Vercel recently built a way of preventing the lambdas from freezing even though the response was .end()ed. The functionality is abstracted through a package called @vercel/functions, but the code of the waitUntil function is just a call out to a global that is available on the Vercel JS runtime.

Since we do not really want to depend on said @vercel/functions package, we are vendoring that function in and calling it with the flush promise whenever we assume that a lambda could be finishing soon.
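
For reference, a minimal sketch of what the vendored helper can look like, modeled on the published @vercel/functions source (the Symbol.for('@vercel/request-context') key and the shape of the context object are implementation details of that package and may change):

```ts
// Sketch of a vendored waitUntil: look up Vercel's request-context
// global and hand it the promise so the lambda is kept alive until
// the promise settles. Outside the Vercel runtime this is a no-op.
interface VercelRequestContext {
  waitUntil?: (task: Promise<unknown>) => void;
}

export function vercelWaitUntil(task: Promise<unknown>): void {
  const requestContextStore = (globalThis as unknown as Record<symbol, unknown>)[
    Symbol.for('@vercel/request-context')
  ] as { get?: () => VercelRequestContext | undefined } | undefined;

  requestContextStore?.get?.()?.waitUntil?.(task);
}
```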

In the future we could think about the following:

  • Build a transport that makes use of this instead. (We need to be careful though, since using a transport would only send what is currently not in event processing.)
  • Add a signal handler that calls waitUntil(flush()) when we assume the lambda will shut down soon; a sketch follows this list. (Potential drawback: the signal handler could fire while a request is still active, losing telemetry for that request.)
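
A rough sketch of that signal-handler idea (hypothetical, not part of this PR; vercelWaitUntil and flushSafelyWithTimeout are the helpers this PR introduces, and the import path is made up):

```ts
// Hypothetical sketch: flush when the runtime signals shutdown, and
// ask Vercel to keep the lambda alive until the flush settles.
// Caveat from above: this can fire while a request is still active.
import { vercelWaitUntil, flushSafelyWithTimeout } from './utils'; // hypothetical path

process.on('SIGTERM', () => {
  vercelWaitUntil(flushSafelyWithTimeout());
});
```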

github-actions bot commented May 27, 2024

size-limit report 📦

| Path | Size |
| --- | --- |
| @sentry/browser | 21.74 KB (0%) |
| @sentry/browser (incl. Tracing) | 32.75 KB (0%) |
| @sentry/browser (incl. Tracing, Replay) | 68.22 KB (0%) |
| @sentry/browser (incl. Tracing, Replay) - with treeshaking flags | 61.64 KB (0%) |
| @sentry/browser (incl. Tracing, Replay with Canvas) | 72.27 KB (0%) |
| @sentry/browser (incl. Tracing, Replay, Feedback) | 84.33 KB (0%) |
| @sentry/browser (incl. Feedback) | 37.75 KB (0%) |
| @sentry/browser (incl. sendFeedback) | 26.32 KB (0%) |
| @sentry/browser (incl. FeedbackAsync) | 30.74 KB (0%) |
| @sentry/react | 24.43 KB (0%) |
| @sentry/react (incl. Tracing) | 35.75 KB (0%) |
| @sentry/vue | 25.68 KB (0%) |
| @sentry/vue (incl. Tracing) | 34.56 KB (0%) |
| @sentry/svelte | 21.88 KB (0%) |
| CDN Bundle | 24.28 KB (0%) |
| CDN Bundle (incl. Tracing) | 34.22 KB (0%) |
| CDN Bundle (incl. Tracing, Replay) | 68.04 KB (0%) |
| CDN Bundle (incl. Tracing, Replay, Feedback) | 73.05 KB (0%) |
| CDN Bundle - uncompressed | 71.46 KB (0%) |
| CDN Bundle (incl. Tracing) - uncompressed | 101.58 KB (0%) |
| CDN Bundle (incl. Tracing, Replay) - uncompressed | 211.48 KB (0%) |
| CDN Bundle (incl. Tracing, Replay, Feedback) - uncompressed | 223.86 KB (0%) |
| @sentry/nextjs (client) | 35.1 KB (0%) |
| @sentry/sveltekit (client) | 33.35 KB (0%) |
| @sentry/node | 114.6 KB (-0.01% 🔽) |
| @sentry/aws-serverless | 103.29 KB (-0.01% 🔽) |
| @sentry/browser (incl. Tracing, Replay, Feedback, metrics) | 85.8 KB (added) |
| @sentry/browser (incl. metrics) | 23.13 KB (added) |

@lforst lforst marked this pull request as ready for review May 27, 2024 09:07
@lforst lforst requested review from mydea, Lms24 and andreiborza May 27, 2024 09:07
@Lms24 (Member) left a comment


Discussed this offline, the approach LGTM! Great work!

```diff
  // In case this is being run as part of a serverless function (as is the case with the server half of nextjs apps
  // deployed to vercel), make sure the error gets sent to Sentry before the lambda exits.
- await flushQueue();
+ vercelWaitUntil(flushSafelyWithTimeout());
```
Member

hmm, could we possibly still encapsulate this in some flushQueue util or so? It feels a bit "dirty" to have all these vercelXXX calls peppered throughout the code base; I think I'd prefer it if we could abstract this away a bit 🤔

Member Author

tbh it feels dirtier to me to entangle the Vercel API with our own intentions. vercelWaitUntil is basically just the vendored waitUntil. Also, it's not like this is hard to read.

Member

We literally call vercelWaitUntil(flushSafelyWithTimeout()); every time we use this, as far as I can see? 😅

I think it would be nicer to still just call flushQueue everywhere, which can now literally be function flushQueue() { vercelWaitUntil(flushSafelyWithTimeout()); }. If we need to adjust this logic later (e.g. it does more or less), we can encapsulate the change there instead of repeating it everywhere; we call this 9 times now.

But no strong feelings, if you prefer it this way all good for me :)
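
(For reference, the encapsulation suggested above amounts to this sketch, reusing the PR's helpers; hypothetical, since the PR ultimately keeps the inline calls:)

```ts
import { vercelWaitUntil, flushSafelyWithTimeout } from './utils'; // hypothetical path

// Hypothetical wrapper per the suggestion above: one helper so the
// nine call sites don't repeat the waitUntil + flush pair.
export function flushQueue(): void {
  vercelWaitUntil(flushSafelyWithTimeout());
}
```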

Member Author

I don't see the upside so I'll leave it as is.

@lforst lforst changed the title feat(nextjs): Use Vercel's waitUntil to defer freezing of function feat(nextjs): Use Vercel's waitUntil to defer freezing of Vercel Lambdas May 27, 2024
Co-authored-by: Francesco Novy <francesco.novy@sentry.io>