feat(middleware): implement onboarding step cache #2682
base: dev
Conversation
This solution is intentionally very simplistic and does not rely on Redis. That way it saves another "round-trip" on every single request. This choice was made because the middleware should be as fast as possible.
lgtm, please wait for @Meierschlumpf since he has more experience with Next.js.
Switch to using Redis to cache this value (this increases round-trip time, but it should still be faster than tRPC).
I think the round-trip time should be pretty similar, no?
I believe that Redis cache interactions are near-instantaneous, since the connection is kept alive, whereas tRPC queries the DB again.
@manuel-rw b7ba3c6 is an optional commit that tries to also cache the "Culture" attribute for a user. When making these changes in dev mode, I saw a very significant improvement in the time taken to switch pages: from ~450 ms to under 50 ms with the cached values. I believe this will also enhance prod significantly.
lgtm, please wait for Meier
I would suggest that we use the new experimental feature for Node middleware, because then we can query the DB directly and don't need to use fetch for it. For this you'll need 15.2+, which is available once #2701 is merged: https://nextjs.org/docs/app/building-your-application/routing/middleware#runtime
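For context, opting into the Node.js middleware runtime looks roughly like this (a sketch based on the linked docs; the `nodeMiddleware` flag name and the `runtime` export should be verified against your Next.js version):

```typescript
// next.config.ts -- opt in to the experimental Node.js middleware runtime
// (flag name is an assumption based on the Next.js 15.2 docs)
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  experimental: {
    nodeMiddleware: true,
  },
};

export default nextConfig;

// middleware.ts -- then request the Node.js runtime so direct DB/Redis
// access is possible instead of going through fetch:
//
//   export const config = { runtime: "nodejs" };
```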
Then we probably also won't need a cache.
I don't understand how that will fix this. Currently our middleware is running on Node; since we deploy as a standalone package, we are never using an edge server. Querying the DB will still have a similar latency (±100 ms). We could, however, use an existing Redis singleton connection to reach into Redis directly.
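The singleton idea mentioned above could be sketched like this (a hypothetical illustration, not the project's code; a real implementation would create e.g. an ioredis client in the factory, while here a stub stands in so the pattern itself is shown):

```typescript
// Hypothetical sketch: a lazily created, module-scoped Redis singleton.
// The module scope keeps the instance (and its connection) alive across
// middleware invocations, so only the first request pays the setup cost.
type RedisLike = { get(key: string): Promise<string | null> };

let client: RedisLike | undefined; // module scope: survives across requests

export function getRedis(factory: () => RedisLike): RedisLike {
  // Later calls ignore the factory and reuse the existing instance.
  if (!client) {
    client = factory();
  }
  return client;
}
```

In the middleware, `getRedis(...)` would then replace a per-request connection, which is what makes the Redis round-trip cheap compared to re-querying the DB.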
No, middleware did not support Node packages before, so it should bring quite an improvement. Of course we can discuss whether we want to cache it further once we have tried the Node middleware.
Node middleware is currently only available in canary |
Storing it with the CSRF token will result in the stale cache still existing as long as you don't log in / log out again, so it's not really a good way to cache it. I would consider waiting for nodeMiddleware and using
Decision taken
Use `let` to define a variable that lives in the middleware (kept alive by Next.js, since we are not deploying our API to functions).
Tests
Reduced the time between each page switch by about 45 ms on my machine.
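The decision above can be sketched roughly as follows (an illustrative sketch only: the names `onboardingCache`, `getCachedStep`, `setCachedStep`, and the 30-second TTL are assumptions, not the PR's actual code; the TTL bounds the staleness concern raised earlier in the thread):

```typescript
// Illustrative sketch: a module-scoped cache that lives as long as the
// middleware process does. This works because the app is deployed as a
// standalone package, not as per-request serverless functions.
type CacheEntry = { step: string; expiresAt: number };

const TTL_MS = 30_000; // assumed TTL to bound how stale the cache can get

// `let` mirrors the PR description; the Map persists across requests.
let onboardingCache: Map<string, CacheEntry> = new Map();

export function getCachedStep(
  userId: string,
  now: number = Date.now(),
): string | undefined {
  const entry = onboardingCache.get(userId);
  if (!entry || entry.expiresAt <= now) {
    onboardingCache.delete(userId); // drop expired entries
    return undefined; // caller falls back to the tRPC/DB lookup
  }
  return entry.step;
}

export function setCachedStep(
  userId: string,
  step: string,
  now: number = Date.now(),
): void {
  onboardingCache.set(userId, { step, expiresAt: now + TTL_MS });
}
```

A cache hit avoids the network round-trip entirely, which matches the reported drop in page-switch time; a miss (or expiry) costs one extra map lookup before the usual fetch.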