[DO NOT MERGE] feat: switch to memcache-client over memcached #6119
base: main
Conversation
New dependencies added: memcache-client (Author: Joel Chen; Description: NodeJS memcached client; Homepage: https://github.com/electrode-io/memcache#readme)
client.on(event, () => verbose(`[Cache] ${event}`))
console.log("hey")
const newClient = new MemcacheClient({
  server: { server: MEMCACHED_URL, maxConnections: MEMCACHED_MAX_POOL },
Not sure if `maxConnections` is the pool config here; there doesn't seem to be any connection pool config in the new lib.
It does seem to be an internal maximum on the number of connections made by this client. (I was worried it might be a server option due to its location.)
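For reference, a minimal sketch of the construction, assuming memcache-client's `server` option accepts either a "host:port" string or an object whose `maxConnections` caps the client-side connection count (the import style and the concrete values below are assumptions, not the repo's actual config):

```ts
// Named-export import is an assumption; depending on the package version this
// may need to be a default import instead.
import { MemcacheClient } from "memcache-client"

// Placeholder values for illustration; the real code reads these from env.
const MEMCACHED_URL = "localhost:11211"
const MEMCACHED_MAX_POOL = 10

// When `server` is an object, `maxConnections` appears to be a client-side cap
// on connections to that server, not a memcached server setting.
const client = new MemcacheClient({
  server: { server: MEMCACHED_URL, maxConnections: MEMCACHED_MAX_POOL },
})
```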
})
VerboseEvents.forEach((event) => {
  client.on(event, () => verbose(`[Cache] ${event}`))
  console.log("hey")
was just making sure this was a singleton
👋
@@ -108,15 +98,18 @@ function _get<T>(key) {
        reject(err)
      } else if (data) {
        if (CACHE_COMPRESSION_DISABLED) {
-         resolve(JSON.parse(data.toString()))
+         resolve(JSON.parse(data.value.toString()))
One API change: the returned data is an object with some metadata; `value` contains the actual cached object.
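A hedged sketch of what that means for the `_get` wrapper (the promise-based call and the import style are assumptions; the `value` field comes straight from the diff above):

```ts
import { MemcacheClient } from "memcache-client" // named export is an assumption

const client = new MemcacheClient({ server: "localhost:11211" })

// memcached's get handed the cached payload back directly; memcache-client's
// result is an object carrying metadata, with the payload under `value`
// (hence data.value.toString() in the diff).
async function getJson<T>(key: string): Promise<T | undefined> {
  const data = await client.get(key)
  if (!data) return undefined
  return JSON.parse(data.value.toString()) as T
}
```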
@@ -146,7 +139,7 @@ function _set(key, data, options: CacheOptions) {
  return new Promise<void>((resolve, reject) => {
    const payload = JSON.stringify(data)
    verbose(`CACHE SET: ${cacheKey(key)}: ${payload}`)
-   client.set(cacheKey(key), payload, cacheTtl, (err) => {
+   client.set(cacheKey(key), payload, { lifetime: cacheTtl }, (err) => {
Slight difference for `set`: the TTL is passed as `lifetime` in an options object instead of as a positional argument.
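A self-contained sketch of the new call shape (values and the import style are placeholders/assumptions; `lifetime` itself comes from the diff):

```ts
import { MemcacheClient } from "memcache-client" // named export is an assumption

const client = new MemcacheClient({ server: "localhost:11211" })

async function setJson(key: string, data: unknown, ttlSeconds: number): Promise<void> {
  const payload = JSON.stringify(data)
  // memcached took the TTL positionally: client.set(key, payload, ttlSeconds, cb)
  // memcache-client takes it as `lifetime` (seconds) in an options object:
  await client.set(key, payload, { lifetime: ttlSeconds })
}
```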
@@ -74,6 +74,5 @@ export function createCacheTracer() {
  return {
    get: createCommand("get"),
    set: createCommand("set"),
-   delete: createCommand("delete"),
I didn't see any use case in the codebase outside of tests for `delete`, and note that we renamed `client.del` to `delete`. That seems non-standard, since both clients implement `del`, and `del` is also what the rate limit middleware expects - https://github.com/express-rate-limit/rate-limit-memcached?tab=readme-ov-file#client

So perhaps the `del` part of the rate limiter, whatever it was, never quite worked? Then again, that middleware is disabled anyway.
resolved "https://registry.yarnpkg.com/memcache-parser/-/memcache-parser-1.0.1.tgz#66b8434e1bccd79a6da4bfe5fec41e5aa6f585b8" | ||
integrity sha512-F5TjjHKUNMCWTVfiJD+7M9ecNbBdnK6xifYZ1WL8PoQ/dP9sB37VQV8hPtyuUqJJFj76qY70Ezj9saQ0gw2FJg== | ||
|
||
memcached@^2.2.2: |
Looks like the rate limit middleware still brings in `memcached`... maybe we should just clean that middleware up?
I dug through Slack for some history and couldn't figure out the state of this, or whether it was ever enabled. Ideally, `rate-limit-memcached` should be removed (and re-added if/when needed). I also wonder if the presence of both libraries could interfere with Datadog's tracing. Which reminds me: we must confirm that Datadog automatically instruments this alternate library.
If we do remove `rate-limit-memcached`, we can also remove the mock `increment` implementation in cache.ts above.
This just seems to work, at least locally, so we can iterate on the right options (`cmdTimeout` and such). Going with do-not-merge for now since we want to make sure we're using the right configs/variables in a tunable way, monitor carefully, etc. - can discuss.
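For when we do tune this, a sketch of making those options env-driven (the env var names and defaults here are placeholders, not what the repo uses; `cmdTimeout` appears to be memcache-client's per-command timeout in ms):

```ts
import { MemcacheClient } from "memcache-client" // named export is an assumption

// Hypothetical env-driven config; variable names and defaults are placeholders.
const MEMCACHED_URL = process.env.MEMCACHED_URL ?? "localhost:11211"
const MEMCACHED_MAX_POOL = Number(process.env.MEMCACHED_MAX_POOL ?? 10)
const MEMCACHED_CMD_TIMEOUT_MS = Number(process.env.MEMCACHED_CMD_TIMEOUT_MS ?? 5000)

export const client = new MemcacheClient({
  server: { server: MEMCACHED_URL, maxConnections: MEMCACHED_MAX_POOL },
  // How long (ms) to wait for a command before failing it.
  cmdTimeout: MEMCACHED_CMD_TIMEOUT_MS,
})
```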