
feat: drop cachetools dependency in favor of simple local implementation #1590

Open · wants to merge 1 commit into main from drop-cachetools-dep

Conversation

@akx (Contributor) commented Aug 26, 2024

This PR drops the dependency on cachetools, which was only used for its LRU cache class, in favor of a simple local implementation.

This should have a small but positive effect on many downstream users, given how often this library is downloaded.
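
For context, a dict-like LRU cache of the kind cachetools.LRUCache provides can be built in a few lines on top of collections.OrderedDict. The sketch below only illustrates the approach; the class name, default size, and methods shown are assumptions, not necessarily the code in this PR.

import collections

class _LRUCache:
    """Minimal dict-like LRU cache (illustrative sketch, not the PR's actual code)."""

    def __init__(self, maxsize=64):
        self._maxsize = maxsize
        self._data = collections.OrderedDict()

    def __contains__(self, key):
        # Supports the "if audience in cache:" style checks the client code relies on.
        return key in self._data

    def get(self, key, default=None):
        if key not in self._data:
            return default
        # Mark the entry as most recently used before returning it.
        self._data.move_to_end(key)
        return self._data[key]

    def __setitem__(self, key, value):
        # Insert or replace a single entry, then evict the least recently
        # used one if the cache has grown past its limit.
        self._data[key] = value
        self._data.move_to_end(key)
        if len(self._data) > self._maxsize:
            self._data.popitem(last=False)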

@akx akx requested review from a team as code owners August 26, 2024 11:01
@akx akx force-pushed the drop-cachetools-dep branch from 9d4fb8d to 9ca2c14 Compare August 26, 2024 11:02
@clundin25 (Contributor) commented:

Thank you @akx. Is using the standard library a reasonable alternative? https://docs.python.org/3/library/functools.html#functools.lru_cache

@akx (Contributor, Author) commented Sep 3, 2024

@clundin25 Unfortunately not. An lru_cache-wrapped function hides its cache behind a plain function interface, which is not compatible with how the client code here does things like if foo in cache:.
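
A hedged illustration of the mismatch (fetch_token is an invented name, not the client code's real function): functools.lru_cache only exposes aggregate introspection, so there is no supported way to test whether a specific key is cached, read one entry without triggering the wrapped call, or overwrite a single entry.

import functools

@functools.lru_cache(maxsize=16)
def fetch_token(audience):  # invented example for illustration only
    ...

fetch_token.cache_info()   # only aggregate stats: hits, misses, current size
fetch_token.cache_clear()  # all-or-nothing; no way to evict or replace one key
# There is no equivalent of "if audience in cache:" or "cache[audience] = value".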

@akx (Contributor, Author) commented Sep 11, 2024

@clundin25 Is there anything I can do to help this move along?

@westarle commented:

How big is the change to update the client code so it can use the standard library and avoid maintaining this implementation?

/cc @sai-sunder-s

@akx akx force-pushed the drop-cachetools-dep branch from 9ca2c14 to e2364a3 Compare October 7, 2024 10:10
@akx (Contributor, Author) commented Oct 7, 2024

> How big is the change to update the client code so it can use the standard library and avoid maintaining this implementation?

Sorry, I wasn't sure whether that was directed at me or the person cc'd.

Not trivial, since the client code depends on being able to introspect a cache entry before deciding whether it's usable. lru_cache does not expose such a mechanism (nor a way to replace a single key in the cache, which is also required).

token, expiry = self._cache.get(audience, (None, None))
if token is None or expiry < _helpers.utcnow():
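
On a miss, the client code then needs to overwrite just that one audience's entry while leaving the rest of the cache intact. A hedged sketch of that write (the refresh call is an assumed helper name used for illustration):

    token, expiry = self._refresh_token(audience)  # assumed helper name, for illustration
    self._cache[audience] = (token, expiry)        # per-key replacement; lru_cache offers no equivalent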
