
[Feature Request] Support custom cache control #82

Closed
SukkaW opened this issue Jul 17, 2020 · 11 comments · Fixed by #117

Comments

@SukkaW
Contributor

SukkaW commented Jul 17, 2020

Is your feature request related to a problem? Please describe.

Regarding the rate limiter, it would be better to have a customizable cache-control. For example, I would use a longer cache-control for the repo card (extra pin). It would also help people trying to avoid rate limiting.

Describe the solution you'd like

Add a parameter &cache_control=<maxage>, but the minimum maxage should remain 1800 seconds to prevent abuse.

Additional context

Shields.IO supports a ?cacheSeconds= parameter and uses 3600 seconds as the default cache-control.
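The proposal above amounts to clamping the requested value to a floor. A minimal sketch (the function name and defaults here are illustrative, not the project's actual implementation):

```javascript
// Hypothetical handler for the proposed parameter: parse the requested
// value and clamp it to an 1800-second (30-minute) floor to avoid abuse.
function resolveCacheSeconds(requested, defaultSeconds = 1800, minSeconds = 1800) {
  const parsed = parseInt(requested, 10);
  if (Number.isNaN(parsed)) return defaultSeconds; // missing/invalid -> default
  return Math.max(parsed, minSeconds);             // never below the floor
}
```

An upper bound, if one were wanted, could be applied the same way with Math.min.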

@SukkaW
Contributor Author

SukkaW commented Jul 17, 2020

I could help implement this feature, building on #58.

@anuraghazra
Owner

Hi @SukkaW, thanks for requesting the feature. Yes, I think it would be a good feature, but here are some things to note if you are worried about rate limiting:

  1. We currently have 8 PATs, which means we can handle 40k requests per hour (8 × 5,000 requests/hour per token)
  2. The base cache is 30 minutes, so each user triggers at most 2 requests per hour
  3. The GitHub API limit resets every hour, and we've set the cache to 30 minutes

So I think rate limiting isn't a big issue for now 🤔

@SukkaW
Contributor Author

SukkaW commented Jul 17, 2020

> So I think rate limiting isn't a big issue for now 🤔

Speed is also a concern.

GitHub Camo uses the Fastly CDN and respects the cache-control header from the origin image server. If I could control cache-control, I would be able to keep the Repo Card or Stats Card cached at Fastly for longer.

@filiptronicek
Contributor

I think it would be a nice feature. We could let users set the cache in minutes; my suggestion would be a minimum of 30 (as @SukkaW suggested) and a maximum of 720 (12 hours). This would also help with really large repos, or if someone very popular adopted this code. As @SukkaW noted earlier, speed and performance should also be a big concern. If someone has a private instance, it is really easy to hit the rate limit if they are, for example, updating their README and previewing the results in real time.

With the repo cards, we could implement some sort of "auto cache" feature: if a repo has more than 1K stars/forks, the count doesn't need to be updated as frequently, because the numbers are truncated.

@anuraghazra
Owner

I like @filiptronicek's suggestion to make the cache longer if the repo card has over 1k stars.

@SukkaW
Contributor Author

SukkaW commented Jul 17, 2020

> if the repo card has over 1k stars

We should set a longer cache only when both stars and forks are over 1k.

@filiptronicek
Contributor

> if the repo card has over 1k stars
>
> We should set a longer cache only when both stars and forks are over 1k.

Agreed; I wanted to write that in the original post but forgot. We could eventually find a better balance between the two, since on some repos stars and forks grow together, but on most, forks don't climb that steadily. For now, though, we should only check that both are over a thousand.
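The rule agreed on here could be sketched as follows (the names and the longer-cache value are assumptions for illustration, not the project's code):

```javascript
// Hypothetical "auto cache": when both stars and forks exceed 1k, the
// displayed counts are truncated (e.g. "1.2k"), so slightly stale data
// is invisible and a longer cache is safe.
const BASE_CACHE_SECONDS = 1800;      // 30 minutes (the default)
const LONG_CACHE_SECONDS = 4 * 3600;  // 4 hours, an assumed value

function autoCacheSeconds(stars, forks) {
  return stars > 1000 && forks > 1000 ? LONG_CACHE_SECONDS : BASE_CACHE_SECONDS;
}
```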

@nombrekeff
Contributor

I like that idea, @filiptronicek, sounds really reasonable. There is no need to update if the value is truncated.

@anuraghazra
Owner

@SukkaW you can now set a custom cache with the ?cache_seconds=NUMBER parameter :shipit:
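For reference, a README using the shipped parameter might look like this (assuming the public Vercel instance; 7200 is just an example value above the 30-minute floor):

```markdown
![GitHub stats](https://github-readme-stats.vercel.app/api?username=anuraghazra&cache_seconds=7200)
```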

@nombrekeff
Contributor

Cool, I'm going to add it to the theme live previews

@nombrekeff
Contributor

Never mind, you already did it 💯
