
Not doing any browser-side caching #570

Open
gitblit opened this issue Aug 12, 2015 · 10 comments

Comments

@gitblit
Collaborator

gitblit commented Aug 12, 2015

Originally reported on Google Code with ID 274

What steps will reproduce the problem?
1. $ curl -I "https://git.wikimedia.org/"

What is the expected output? What do you see instead?
It'd be nice if Gitblit would allow the browser to cache some of the output for a while rather than re-sending it. We don't want to cache for too long, but some period of time would be nice. Right now we issue "Pragma: no-cache" / "Cache-Control: no-cache, max-age=0, must-revalidate".

What version of the product are you using? On what operating system?
1.3.1-SNAPSHOT, Ubuntu 12.04.2 LTS

Please provide any additional information below.
Downstream bug: https://bugzilla.wikimedia.org/49371

Reported by chorohoe@wikimedia.org on 2013-07-17 23:26:35

@gitblit
Collaborator Author

gitblit commented Aug 12, 2015

So what would be reasonable, in your opinion?

Reported by James.Moger on 2013-07-17 23:32:42

@gitblit
Collaborator Author

gitblit commented Aug 12, 2015

That's a good question; I was mainly just passing the issue along. I'll ask on my end and see if anyone has any good ideas :)

Reported by chorohoe@wikimedia.org on 2013-07-17 23:34:42

@gitblit
Collaborator Author

gitblit commented Aug 12, 2015

Hi Chad,

The good news.
I went ahead and implemented some page caching, and it is immediately available on master. It is disabled by default.

You can enable it by setting web.pageCacheExpires=5 for a 5-minute expiration; all expirations are in minutes, not seconds or milliseconds. The algorithm uses a last-modified date of the resource relevant to the page request.

Cache-Control: private, must-revalidate
Last-Modified: based on requested resource, minimum of Gitblit boot date
Expires: now + expires_setting
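
The scheme above can be sketched as a small pure function; the class and method names here are illustrative, not Gitblit's actual implementation:

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;
import java.util.LinkedHashMap;
import java.util.Map;

public class PageCacheHeaders {

    // HTTP date headers use the RFC 1123 format, with GMT for a zero offset.
    static final DateTimeFormatter HTTP_DATE =
            DateTimeFormatter.RFC_1123_DATE_TIME.withZone(ZoneOffset.UTC);

    // Builds the three headers: Last-Modified is the resource's modification
    // time, clamped to no earlier than the Gitblit boot date, and Expires is
    // "now" plus the configured expiration in minutes.
    static Map<String, String> build(Instant bootDate, Instant resourceModified,
                                     Instant now, int expiresMinutes) {
        Instant lastModified = resourceModified.isBefore(bootDate)
                ? bootDate : resourceModified;
        Map<String, String> headers = new LinkedHashMap<>();
        headers.put("Cache-Control", "private, must-revalidate");
        headers.put("Last-Modified", HTTP_DATE.format(lastModified));
        headers.put("Expires", HTTP_DATE.format(now.plusSeconds(expiresMinutes * 60L)));
        return headers;
    }

    public static void main(String[] args) {
        // Resource modified before boot, so Last-Modified clamps to the boot date.
        Map<String, String> h = build(
                Instant.parse("2013-07-19T00:00:00Z"),
                Instant.parse("2013-07-01T00:00:00Z"),
                Instant.parse("2013-07-19T15:00:00Z"), 5);
        h.forEach((k, v) -> System.out.println(k + ": " + v));
    }
}
```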

The bad news.
This works great for Firefox and IE, BUT Chrome/WebKit _always_ requests a new page from the server instead of using the cache, and I cannot figure out why. Since you guys are web page experts, maybe you can tell me what is wrong. :) Maybe it is a bug in Chrome. Maybe it's a bug in Gitblit. Either way, this is either an improvement or at least not a regression.

Feedback requested.

Reported by James.Moger on 2013-07-19 15:24:33

  • Labels added: Milestone-1.3.1

@gitblit
Collaborator Author

gitblit commented Aug 12, 2015

Ok. I lied. Chrome is fulfilling from the cache as long as the URL is NOT localhost. Weird. Ok, I think everyone is happy now. Give it a go and let me know how this works for you.

Reported by James.Moger on 2013-07-19 15:28:09

@gitblit
Collaborator Author

gitblit commented Aug 12, 2015

Fix or change deployed in 1.3.1

Reported by James.Moger on 2013-07-24 19:56:32

@gitblit
Collaborator Author

gitblit commented Aug 12, 2015

Reported by James.Moger on 2013-07-24 19:57:04

  • Status changed: Fixed

@gitblit
Collaborator Author

gitblit commented Aug 12, 2015

Issue 590 has been merged into this issue.

Reported by James.Moger on 2013-08-19 11:48:26

@gitblit
Collaborator Author

gitblit commented Aug 12, 2015

Re-opened for discussion with Wikimedia folks on how to improve caching since the current
strategy is not ideal for their setup:

https://bugzilla.wikimedia.org/show_bug.cgi?id=49371

Reported by James.Moger on 2013-08-19 11:50:02

  • Status changed: Accepted
  • Labels added: Milestone-1.4.0
  • Labels removed: Milestone-1.3.1

@gitblit
Collaborator Author

gitblit commented Aug 12, 2015

At Wikimedia, I think what we might want to do is to set far-future Expires headers on all Gitblit pages, and to write a Gerrit plug-in that runs whenever a patch is merged and purges from the cache (Varnish, in our case) any views that need to be updated in light of that merge. Does that seem viable? Is the set of URLs that need to be purged finite and small?

Reported by olivneh@wikimedia.org on 2014-02-16 02:38:12
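
A Gerrit plug-in along those lines would issue an HTTP PURGE request to Varnish for each affected URL. A minimal sketch of building such a request with Java 11's HttpClient; note that PURGE is a Varnish convention rather than a standard HTTP method, the site's VCL must be configured to accept it, and the host name below is a placeholder:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class VarnishPurge {

    // Builds a PURGE request for one cached URL. Kept separate from sending
    // so it can be inspected without a live Varnish instance.
    static HttpRequest purgeRequest(String url) {
        return HttpRequest.newBuilder(URI.create(url))
                .method("PURGE", HttpRequest.BodyPublishers.noBody())
                .build();
    }

    // Sends the purge; the status code returned for a successful eviction
    // depends on the site's VCL.
    static int purge(HttpClient client, String url) throws Exception {
        return client.send(purgeRequest(url),
                HttpResponse.BodyHandlers.discarding()).statusCode();
    }

    public static void main(String[] args) {
        // cache.example.org stands in for the Varnish frontend.
        HttpRequest req = purgeRequest("http://cache.example.org/summary?r=test.git");
        System.out.println(req.method() + " " + req.uri());
    }
}
```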

@gitblit
Collaborator Author

gitblit commented Aug 12, 2015

Seems viable.

"Is the set of URLs that need to be purged finite and small?"
This is the list of the base URLs along with their normal parameters.

mount("/repositories", RepositoriesPage.class);
mount("/overview", OverviewPage.class, "r", "h");
mount("/summary", SummaryPage.class, "r");
mount("/reflog", ReflogPage.class, "r", "h");
mount("/commits", LogPage.class, "r", "h");
mount("/log", LogPage.class, "r", "h");
mount("/tags", TagsPage.class, "r");
mount("/branches", BranchesPage.class, "r");
mount("/commit", CommitPage.class, "r", "h");
mount("/tag", TagPage.class, "r", "h");
mount("/tree", TreePage.class, "r", "h", "f");
mount("/blob", BlobPage.class, "r", "h", "f");
mount("/raw", RawPage.class, "r", "h", "f");
mount("/blobdiff", BlobDiffPage.class, "r", "h", "f");
mount("/commitdiff", CommitDiffPage.class, "r", "h");
mount("/compare", ComparePage.class, "r", "h");
mount("/patch", PatchPage.class, "r", "h", "f");
mount("/history", HistoryPage.class, "r", "h", "f");
mount("/search", GitSearchPage.class);
mount("/metrics", MetricsPage.class, "r");
mount("/blame", BlamePage.class, "r", "h", "f");
mount("/docs", DocsPage.class, "r");
mount("/doc", DocPage.class, "r", "h", "f");
mount("/activity", ActivityPage.class, "r", "h");
mount("/project", ProjectPage.class, "p");
mount("/projects", ProjectsPage.class);
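
As a rough illustration of how small the per-push purge set is, the repository- and ref-keyed pages above could be enumerated like this. The query-parameter form (?r=...&h=...) is an assumption for illustration; Gitblit/Wicket may encode the mounted parameters differently, and pages that also take a path parameter "f" (/tree, /blob, /history, ...) scale with the repository tree, so they are better handled by a prefix ban than by an explicit list:

```java
import java.util.ArrayList;
import java.util.List;

public class PurgeUrls {

    // Pages keyed only by repository ("r").
    static final String[] REPO_PAGES = {
        "/summary", "/tags", "/branches", "/metrics", "/docs"
    };

    // Pages keyed by repository and ref ("r", "h").
    static final String[] REF_PAGES = {
        "/overview", "/reflog", "/commits", "/log", "/activity"
    };

    // Enumerates the page URLs affected by a push to one branch of one repo.
    static List<String> purgeList(String repo, String branch) {
        List<String> urls = new ArrayList<>();
        for (String page : REPO_PAGES) {
            urls.add(page + "?r=" + repo);
        }
        for (String page : REF_PAGES) {
            urls.add(page + "?r=" + repo + "&h=" + branch);
        }
        // Global listing pages change on any push.
        urls.add("/repositories");
        urls.add("/projects");
        return urls;
    }

    public static void main(String[] args) {
        purgeList("test.git", "master").forEach(System.out::println);
    }
}
```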

Reported by James.Moger on 2014-02-16 21:11:16

  • Labels removed: Milestone-1.4.0
