Add a web cache in front of XWiki?

I was looking for web caches and proxies and found http://www.squid-cache.org/

Then, doing some further research on the subject, I stumbled upon this Server Fault question: proxy - Why would I use Squid? - Server Fault, which mentions Wikipedia using a web cache:

For instance, Wikipedia uses caches (squid and others) as a content accelerator, so that every hit on a page doesn’t make it be regenerated from the database through a lot of PHP. When pages are changed, the cache entry is invalidated.

Is it worth it to add a separate web cache (Squid or another one) in front of XWiki? Has it ever been tried?

I remember having read the word cache several times, but searching for that term both on this forum and on xwiki.org didn’t lead me to significant results. I only found:

  • a Page Cache mentioned (here) (a quite rich page, but aimed more at performance in general than at page caching specifically)
  • a configurable Document Cache
  • a Cache Macro and Module

Hello @watery,

It can definitely be done. For instance, xwiki.org is served through a reverse proxy.

As far as I know, we don’t have specific documentation for Squid Cache, but we do have some documentation for nginx.
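
To give a rough idea of what that looks like, here is a minimal sketch of an nginx reverse proxy in front of a Tomcat-hosted XWiki (the server name, port and paths are placeholders, not taken from any real setup):

    # Minimal reverse-proxy sketch: nginx on port 80 forwarding everything
    # to a local Tomcat running XWiki on port 8080 (both are assumptions).
    server {
        listen 80;
        server_name wiki.example.com;    # placeholder

        location / {
            proxy_pass http://127.0.0.1:8080;
            # Pass the original host and client address on to XWiki so that
            # generated links and access logs stay correct.
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
        }
    }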

Hope that helps.

Thanks. I don’t know nginx, but I have some experience with Apache HTTPd. Reading the page you linked, it looks like mostly static data (JavaScript, images, and the like) is being cached, plus some content is being compressed; am I right?
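
If I read it correctly, the relevant parts boil down to something like this (just my tentative reading, with made-up paths and durations; the lines would go inside the server block):

    # Let browsers cache XWiki's static resources for a long time and
    # compress text content on the fly. Paths and durations are made up.
    location ~ ^/xwiki/(resources|skins)/ {
        proxy_pass http://127.0.0.1:8080;
        expires 30d;    # far-future caching for JS, CSS, images
    }

    gzip on;
    gzip_types text/css application/javascript application/json text/plain;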

We’re serving XWiki straight off a Tomcat instance to our intranet, hence I was more curious about caching whole pages, above all those using macros like the Jira Macro, which query another system. I don’t know whether that’s even possible, of course.

I’m back here without having done any testing yet; I’m still in the information-gathering phase.

Though it’s not yet clear to me, it looks like Squid is more of a forward proxy, while a reverse proxy should be a better fit here. One of those is Varnish: has anyone tried putting it in front of XWiki?

Is there any authentication or authorization info being exchanged between the browser and the XWiki instance?
I’ve only seen the login cookie being sent from the client when viewing the home page, but maybe I’m missing something.

I’m thinking about how to handle caching the same page when it is accessed by two users: one who has the rights to view it (e.g. because he’s part of a specific group) and another who cannot view it (because she belongs to a different group).
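
The simplest approach I can think of, sketched in nginx terms since that’s what’s documented for XWiki (the cookie names are an assumption on my part), would be to cache pages only for anonymous visitors and to send any request carrying a session or login cookie straight to Tomcat, so XWiki itself keeps enforcing the rights:

    # Cache rendered pages for anonymous visitors only; any request with a
    # session/login cookie bypasses the cache and is never stored in it.
    # Cookie names (JSESSIONID, username) are assumptions to verify.
    proxy_cache_path /var/cache/nginx/xwiki keys_zone=xwiki:10m inactive=10m;   # goes in the http block

    location /xwiki/bin/view/ {
        proxy_pass http://127.0.0.1:8080;
        proxy_cache xwiki;
        proxy_cache_valid 200 5m;
        proxy_cache_bypass $cookie_JSESSIONID $cookie_username;
        proxy_no_cache     $cookie_JSESSIONID $cookie_username;
    }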

Or maybe there are REST API resources (here) that can be used to check whether the user is allowed to see a particular page?
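
If such a resource exists and honours view rights, an alternative to bypassing the cache for logged-in users could be to let the proxy ask XWiki on every request whether the current user may see the page, and only then serve the cached copy; nginx’s auth_request module seems meant for that. A rough sketch of what I have in mind (the REST path, the wiki name "xwiki", the flat Space/Page URL shape, and the assumption that the REST API answers 401 for forbidden pages all need to be verified):

    # Serve pages from one shared cache entry, but gate every request with a
    # rights check against XWiki's REST API: auth_request allows the request
    # on a 2xx subrequest response and denies it on 401/403.
    location /xwiki/bin/view/ {
        auth_request /_xwiki_rights_check;

        proxy_pass http://127.0.0.1:8080;
        proxy_cache xwiki;              # cache zone from the earlier sketch
        proxy_cache_valid 200 5m;
    }

    location = /_xwiki_rights_check {
        internal;
        # Map the original view URL onto the matching REST resource.
        # Assumes flat /xwiki/bin/view/<Space>/<Page> URLs and a wiki named
        # "xwiki"; nested pages would need a smarter rewrite, and URLs that
        # don't match the pattern still need handling.
        set $rest_uri "";
        if ($request_uri ~ ^/xwiki/bin/view/([^/]+)/([^/?]+)) {
            set $rest_uri /xwiki/rest/wikis/xwiki/spaces/$1/pages/$2;
        }
        proxy_pass http://127.0.0.1:8080$rest_uri;
        proxy_pass_request_body off;
        proxy_set_header Content-Length "";
        proxy_set_header Cookie $http_cookie;   # forward the login cookie
    }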