
Re: Caching query strings

Po Ki Chui wrote:
Hi all,

I just installed Squid 3.0 on Red Hat Linux. I am not able to cache any URL with a query string unless the browser cache is clear. I've gone through the forum posts, but still no luck.


============================================================
My squid.conf has the following:

cache allow all
refresh_pattern \?       0       20%     30
refresh_pattern .         0       20%     4320

There's no default "QUERY" cache deny to be commented out, and all I have is cache allow all.

============================================================
The requested page returns an Expires header that has not yet expired:

HTTP/1.x 200 OK
Server: Apache-Coyote/1.1
Expires: Wed, 14 May 2008 00:00:00 GMT
Content-Type: text/html;charset=UTF-8
Date: Mon, 12 May 2008 21:18:35 GMT
X-Cache: MISS from xxx.xxx.com
Via: 1.0 xxx.xxx.com (squid/3.0.STABLE5)
Connection: close
============================================================


Please advise.

We currently recommend the following config, which keeps to the standards: dynamic data is not cached when expiry information is absent, but is cached as long as possible when it is present:

  # part 1: remove the old QUERY acl and the "cache deny QUERY" line.

  # part 2: make these the last two refresh_pattern lines.
  refresh_pattern (/cgi-bin/|\?)  0   0%      0
  refresh_pattern .               0  20%   4320
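For context (a gloss, not part of the recommended config): the fields of a refresh_pattern line are regex, minimum age, percent, and maximum age, with the ages in minutes. The percent is the fraction of an object's age since last modification that it may be considered fresh when the server sends no explicit expiry information.

```
# refresh_pattern <regex> <min-minutes> <percent> <max-minutes>
# Dynamic URLs (cgi-bin paths or query strings): no heuristic freshness,
# so they are cached only when the server supplies explicit expiry info.
refresh_pattern (/cgi-bin/|\?)  0   0%      0
# Everything else: heuristically fresh for 20% of its age, capped at
# 4320 minutes (3 days), when the server sends no expiry info.
refresh_pattern .               0  20%   4320
```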


If that is still not working for a particular site, check whether the site itself permits caching, and if not, why:
  http://www.ircache.net/cgi-bin/cacheability.py
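A quick way to confirm caching end-to-end is to request the same URL twice through the proxy and compare the X-Cache header: a cached object shows MISS on the first request and HIT on the second. A minimal sketch, where the proxy address and test URL are placeholders, not values from this thread:

```shell
# xcache: print the X-Cache status ("MISS" or "HIT") from HTTP response
# headers supplied on stdin.
xcache() {
  grep -i '^X-Cache:' | head -n 1 | awk '{print $2}' | tr -d '\r'
}

# In practice, feed it headers fetched through the proxy, e.g.:
#   curl -s -D - -o /dev/null -x http://localhost:3128 'http://example.com/page?id=1' | xcache
# (run twice; MISS then HIT means the object was cached)

# Demonstrated here on the headers quoted earlier in this thread:
headers='HTTP/1.0 200 OK
X-Cache: MISS from xxx.xxx.com
Via: 1.0 xxx.xxx.com (squid/3.0.STABLE5)'

printf '%s\n' "$headers" | xcache   # prints "MISS"
```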


Amos
--
Please use Squid 2.6.STABLE20 or 3.0.STABLE5
