Re: How to not cache a site?

Jerome Yanga wrote:
Thanks for the quick response, Chris.

Here are my attempts to answer your questions.  :)


I am using the Live HTTP Headers plugin for Firefox. It shows the Cache-Control and Pragma settings below.

http://site_address.com/help/jssamples_start.htm

GET /help/jssamples_start.htm HTTP/1.1
Host: site_address.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.14) Gecko/20080404 Firefox/2.0.0.14
Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Cookie: CFID=1234567890; CFTOKEN=1234567890; SESSIONID=1234567890; __utma=11111111.111111111.111111111.111111111.111111111.3; __utmc=111111111; __utmz=111111111.111111111.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); __utmb=111111111.4.10. 111111111

HTTP/1.x 200 OK
Date: Thu, 05 Jun 2008 23:41:00 GMT
Server: Apache
Last-Modified: Thu, 05 Jun 2008 09:03:27 GMT
Etag: "111111111-111111111-111111111"
Accept-Ranges: bytes
Content-Type: text/html; charset=UTF-8
Cache-Control: no-store, no-cache, must-revalidate, max-age=0
Expires: Thu, 05 Jun 2008 23:41:00 GMT

These two lines ("Cache-Control: no-store", and an Expires with the same time as the request) should stop any (compliant) shared cache from caching the content. Have you modified the refresh_pattern in your squid.conf? (There is a sketch below, after the rest of the headers.)

Vary: Accept-Encoding,User-Agent
Content-Encoding: gzip
Pragma: no-cache
Content-Length: 811
Connection: keep-alive
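
On the refresh_pattern point: a rule carrying the HTTP-violation options can make Squid cache objects despite those headers, so it is worth checking whether your squid.conf has anything like the first line here. This is only a sketch (the pattern and times are made up, and the exact option names vary with the Squid version):

refresh_pattern -i \.htm$ 1440 20% 10080 override-expire ignore-no-cache
refresh_pattern . 0 20% 4320

A rule like the first one tells Squid to ignore the Expires and no-cache headers for .htm pages; depending on the version there are further options along the same lines (ignore-no-store, ignore-private). The second line is the stock catch-all, which honours the headers.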


I purge the cache using a purge command.

#file /cache/usr/bin/purge
/cache/usr/bin/purge: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), for GNU/Linux 2.2.5, dynamically linked (uses shared libs), not stripped

...and the syntax I use is below.

#/cache/usr/bin/purge -n -v -c /etc/squid/cachepurge.conf -p 127.0.0.1:80 -P 1 -e site_address\.com > /var/log/site_address.com_purge.log

I grepped the log created by the command above and found instances of site_address.com objects being deleted. Hence, they are being cached.

Have you checked the headers returned with requests for those objects that are being cached?
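
For example, you could fetch one of those URLs through Squid and look at the reply headers it hands back. A rough sketch with curl (point -x at wherever your Squid listens; your purge command suggests 127.0.0.1:80, while the usual default port is 3128):

# curl -s -D - -o /dev/null -x 127.0.0.1:3128 http://site_address.com/help/jssamples_start.htm

The X-Cache and Via headers that Squid adds will tell you whether it answered from its cache (HIT) or went back to the origin server (MISS).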

I have also reviewed the access.log and found some TCP_MEM_HIT:NONE, TCP_REFRESH_HIT, TCP_IMS_HIT, TCP_HIT, and TCP_REFRESH_MISS entries.

Same story here: have you verified the headers on these objects? Especially the ones that result in TCP_REFRESH_HIT and TCP_IMS_HIT, as (I think) those are requests being revalidated with the origin server.
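
A quick way to summarise those (assuming the native access.log format and the default log location) is something along the lines of:

# grep 'site_address.com/help/' /var/log/squid/access.log | awk '{print $4}' | sort | uniq -c

Field 4 is the result code (TCP_HIT/200 and friends), so this counts how Squid handled the requests for the /help/ URLs.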

I cannot review the store.log as it is disabled.
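
(If it would help, store.log can be switched back on with the cache_store_log directive; assuming the usual log directory:

cache_store_log /var/log/squid/store.log

It is verbose, but it records exactly which objects get swapped into and released from the cache.)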

I shall try the syntax you have provided on the next available downtime.

acl cacheDenyAclName dstdomain .site_address.com
acl otherCacheDenyAclName urlpath_regex ^/help/
cache deny cacheDenyAclName otherCacheDenyAclName
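
One more thought for when you get to it: you can sanity-check the edited file beforehand and then apply it without a full restart (assuming squid is on your PATH and the config is /etc/squid/squid.conf):

# squid -k parse -f /etc/squid/squid.conf
# squid -k reconfigure

The first command only parses the configuration and reports any errors; the second tells the running Squid to reload it.
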
Thanks again, Chris.

Regards,
Jerome

Chris
