
Re: caching for 60 minutes, ignoring any header

My favicon example was meant to simplify the question. The real case is different: I want to cache a set of internally used resources ("favicons", for the sake of simplicity) for 60 minutes.
For a given "favicon", I'd like the following caching policy:
The 60-minute period should start when the first consumer requests the favicon. Let's mark the time of that first request as T0 (T zero). From T0 until T0+60 minutes, the favicon should be considered "fresh", in caching terms. After T0+60 minutes, it should be considered "stale", and Squid should re-fetch it upon the next request. The favicon should be cached even if the origin server explicitly instructed not to cache or store it.

Yes, I know this might be considered bad practice, and perhaps even illegitimate to some readers, but I assure you that the servers providing these responses belong to business partners, and they have approved overriding their caching policy. However, they don't want to change their own configuration, so it's entirely up to me to build the caching layer.
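To make it concrete, here is a rough sketch of the kind of rule I have in mind (the host name and path below are placeholders, and I understand that refresh_pattern measures freshness from the moment Squid stored the object, which in my case should coincide with the first request):

# placeholder host/path; the real regex would match the internal resources
refresh_pattern -i ^http://partner\.example\.com/internal/ 60 0% 60 override-expire override-lastmod ignore-no-store ignore-private ignore-reload

With min and max both set to 60 minutes and the percent set to 0%, the object should stay fresh for one hour regardless of what the origin server sends, if I understand the directive correctly.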

And another thing: the clients are not web browsers. The consumers of these resources ("favicons", for the sake of simplicity) are software components using HTTP as their transport protocol.

Thanks for any advice on the subject.
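In case it helps, this is roughly how I test whether a response is served from the cache (assuming Squid listens on 127.0.0.1:3128; the URL is just a placeholder):

# dump only the response headers, going through the proxy
curl -s -o /dev/null -D - -x http://127.0.0.1:3128 http://partner.example.com/favicon.ico

The X-Cache and Age headers in the output tell me whether Squid answered from its cache and how old the cached object is.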


On 23-Sep-13 06:43, Eliezer Croitoru wrote:
You'd better leave it at the default, since most browsers will cache it
automatically.
A HIT can be one of several HIT variants, such as TCP_IMS_HIT, and not
just a TCP_HIT.
You also need to understand how Squid caches and how the overrides work.
How does it behave without the refresh_pattern?
Why would you want to force it on all sites, when many of them send
cache headers far longer than 60 minutes?

Eliezer

On 09/23/2013 06:01 AM, Ron Klein wrote:
Hi,

I'm trying to cache all favicon files, named favicon.ico, always
located in the root of the web site (when they exist, of course).
I would like to ignore any caching instruction originating from the
(real) web server's response headers.
For instance, if I get the "Last-Modified" header, I'd like to ignore it.
I want the caching policy to be purely "mine".

I use Squid 3 on Ubuntu 12.04.
I created the following instruction in the configuration file:
refresh_pattern -i ^http(s?)://.*/favicon.ico$ 60 0% 60 ignore-private override-expire override-lastmod ignore-no-store

My question:
Is this the correct instruction? I think not, since I get "HIT" response
headers even after one hour of caching.

Thanks!




