
Re: static caching for specific website for specific time


 



On 20/02/2014 1:11 a.m., Dr.x wrote:
> hi all,
> i found that sites like aaaa.com, as an example, are the most viewed
> by customers.
> the question is how do i use squid with a refresh pattern so that it
> caches all of that website for about 5 minutes, and after that goes
> and caches it again, and so on.
> 
> 
> if i do that i'm sure the byte ratio will be better

It probably won't be. Or maybe the ratio will improve, but the
customers' experience drops through the floor, so they move on elsewhere
and you have to start all over again.
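For reference, the kind of refresh_pattern being asked about would look
roughly like this. This is a sketch only, not a recommendation (the
caveats below apply); the domain and the 5-minute lifetime come from the
question, and the regex is an assumption:

```
# Sketch of what was asked for -- not a recommendation.
# min=5 and max=5 are in MINUTES: treat matching objects as fresh
# for up to 5 minutes, regardless of the percent heuristic.
refresh_pattern -i \.aaaa\.com/ 5 20% 5
```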


Begin by evaluating for each of the popular URLs:

 a) whether it's an ongoing or temporary trend.
  - optimizing closely for today's fad is often useless on tomorrow's.
This is why caches have dynamic timing heuristics and do revalidation.
*Let it*.


 b) whether or not the resource/object is cacheable at all.
  - private, no-store, and no-transform exist for good reasons and are
NOT usually used by default on web servers or frameworks. If they are
present the author is making a clear statement about traffic safety.
Overriding them is BAD. It actually wastes a lot of bandwidth and
transfer time when authors have to re-override proxies which ignore
those settings.


 c) Pay attention to the caching control headers.
  - If you see complicated settings chances are VERY high that someone
is actively tuning it. So touching it yourself without *full* knowledge
of the site and its behaviour patterns will break things.
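The sort of decision points (b) and (c) describe can be sketched in a
few lines. This is a hypothetical helper (not Squid source code) that
reads a Cache-Control header value and reports whether a shared cache
may store the response, and for how long:

```python
# Hypothetical sketch (not Squid code): decide cacheability of a
# response from its Cache-Control header, as points (b) and (c) describe.

def cacheability(cache_control: str):
    """Return (cacheable, max_age_seconds_or_None) for a shared cache."""
    directives = {}
    for part in cache_control.split(","):
        part = part.strip().lower()
        if not part:
            continue
        name, _, value = part.partition("=")
        directives[name] = value or None

    # private / no-store: the author is making a clear statement --
    # a shared cache must not store this response at all.
    if "private" in directives or "no-store" in directives:
        return (False, None)

    # s-maxage takes precedence over max-age for shared caches.
    for name in ("s-maxage", "max-age"):
        if name in directives:
            try:
                return (True, int(directives[name]))
            except (TypeError, ValueError):
                pass

    # No explicit freshness: storable, but freshness comes from
    # heuristics and revalidation -- exactly what "*Let it*" means above.
    return (True, None)

print(cacheability("private, max-age=600"))   # (False, None)
print(cacheability("public, s-maxage=300"))
print(cacheability("no-cache"))
```

Note that "no-cache" still comes back storable, which matches the PS
below: it means "revalidate before reuse", not "never store".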


 d) Whether forcing more caching for that site/service will bump
something useful out of storage.
  - see (a).



PS. remember that Squid can and *does* cache objects even when they are
marked "no-cache" or with Authenticate: headers. No need for special
overrides on those.


HTH
Amos



