
Re: How to configure squid so it serves stale web pages when Internet Down


 



On Tue, 22 Nov 2011 11:57:15 -0500, Doug Karl wrote:
We are trying to configure Squid for installation in school labs in
Belize, Central America, where the Internet routinely goes down for
several minutes and sometimes an hour at a time. We are very happy to
serve up stale pages to the children for their classroom session. So
we need to either: (1) configure Squid to handle such situations where
cached pages are simply served stale when the Internet is down (i.e.
we don't have Internet access to verify freshness), or (2) have Squid
respond to a script that detects the Internet to be down, telling it
to serve up stale pages while it is down. As configured, our Squid
implementation will not serve stale pages, because it tries to access
the original web site and the cached pages are not served up at all.

NOTE: We have tried "Squid Off-line mode" and, as several others have
reported, it does not work as you would think. So are there config
parameters that can make caching work in the presence of a bad
Internet connection?

Yes and no.

The key directive _is_ offline_mode. The confusing bit is that for situations like yours the mode must always be set to ON; don't toggle it on and off. All it does is expand the range of things Squid caches to include some which would normally be discarded immediately. That prepares the cache content as well as possible for the second directive...
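In squid.conf that is just the following (a minimal sketch; leave it permanently enabled rather than scripting it on and off):

    # Keep offline_mode permanently ON: it widens what Squid will cache,
    # stocking the cache as well as possible for outages.
    offline_mode on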

max_stale - once items are already in the cache (via offline_mode), this controls how long they may be served after the Internet connection starts failing. There are also refresh_pattern max-stale=N options in Squid 2.7 and 3.2 to provide per-URL staleness control, and HTTP responses from websites can carry max-stale controls telling your Squid it's safe to cache and serve them while stale. A rough sketch of these squid.conf lines follows below.
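The time values here are only illustrative, and the unit of the refresh_pattern max-stale= option is assumed to match the other refresh_pattern times (minutes); check your Squid version's documentation before relying on it:

    # Global limit: serve cached objects up to a week past their
    # freshness lifetime if revalidation fails while the link is down.
    max_stale 1 week

    # Per-URL variant (Squid 2.7 / 3.2): a catch-all rule allowing
    # extra staleness on everything (value assumed to be in minutes).
    refresh_pattern . 0 20% 4320 max-stale=1440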


Note that all of this is determined by the cacheability of the site's objects in the first place. If an object is not safe to cache and re-use, the page which depends on it will break in some way while offline. A lot of site webmasters do not send cache-friendly headers and so create sites which break very easily.


Amos


