
Re: Serving from the cache when the origin server crashes

On Mon, 22 Jun 2009 16:44:34 -0400, Myles Merrell <mmerrell@xxxxxxxxxxxx>
wrote:
> Is it possible to configure squid to continue serving from the cache, 
> even if the originserver has crashed?
> 
> We have squid setup using acceleration through a virtual host.  Squid 
> listens on 80, and our web server works on another server on port 81.  
> Squid serves the majority of pages through the cache, and when it has to 
> it gets them from the server.  We'd like to be able to take the server 
> down periodically, and have the squid cache continue to serve pages in 
> the cache. 
> 
> Is this reasonable? If so, is it possible?
> 

Sort of. Squid does this routinely for all objects it can cache; the state
of the backend server is irrelevant for HIT traffic.

I'm sure some of those who deal with high-uptime requirements have more to
add on this. These are just the bits I can think of immediately.

For regular usage, make sure sufficiently long expiry and max-age values are
set so objects stay cached for as long as possible. Also check that the
cache_peer monitor* settings are in use. These will greatly reduce the impact
of minor outages or load hiccups.
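As a rough sketch of the above (the hostname, health-check URL, and the
aggressive refresh_pattern values here are illustrative placeholders, not
values from the original mail):

```
# squid.conf sketch -- accelerator with a monitored origin on port 81
cache_peer backend.example.com parent 81 0 no-query originserver \
    monitorurl=http://backend.example.com:81/health monitorinterval=30

# Example only: treat objects as fresh for up to a week (10080 minutes)
# when the origin sends no explicit Expires/max-age
refresh_pattern . 10080 50% 10080
```

The monitorurl is polled every monitorinterval seconds, so a dead origin is
noticed quickly rather than on the next client MISS.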

For best effect, the monitor settings combined with several duplicate parent
peers are recommended, so that when one peer is detected as down Squid simply
sends requests to the next one. Only the requests in flight to the failed
peer will experience any error.
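Something like the following (two hypothetical identical origins; the
round-robin option is one way to spread load across duplicate parents):

```
# squid.conf sketch -- duplicate monitored origins; when one is
# marked down, requests go to the surviving peer
cache_peer backend1.example.com parent 81 0 no-query originserver round-robin \
    monitorurl=http://backend1.example.com:81/health monitorinterval=30
cache_peer backend2.example.com parent 81 0 no-query originserver round-robin \
    monitorurl=http://backend2.example.com:81/health monitorinterval=30
```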

The newer the Squid (up to the 2.HEAD snapshots), the better the tuning and
the more options available for this type of usage. Several sponsors have
spent a lot getting 2.7 and 2.HEAD acceleration features added.



For longish scheduled outages there are some other settings which can
further reduce the impact, but they take planning to use properly. When an
outage is being scheduled, ensure the max_stale config option is set to a
reasonable period, longer than the downtime you need.
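For instance, if the outage is expected to last a day or two (the exact
value here is an assumption to illustrate the directive):

```
# squid.conf -- allow stale cached objects to be served for up to a
# week past their expiry, comfortably longer than the planned downtime
max_stale 1 week
```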

Give it some time to grab as much content as possible. You may want to run
a sequence of requests for not-so-popular pages that MUST be cached for the
duration. Then set the inappropriately named offline_mode in Squid just
before dropping the back-end. These combine to make Squid cache as
aggressively as possible and not seek external sources unless absolutely
required.
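The final step might look like this (applied with squid -k reconfigure just
before the back-end goes down):

```
# squid.conf -- serve from the cache wherever possible and avoid
# contacting external sources unless absolutely required
offline_mode on
```

The warm-up requests can be issued through Squid itself, e.g. with
squidclient or any HTTP client pointed at the accelerator, so each fetched
page lands in the cache before offline_mode is switched on.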


Amos

