
RE: Fail-Over Site Hosting

Thank you, Amos. We're doing this as a one-off, but we'll reuse the
approach whenever similar maintenance requires downtime in the future.

For the stale-if-error statements, what does the actual statement look like
in practice, please?

From what I could find, there are no real-world examples online yet. Is
it as simple as adding the line "stale-if-error"?

Does one add a "> <milliseconds>" attribute at the end?

Wouldn't "offline_mode on" work just the same?

Also, what command lines are needed to "triage" the pages in need of
caching, please? We definitely need this kind of control over the
portions of our site that are to be kept available during the downtime.

Many, many thanks,
Philip 
 

-----Original Message-----
From: Amos Jeffries [mailto:squid3@xxxxxxxxxxxxx] 
Sent: Thursday, February 26, 2009 11:25 PM
To: Philip de Souza
Cc: squid-users@xxxxxxxxxxxxxxx
Subject: Re:  Fail-Over Site Hosting

Philip de Souza wrote:
> Hello,
> 
> Can Squid-Cache be used in a Windows 2003 / IIS 6 web-hosting environment
> to provide a temporary means by which the actual web server itself can be
> brought down for maintenance, so that the site appears to remain live to
> any clients wishing to access it?
> 

Possibly.

> 
> We have two hosted machines with our web host provider, but were not able
> to configure Network Load Balancing due to SSL certs being needed on some
> of the sites the second server is now hosting (thereby negating our
> ability to use the 2 NICs on each machine we are limited to).

Huh? SSL certs should not be needed for port 80 access...

> 
> The site we have running on the first server is a vital one, but the
> machine needs maintenance, so we're searching for a solution that could
> work for a period of 30-60 minutes, no more. Can Squid accommodate this,
> even for an MS platform such as ours?
> 
> Been searching all the documentation I could find and could not find
> the answer I need on this. Many thanks in advance!
> 
> 
> ~Philip 
> 

Um, is this a once-off, or a long-term backup mechanism?

Static content, or dynamic content that can be cached for the 30-60
minutes without causing trouble, is easy. Any purely dynamic content
that cannot be cached at all may cause trouble.


To do this with Squid you would need version 2.7, with a cache large
enough to hold most of the site and, at minimum, all of the in-use pages
and files. Set Squid up as a web accelerator for the machine that's going
down, then shift DNS over to it as the primary server.
   http://wiki.squid-cache.org/ConfigExamples
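A minimal sketch of that accelerator setup in squid.conf (the hostname,
origin IP, cache path, and cache size below are placeholders, not
details from this thread):

   # Listen on port 80 as an accelerator (reverse proxy) for the site
   http_port 80 accel defaultsite=www.example.com
   # Forward cache misses to the real IIS origin server
   cache_peer 192.0.2.10 parent 80 0 no-query originserver name=iis_origin
   # Only accept requests for our own site
   acl our_site dstdomain www.example.com
   http_access allow our_site
   http_access deny all
   # Disk cache big enough to hold the whole site (10 GB here)
   cache_dir ufs /var/spool/squid 10240 16 256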

Once that is done, the real prep can start to enable caching of the
site during the downtime. Set up stale-if-error on all objects from the
hosted sites so they can be served out of the cache during the downtime
even if they are too old.
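Note that stale-if-error is a Cache-Control extension (RFC 5861) sent by
the origin server on each response, not a squid.conf line. As a rough
illustration (the values are seconds, not milliseconds, and purely an
example):

   Cache-Control: public, max-age=600, stale-if-error=3600

i.e. the object is fresh for 10 minutes, but may be served up to an hour
stale if the origin cannot be reached. Squid 2.7's max_stale directive
and the max-stale=NN refresh_pattern option should be able to impose a
similar limit from the Squid side.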

Then use a spider (wget, etc.) to pull the important website(s) fully
into Squid's cache just before bringing the web server down.
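A rough sketch of that crawl (the URL is a placeholder, and this assumes
requests already reach the site through Squid):

   # Crawl the whole site through Squid so every page plus its images,
   # CSS, etc. lands in the cache; discard the local copies afterwards.
   wget --mirror --page-requisites --delete-after \
        --no-host-directories -e robots=off http://www.example.com/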

With all clients going through Squid for some period leading up to the 
outage, there should be no major disruption.

Problems you may face are:
  - not being able to cache the whole website (triage the pages down to
those needed most; see the sketch below)
  - dynamic pages not being changeable during the outage.
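For the triage case, one way (sketch only; important-urls.txt is a
hypothetical hand-picked list of absolute URLs) is to feed the spider
just the must-have pages:

   # Fetch only the listed pages, plus the images/CSS each one needs,
   # through the same Squid instance as above.
   wget --input-file=important-urls.txt --page-requisites \
        --delete-after --no-host-directories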


Amos
-- 
Please be using
   Current Stable Squid 2.7.STABLE6 or 3.0.STABLE13
   Current Beta Squid 3.1.0.5

