
Re: Non-permanent Internet Connection Question


I just need to see how I can set Squid to cache a page regardless of that
page's cache policy.  It seems certain pages are not being cached, and I
want to force their content into the cache.  Ideally, every time a page is
loaded Squid would check whether the cached copy differs from the live
version and, if so, update the cache.  That is what I am looking for.
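
For reference, a minimal squid.conf sketch along these lines (untested, and
the ignore-no-cache/ignore-private options only exist in newer Squid
releases; these overrides deliberately violate HTTP, so use them with care):

    # Treat every object as immediately stale so Squid revalidates it on
    # each request (If-Modified-Since) and refreshes the cached copy if it
    # changed; the override-*/ignore-* flags force caching of pages that
    # would otherwise be uncacheable.
    refresh_pattern . 0 0% 0 override-expire ignore-reload ignore-no-cache ignore-private

    # Only while the link is down: serve everything straight from the
    # cache without trying to validate (see the offline_mode toggle
    # sketched later in the thread).
    # offline_mode on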


Blake Grover
IT Manager
EZ-NetTools

www.eznettools.com
800-627-4780 X2003

EZ-NetTools - We make it easy!


On Sat, 2007-09-22 at 14:16 +1200, Amos Jeffries wrote:
> RW wrote:
> > On Fri, 21 Sep 2007 07:36:05 -0600
> > Blake Grover <blake@xxxxxxxxxxxxxx> wrote:
> > 
> >> We are working on a new project where we will distribute Linux
> >> machines in different areas that will be connected to the Internet,
> >> but these machines might not always have an Internet connection.  We
> >> would like these machines to show certain web pages from a web server
> >> in a loop.  For example, I have 7 pages that cycle from one to the
> >> next after 7 - 10 seconds.  But if the Internet connection goes down
> >> we want Squid to keep showing the loop of HTML pages until the
> >> connection is restored, and then Squid could update the pages in the
> >> cache.
> > 
> > You could write a script to switch squid into offline mode when the
> > connection goes down, but there will always be race condition problems
> > with this.
> > 
> > Have you considered running local webservers instead?
> > 
> 
> What I'd do is check whether the following works (note: I have not 
> tested any of this):
> 
>   - use a deny_info override for that particular ERROR_PAGE
>   - make the new error page refresh to the next slide-show page in the
>     sequence.
> 
> If that works, any pages broken during the downtime will simply be 
> skipped in favour of pages that do work.
> 
> You will most likely need a small http daemon/script to provide the new 
> deny_info page and keep track of what was meant to be next.
> 
> Amos
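
For what it's worth, here is a rough sketch of the offline-mode toggle RW
describes above, meant to run from cron; the beacon address, config path,
and the sed/reconfigure approach are all assumptions, and the race
condition RW warns about still applies:

    #!/bin/sh
    # Sketch only: flip Squid's offline_mode to match the link state.
    CONF=/etc/squid/squid.conf   # adjust to your installation
    BEACON=198.51.100.1          # any host reachable only when the link is up

    if ping -c 2 -w 5 "$BEACON" >/dev/null 2>&1; then
        WANT="offline_mode off"
    else
        WANT="offline_mode on"
    fi

    # Rewrite the directive only when the state actually changes, then
    # ask the running Squid to re-read its configuration.
    if ! grep -qx "$WANT" "$CONF"; then
        sed -i "s/^offline_mode .*/$WANT/" "$CONF"
        squid -k reconfigure
    fi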
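
And a trivial illustration of the error page Amos has in mind: the small
helper daemon would return something like this, substituting whichever
slide it decides comes next (the URL below is purely a placeholder):

    <!-- Hypothetical deny_info replacement page: instead of showing a
         Squid error, jump straight to the next slide in the loop. -->
    <html>
      <head>
        <meta http-equiv="refresh" content="0; url=http://example.com/slide3.html">
      </head>
      <body>Skipping to the next slide...</body>
    </html>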

