
Re: Non-permanent Internet Connection Question

RW wrote:
On Fri, 21 Sep 2007 07:36:05 -0600
Blake Grover <blake@xxxxxxxxxxxxxx> wrote:

We are working on a new project where we will distribute Linux
machines in different areas that will be connected to the Internet,
but these machines might not always have an Internet connection. We
would like these machines to show certain web pages from a web server
in a loop; for example, I have 7 pages that jump from one to another
after 7-10 seconds. But if the Internet connection goes down, we want
squid to keep showing the loop of HTML pages until the connection
gets restored, and then squid could update the pages in the cache.


You could write a script to switch squid into offline mode when the
connection goes down, but there will always be race condition problems
with this.
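
For illustration, a minimal sketch of such a watchdog (untested; the
conf path and probe host are placeholders, and it assumes squid.conf
already carries an offline_mode line). It leans on the stock
offline_mode directive and "squid -k reconfigure", and the race RW
mentions is still there between the probe and the real link state:

#!/usr/bin/env python3
# Untested sketch: toggle Squid's offline_mode when the uplink drops,
# then ask Squid to reload its configuration.
import re
import subprocess
import time

SQUID_CONF = "/etc/squid/squid.conf"   # assumed location
PROBE_HOST = "198.51.100.1"            # some host on the far side of the link

def link_is_up():
    # One ICMP probe with a 2-second timeout; exit code 0 means a reply.
    return subprocess.call(
        ["ping", "-c", "1", "-W", "2", PROBE_HOST],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) == 0

def set_offline_mode(offline):
    # Assumes squid.conf already contains an offline_mode line.
    want = "offline_mode on" if offline else "offline_mode off"
    with open(SQUID_CONF) as f:
        conf = f.read()
    new = re.sub(r"offline_mode\s+(on|off)", want, conf)
    if new != conf:
        with open(SQUID_CONF, "w") as f:
            f.write(new)
        subprocess.call(["squid", "-k", "reconfigure"])

while True:
    set_offline_mode(not link_is_up())
    time.sleep(30)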

Have you considered running local webservers instead?


What I'd do is check whether the following works (note: I have not tested any of this):

 - use a deny_info override for that particular ERROR_PAGE
 - make the new error page refresh to the next slide-show page in sequence.

If that works, any pages broken during the downtime will simply be skipped in favour of pages that do work.
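
In squid.conf terms, the idea might look something like this (equally
untested; the ACL name, slide host, and redirector URL are
placeholders):

# Hand requests matching the slide ACL a local redirector page
# instead of Squid's stock error template.
acl slides dstdomain slides.example.com
deny_info http://127.0.0.1:8001/next slides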

You will most likely need a small http daemon/script to provide the new deny_info page and keep track of what was meant to be next.
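
A sketch of that daemon, again untested, with the slide URLs and port
made up for the example. It remembers which slide was meant to come
next and answers every request with a page that refreshes to it, so a
broken slide is skipped rather than stalling the loop:

#!/usr/bin/env python3
# Untested sketch of the small HTTP daemon: cycles through the slide
# list and serves a one-second refresh to whichever slide is next.
from http.server import BaseHTTPRequestHandler, HTTPServer
import itertools

SLIDES = ["http://slides.example.com/page%d.html" % i for i in range(1, 8)]
next_slide = itertools.cycle(SLIDES)

class NextSlideHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Hand back a tiny page that jumps to the next slide in sequence.
        target = next(next_slide)
        body = ('<html><head>'
                '<meta http-equiv="refresh" content="1;url=%s">'
                '</head><body>Loading next slide...</body></html>' % target)
        body = body.encode("ascii")
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("127.0.0.1", 8001), NextSlideHandler).serve_forever()

Point the deny_info URL above at this daemon and each failed fetch
should come back as a refresh to the next slide in the rotation.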

Amos
