We are working on a new project where we will distribute Linux machines to different areas that will be connected to the Internet, but these machines might not always have an Internet connection. We would like these machines to show certain web pages from a web server in a loop. For example, I have 7 pages that jump from one to another after 7-10 seconds. If the Internet connection goes down, we want Squid to keep showing the loop of HTML pages until the connection is restored, and then Squid can update the pages in the cache.

I have tried to go through the documentation and get it all configured myself, but I am still having some issues. The problem is that one of the pages, which has Flash on it, always tries to fetch the latest version; the same happens for a couple of pages that have a graphic on them. If I unplug the Internet connection on this machine and let it run through the loop, it always stops on the pages it wants to fetch content for, and the browser shows "(101) Error Network is Unreachable". I had thought that using negative_ttl would stop that, but I am not sure what to do.

I have the following setup in my squid.conf file. I know I will have some things wrong, and if I could find out why it isn't caching those pages, or why it isn't using the cached copies, I would appreciate it.

########################################################################
http_port 80
cache_mem 64 MB
maximum_object_size 8182 KB
cache_dir ufs /cache 100 16 256
access_log /var/log/squid/access.log squid
hosts_file /etc/hosts
refresh_pattern . 14400 80% 43200 ignore-no-cache
negative_ttl 720 minutes # 12 hours
negative_dns_ttl 30 minute
connect_timeout 45 seconds
########################################################################

Blake Grover
IT Manager
EZ-NetTools
www.eznettools.com
800-627-4780 X2003

EZ-NetTools - We make it easy!
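P.S. From reading the documentation a bit more, my understanding is that negative_ttl only controls how long error responses are kept, which may be why it didn't help here. It looks like offline_mode and some extra refresh_pattern options might be closer to what we need. Below is a rough sketch of the lines I am guessing would be involved (the option names are taken from the Squid docs for 2.6+, and I may well be misreading how they interact) - is this the right direction?

########################################################################
# Guess: never try to revalidate cached objects against the origin
# server, so stale copies keep being served while the link is down.
offline_mode on

# Guess: treat objects as fresh for up to 12 hours and ignore reload
# requests and no-cache headers coming from the pages themselves.
refresh_pattern . 14400 80% 43200 override-expire override-lastmod ignore-reload ignore-no-cache
########################################################################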