Mike Makowski wrote:
Hello all, I'm new to squid. I currently have squid3 installed on a 64-bit Ubuntu box, a fresh install with the default package settings. I am trying to get the forward proxy cache to work for one very specific application that runs on multiple servers on my LAN. Instead of each machine pulling a 12MB file from the internet, I want squid to pull the file on behalf of the clients and then serve it to them out of cache. The file changes roughly every 15-20 minutes, and squid would need to pull a fresh copy each time and update the cache. The clients use wget to download the 12MB file via http:

  set http_proxy = '172.16.0.2:3128'    (the squid server)
  wget http://www.sortmonster.net/master/Updates/test.xyz -O test.new.gz --header=Accept-Encoding:gzip --http-user=myuserid --http-passwd=mypassword
Content protected by a login is usually marked with the header "Cache-control: private", which will prevent a shared cache from keeping a copy.
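One way to confirm that, assuming you can run wget directly on one of the clients, is to have it print the server's response headers:

  wget -S --spider --http-user=myuserid --http-passwd=mypassword http://www.sortmonster.net/master/Updates/test.xyz

If the output shows a "Cache-Control: private" (or "no-store") line, that is what is stopping a shared cache from keeping a copy.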
I have changed the ACL to accept connections from my 172 network and have also updated the http proxy settings to recognize the same addresses. I have increased the maximum cache size to 200MB and the maximum individual cache object size to 50MB. The proxy works great in that it is serving the requested content to each client, but it always pulls over the internet, never from cache. I have been told that others have successfully cached this file in squid, so I suspect there is nothing wrong on the remote end. I'm also not quite clear how squid will handle the requests if multiple clients request the file at the same time and it is not yet cached. I know I'm missing something very simple. Suggestions please.
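For reference, the changes you describe would normally look something like the following in squid.conf (the network range, cache path and sizes below are only guesses based on what you wrote, so adjust them to your setup):

  # allow the local 172.16 network to use the proxy
  acl localnet src 172.16.0.0/16
  http_access allow localnet

  # cap individual objects at 50 MB and give the disk cache 200 MB
  maximum_object_size 50 MB
  cache_dir ufs /var/spool/squid3 200 16 256

None of that by itself makes squid ignore the origin server's caching headers, though, and that is what seems to be biting you here.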
See http://www.squid-cache.org/Versions/v3/3.0/cfgman/refresh_pattern.html, in particular the keyword "ignore-private".
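A minimal sketch of how that might be used for this one URL, assuming the 15-20 minute change interval you mentioned (the regex and the timings are only examples):

  # treat the sortmonster update file as cachable despite "Cache-control: private",
  # and consider it fresh for at most 15 minutes
  refresh_pattern -i sortmonster\.net/master/Updates/ 0 20% 15 ignore-private

Put it above the default refresh_pattern lines so it matches first, then reconfigure squid and watch access.log for TCP_HIT / TCP_MEM_HIT entries to see whether the file is being served from cache.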
Thanks for any help. Mike Makowski
Chris