Al Breingan wrote:
Hi - I am a new user of squid (version squid-2.5.STABLE12 as part of an
IPCOP 1.4.10 distro). I am connected to the net via a distressingly slow
connection, and would like to keep some fast-changing files refreshed on the
proxy without manual intervention. Ideally this should all happen from the
single machine that acts as the HTTP and FTP proxy, as that is always left on.
Specifically I am trying to keep the newest versions of a set of GIF images
that show rain radar data. These images change every 10 minutes.
So ideally I am looking for a squid command that would say "retrieve this
URL every 10 minutes and store it in your cache", so that it can be
displayed without delay when a client machine requests it.
I suspect this may be a bit much to ask for (wry smile), so another
possibility would be to use an FTP client, driven by cron, to request the
files. However, I am not sure whether a request made from the proxy machine
itself would be intercepted by the proxy daemon or go directly out to the net.
This method also has the disadvantage of storing two sets of files on the
machine (one in the cache and one where the FTP client places them).
If anyone can give me any pointers about how I could best do this, I would be
grateful.
Thanks in advance.
Al Breingan
export http_proxy=http://username:password@xxxxxxxxxxxx:3128/
wget -O /dev/null --input-file=/path/to/file/of/imageURLs
That should pass these requests through the proxy. The first line sets
the http_proxy environment variable that many clients (including wget)
use; the second tells wget to grab every URL listed in the file of image
URLs and send the output straight to the bit bucket, so there is no second
copy of the files sitting outside the cache. Making sure the images stay
cached for the whole ten minutes is left as an exercise for the reader.
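
To run that every ten minutes, a crontab entry along these lines should do
the trick (only a sketch, assuming the list of image URLs lives at
/path/to/file/of/imageURLs as above):

*/10 * * * * http_proxy=http://username:password@xxxxxxxxxxxx:3128/ wget -q -O /dev/null --input-file=/path/to/file/of/imageURLs

For the caching part, one possible starting point is a refresh_pattern line
in squid.conf; again just a sketch, and whether it helps at all depends on
the Expires/Cache-Control headers the radar server sends:

# treat GIFs with no explicit expiry information as fresh for 10 minutes
refresh_pattern -i \.gif$ 10 20% 10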
Another option would be to have wget mirror the images and have Squid
redirect requests for those images to the local copies. That would,
however, require running an HTTP server.
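
A rough sketch of that approach (hypothetical paths, and the rewriting
script itself is not shown) might be:

# cron: mirror the radar GIFs into a directory served by the local web server
*/10 * * * * wget -q --mirror -P /var/www/html/radar --input-file=/path/to/file/of/imageURLs

# squid.conf: hand request URLs to a script that points radar requests at the local copies
redirect_program /usr/local/bin/radar-redirect

The redirector script would read request lines on stdin and write back
either a rewritten URL or a blank line to leave the request untouched.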
Chris