
Caching content on private servers

Hi -

I'm trying to figure out whether I'm even going down the right path
here - whether Squid is the right tool for the job.

I'm writing a PHP web app that queries data from one of our databases.
One piece of information returned by the query is the URL of an image
file, such as http://192.168.99.2/images/foo.jpg .  The images are
stored on any of three servers, at three separate physical locations.
Each location is connected back to my office via a T1.  The web
server, on the DMZ port of my firewall, is at my office.

I was thinking of putting a Squid server in the DMZ - either on its
own box, or on the box running the PHP app - to pull those images,
cache them, and serve them to clients accessing the site.  The
problem is that none of the servers hosting the images are internet
accessible, nor will they ever be.  I also want to minimize traffic
across the T1s.
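>From reading around, it sounds like what I'm describing is Squid's
"accelerator" (reverse proxy) mode.  If I understand it correctly, a
squid.conf for one of the three image servers might look roughly like
this - I'm guessing at the syntax from the docs (this assumes the
cache_peer/originserver directives of Squid 2.6 or later, and
images.example.com is a made-up name standing in for whatever
hostname the app would point image URLs at):

```
# squid.conf sketch: Squid as a web accelerator (reverse proxy)
# in front of one internal image server.  Names and addresses
# are placeholders.

# Accept browser requests on port 80 in accelerator mode
http_port 80 accel defaultsite=images.example.com

# On a cache miss, fetch from the internal image server over the T1
cache_peer 192.168.99.2 parent 80 0 no-query originserver name=imgsrv

# Only pass requests for our site to that peer; deny everything else
acl our_site dstdomain images.example.com
http_access allow our_site
cache_peer_access imgsrv allow our_site
http_access deny all
```

Presumably for three servers I'd add one cache_peer line per origin
and use cache_peer_access with matching ACLs to route requests to the
right one, with the PHP app rewriting the image URLs to point at the
Squid box instead of the internal addresses.  Does that sound right?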

Is Squid capable of doing this?  If so, can any of you provide
information on how to get started?  I've never dealt with Squid
before, and I haven't found any documentation on my own that explains
how to configure this sort of setup.  Everything I've read has been
about configuring a proxy for people browsing the web.

If there's a better way to do this, I'd be open to hearing about it too.

Many thanks!
Mike
