
Is squid-cache the right tool?


 



I have been using squid-cache at home on my firewall for years now; I
use it for the normal, standard old job of simply caching the sites we
browse.

I am writing a new kiosk-based software package that has a GUI app as
the front end for the operator, with apache serving up the pages to
the web browser clients.  What is being served up is images.  The
images that come into the GUI app are full-size images, 4 megapixels
and up.

Currently, after my software copies the full-size images onto the
computer, it creates the two different web images: one a small
thumbnail (120x180), the other a larger image for the screen
(400x600).  When you do this to 200 images at one time it takes a
while, too long in my opinion.
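
For concreteness, here is roughly what that resize step looks like as
a Python sketch using Pillow; the function, file names, and layout are
placeholders matching what I described above, not my actual code:

    import os
    from PIL import Image

    THUMB_SIZE = (120, 180)     # small thumbnail
    SCREEN_SIZE = (400, 600)    # larger screen image

    def make_web_images(src_path, out_dir):
        """Create the thumbnail and screen-size copies of one full-size image."""
        base, ext = os.path.splitext(os.path.basename(src_path))
        for suffix, size in (("_thumb", THUMB_SIZE), ("_screen", SCREEN_SIZE)):
            with Image.open(src_path) as im:
                # thumbnail() resizes in place, preserving aspect ratio
                im.thumbnail(size)
                im.save(os.path.join(out_dir, base + suffix + ext))

Doing this for 200 source images means 400 decode/resize/encode
passes up front, which is where the time goes.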

My first thought was to have the indexing page detect whether the
smaller images were there and, if not, create them page by page, then
save the smaller images so that the next visit was snappy.  Then it
dawned on me: isn't that what things like squid-cache do?  Cache these
processed files?
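
In code, that lazy approach would amount to something like this
(again just a sketch, with made-up names):

    import os
    from PIL import Image

    def derived_image(src_path, derived_path, size):
        """Return derived_path, creating the resized copy on first request."""
        if not os.path.exists(derived_path):
            with Image.open(src_path) as im:
                im.thumbnail(size)   # resize in place, keeping aspect ratio
                im.save(derived_path)
        return derived_path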

So the question is: is squid-cache (on Windows) the right tool to
cache these images?  I know that apache can be set up as a cache, but
I don't know anything about that.  Would I be better off using apache?
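
From what I have read, running squid as an accelerator (reverse
proxy) in front of apache would look something like the sketch below;
this is my guess at a Squid 2.6-or-later config, with placeholder
hostnames, ports, and paths:

    # squid.conf sketch: squid listens on port 80 and caches what
    # apache (here on port 8080) serves; names and paths are placeholders
    http_port 80 accel defaultsite=kiosk.example.com
    cache_peer 127.0.0.1 parent 8080 0 no-query originserver
    cache_dir ufs c:/squid/var/cache 1024 16 256
    # keep the generated images around for a long time
    refresh_pattern -i \.(jpg|jpeg|png|gif)$ 10080 90% 43200

(As I understand it, a proxy like this can only serve a cached copy
after the image has been generated and fetched once; it does not
avoid the first-hit resize cost.)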

The other question I have to ask, more of myself than anyone else...
Am I making this too complicated by adding a proxy alongside the web
server?

Thoughts and opinions?

