Allow specific large files by URL

Dear Group,

	I have a basic 4MB limit on cache file size. This is about right for
my needs. However, there are about 5 URLs that keep cropping up that are
downloads of larger files - normally software updates, virus definitions,
that sort of thing. I want to cache them regardless of their size, but I
don't want to cache anything else that is more than 4MB.
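
For reference, the 4MB cap is just the stock maximum_object_size directive
in my squid.conf (I believe 4MB is actually the default):

    # don't keep objects larger than 4 MB in the cache
    maximum_object_size 4 MB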

Can I set exceptions to the Maximum File Size rules for specific URLs?
Actually, specific domain paths...?
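
Something along these lines is what I am picturing. The acl lines are real
syntax as far as I know, but the last line is wishful thinking - I have not
found any form of maximum_object_size that accepts an ACL, and the domains
and paths here are just placeholders:

    # the handful of big-download sites/paths (placeholder patterns)
    acl updates dstdomain .example-updates.com
    acl updatepaths url_regex -i ^http://downloads\.example\.com/virusdefs/
    # wishful syntax - not real squid.conf as far as I can tell:
    # maximum_object_size 200 MB updates updatepaths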

My only other thought was a kind of transparent redirection (perhaps with
iptables) to some other local server, and then downloading these files in a
cron job. However, this is a little bit like re-inventing the wheel just
because you want a blue one. (Shameless "Hitchhiker's Guide to the Galaxy"
reference.)
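
For what it's worth, the redirection I had in mind was roughly this
(addresses and URLs are placeholders, completely untested):

    # on the gateway: steer web requests for the update host to a local mirror
    iptables -t nat -A PREROUTING -p tcp -d 203.0.113.10 --dport 80 \
        -j DNAT --to-destination 192.168.0.10:80

    # on the mirror box, a nightly cron entry to refresh the files
    0 3 * * * wget -q -N -P /var/www/mirror http://downloads.example.com/virusdefs/defs.dat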

Any suggestions?

My thanks in advance,


Ben Hathaway
Software Developer
http://www.spidersat.net


