squid and clamav

Hi there!

I would like to scan web traffic for viruses, filter access by IPs and URLs, and cache the whole thing.

My setup at the moment:

client ----- squid ----- dansguardian ----- squid ----- web
          (squidGuard)     (clamav)

I am not happy about using two different squids, but the problem is that DansGuardian needs a proxy to forward its requests to. With DansGuardian in between, I lose the information about the source IP, and that source IP is needed to apply client-specific access rules. Therefore the first squid (with squidGuard) is needed.
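
For reference, a rough sketch of how this chain is wired up; the ports, addresses and paths below are just placeholders, not my actual values:

    # squid.conf on the first squid (with squidGuard):
    # forward everything to DansGuardian instead of going direct
    cache_peer 127.0.0.1 parent 8080 0 no-query default
    never_direct allow all
    # squidGuard hooked in as the redirector
    redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf

    # dansguardian.conf: pass filtered requests on to the second squid
    # and add an X-Forwarded-For header carrying the original client IP
    proxyip = 127.0.0.1
    proxyport = 3129
    forwardedfor = on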

DansGuardian suggests a solution: a patch called follow_xff (xff: X-Forwarded-For) which would be applied to the second squid so that server could determine the real source IP.
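
As far as I understand it, with that feature in place the second squid would be configured roughly like this (the ACL name and the trusted address are just placeholders):

    # squid.conf on the second squid, assuming a build with follow_xff:
    # only trust X-Forwarded-For headers coming from the DansGuardian host
    acl dg_host src 127.0.0.1
    follow_x_forwarded_for allow dg_host
    follow_x_forwarded_for deny all
    # use the indirect (original client) address for ACL matching and logging
    acl_uses_indirect_client on
    log_uses_indirect_client on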

But the patches on http://devel.squid-cache.org/follow_xff/ apply only to two specific source trees. What about current builds? Is there another solution?

I like DansGuardian because it is written in C (and not in slow Perl or the like), but two squids plus DansGuardian etc. make for slow surfing!

Greets
jacusy
