Squid's url_regex is a hideously slow way of managing blackholed URLs/sites/domains. I'm not necessarily blaming the program itself; the fact is, regular expressions can be computationally expensive. SquidGuard, on the other hand, is VERY fast and works quite well. Lots of folks around here swear by DansGuardian too. I personally didn't need the added features/complications that DansGuardian provides, so I stick with squidGuard.

<Semi-off-topic>
Per a discussion with some of the admins at the North American Network Operators Group, I'm setting up a service to automate updates to a centrally managed squidGuard database. The hope is to get more users to contribute to it (making it more effective) and also to make it more portable so other services can use it. FYI, if anyone is interested in contributing, please email me off-list.
</Semi-off-topic>

Tim Rainier
Information Services, Kalsec, INC
trainier@xxxxxxxxxx

Hendrik Voigtländer <hendrik@xxxxxxxxxxxxxxxxx> wrote on 10/10/2005 04:47:37 PM:

> > I set the full debug then checked my cache log. The slowdown seems to
> > be my acl, for example:
> >
> > acl noporn1 url_regex "/usr/local/etc/squid/noporn1"
> >
> > which is a file I picked off the web that contains a list of porn sites,
> > about 44318 in total. Silly me :)
> > So that is not the way to do that; I will search out other methods.
> >
> > Cheers, Terry
>
> I am using squidGuard with no performance impact at all to achieve
> blacklisting. It comes with the distro I'm using (Debian) and I didn't
> bother to compile it by myself :-)
>
> Best regards,
>
> Hendrik Voigtländer
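
For reference, a minimal sketch of how squidGuard typically plugs into Squid looks like the following. The paths, the "porn" destination name, and the blocked-page URL are just placeholder assumptions here; adjust them for your own install:

    # squid.conf (Squid 2.x): hand each URL to squidGuard for a verdict
    redirect_program /usr/local/bin/squidGuard -c /usr/local/squidGuard/squidGuard.conf
    redirect_children 5

    # squidGuard.conf: block everything in the "porn" destination group
    dbhome /usr/local/squidGuard/db
    logdir /usr/local/squidGuard/log

    dest porn {
            domainlist porn/domains
            urllist    porn/urls
    }

    acl {
            default {
                    pass !porn all
                    redirect http://localhost/blocked.html
            }
    }

After dropping your domain and URL lists into the db directory, run "squidGuard -C all" to compile them into database files, then reload Squid. Because lookups hit a prebuilt database instead of walking 44,000 regexes per request, the per-URL cost stays essentially flat. If you'd rather stay Squid-native, a dstdomain ACL pointed at a plain domain list is also far cheaper than url_regex, though the entries have to be domains (e.g. .example.com), not regular expressions.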