Re: Bad url sites

On Mon 2009-10-12 at 23:12 -0400, Ross Kovelman wrote:
> I use a file called bad_url.squid to hold the sites I want blocked.  I
> think I have reached a limit to what it can hold: when I do a reconfigure,
> it can take a few minutes for the data to be scanned, and processing
> power gets sucked up.

What kind of ACL are you using?

I tested Squid-2 with dstdomain ACLs on the order of 100K entries some
time ago, and parsing took just a few seconds on my 4-year-old desktop.

I ran the test again on the same box, this time with a list of 2.4M
domains. Total parsing time was 55 seconds with Squid-3 and 49 seconds
with Squid-2.
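
For reference, a dstdomain ACL that loads its list from a file is
configured along these lines in squid.conf (the ACL name and path here
are illustrative; only the file name comes from your message):

    # Hypothetical example: load blocked domains from a file,
    # one domain per line (a leading dot also matches subdomains,
    # e.g. ".example.com")
    acl bad_url dstdomain "/etc/squid/bad_url.squid"
    http_access deny bad_url

With dstdomain the list is held in a lookup structure built at parse
time, so match performance stays good even with very large lists; the
reconfigure cost you see is the one-time parsing of the file.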

Regards
Henrik

