
Re: Bad url sites

> From: Henrik Nordstrom <henrik@xxxxxxxxxxxxxxxxxxx>
> Date: Tue, 13 Oct 2009 12:54:30 +0200
> To: Ross Kovelman <rkovelman@xxxxxxxxxxxxxxxx>
> Cc: "squid-users@xxxxxxxxxxxxxxx" <squid-users@xxxxxxxxxxxxxxx>
> Subject: Re:  Bad url sites
> 
> Mon 2009-10-12 at 23:12 -0400, Ross Kovelman wrote:
>> I use a file called bad_url.squid to list the sites I want blocked.  I
>> think I have reached a limit to what it can hold, because when I do a
>> reconfigure it can take a few minutes for the data to be scanned, and
>> processing power gets eaten up.
> 
> What kind of acl are you using?
> 
> I tested Squid-2 with dstdomain acls on the order of 100K entries some
> time ago, and it took just a few seconds on my 4-year-old desktop.
> 
> I ran the test again on the same box, this time with a list of 2.4M domains.
> Total parsing time was 55 seconds with Squid-3 and 49 seconds with
> Squid-2.
> 
> Regards
> Henrik
> 

This is what I have:
acl bad_url dstdomain "/usr/local/squid/etc/bad-sites.squid"
Should I use something else, or is this the best way?  I am on Squid 2.7.
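
For reference, the rest of my relevant setup looks roughly like this (the
domain entries below are only placeholders, and the exact placement of the
deny rule in my squid.conf may differ):

# /usr/local/squid/etc/bad-sites.squid -- one domain per line;
# a leading dot matches the domain and all of its subdomains
.example-blocked.com
blockedhost.example.net

# squid.conf
acl bad_url dstdomain "/usr/local/squid/etc/bad-sites.squid"
http_access deny bad_url

# pick up changes to the list without a full restart
squid -k reconfigure

As far as I understand, dstdomain is the right acl type for a large block
list; a url_regex acl over a list this size would be much slower to parse
and match.  If the list contains overlapping entries (for example both
example.com and .example.com), Squid may also warn about them while
parsing, so cleaning those out keeps the reload a bit leaner.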

Thanks


