Hi
What I am trying to do is simplify everything and remove
the external rewriters from the workflow, because they are
either old with sporadic development or wrap their own lists
into the solution.
I am also producing my own ACL lists for this project, so
third-party blacklists will not work for me.
Squid has a lot more smarts and is very active in
development, so I think it would be a more complete, robust
solution if I can get a handle on how it behaves when parsing
large ACL files.
My ACLs will be stored on a RAM-based drive, so read speed
there should not be an issue.
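(For what it's worth, by a RAM-based drive I just mean
something like a tmpfs mount; the mount point and size below
are only placeholders:

    # hypothetical tmpfs mount holding the ACL files in RAM
    mount -t tmpfs -o size=256m tmpfs /mnt/ramdisk

Squid still parses the lists into its own in-memory structures
at startup and on reconfigure, so this only speeds up reading
the files, not parsing them.)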
Looking at the config samples at squidblacklist.org, you
seem to pump massive ACL lists through the dstdomain ACL, so
maybe that is anecdotal evidence that this will work OK.
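In case it helps frame the question, this is roughly the
configuration I have in mind (the ACL name and file path are
just placeholders):

    # load a large domain list from the RAM drive; one entry
    # per line, a leading dot matches subdomains (e.g. .example.com)
    acl blocked_sites dstdomain "/mnt/ramdisk/blocked_domains.acl"
    http_access deny blocked_sites

i.e. everything handled natively in squid.conf, with no
external redirector in the path.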
On 30/09/2016 1:43:33 AM, Benjamin E. Nichols <webmaster@xxxxxxxxxxxxxxxxxx> wrote:
The other issue is that Shalla and URLBlacklist produce
garbage blacklists, and neither of them is actively
developing or improving the backend technology required to
produce high-quality blacklists.
We are the leading publisher of blacklists tailored for web
filtering purposes.
We are also the only commercial source for native Squid
ACLs. Yes, we have it.
On 9/29/2016 4:44 AM, Darren wrote:
Hi All
I have been tinkering with SquidGuard for a while,
using it to manage ACL lists, time limits, etc.
While it works OK, it's not in active development and
has its issues.
What are the limitations of pumping ACL lists
directly into Squid and letting it do all the work
internally, without running a team of SquidGuard processes?
How efficient is Squid now at parsing the text files
directly? Will I need more RAM as the list grows? Is it
slower, or are there optimizations that I can do?
Thanks all
Darren Breeze
--
Signed,
Benjamin E. Nichols
http://www.squidblacklist.org
1-405-397-1360 - Call Anytime.