Re: Large text ACL lists

Well, forgive me for bad-mouthing the developers here, but I think this is a good reason.

You see, you are going to have to eliminate all the redundant subdomains in your blacklists, because they will crash modern versions of Squid. And to do this I would recommend using an older version of Squid for your blacklist validation, because a few years ago the developers decided it was a good idea to stop throwing errors in the logs when there is a duplicate entry in the blacklists, the way Squid used to. I have no idea who is making these decisions, because clearly it would be better to have something in your error log indicating where the problem is, rather than having Squid fall over with zero indication of how or why it happened. But the latest versions of Squid will do just that: fail and give you zero indication of why, or where, the problems in your ACL lists are.
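As a sanity check before feeding a list to Squid, redundant subdomain entries can be detected with a short script like this (a minimal sketch; it assumes Squid's dstdomain semantics, where a leading dot matches a domain and all of its subdomains, and the function name is illustrative):

```python
def redundant_entries(domains):
    """Return entries already covered by a broader '.parent' entry.

    E.g. 'ads.example.com' is redundant if '.example.com' is listed,
    because the dotted form matches the domain and every subdomain.
    """
    wildcards = {d for d in domains if d.startswith(".")}
    redundant = set()
    for d in domains:
        # Walk up the label chain: a.b.example.com -> .b.example.com -> .example.com
        labels = d.lstrip(".").split(".")
        for i in range(1, len(labels)):
            parent = "." + ".".join(labels[i:])
            if parent in wildcards:
                redundant.add(d)
                break
        # A bare entry shadowed by its own dotted form is also redundant.
        if not d.startswith(".") and "." + d in wildcards:
            redundant.add(d)
    return redundant
```

Stripping the entries this reports before loading the file should avoid the duplicate-entry problem described above.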

And yes, Squid native ACL blacklisting does work. But unlike the competitors, we are also actually removing dead domains daily to minimize wasteful bulk; in other words, we are actually doing our job rather than boasting about line counts when 50% of the domains are dead and really should be removed to keep the list as efficient as possible.

We also offer the lists in various other formats to ensure maximum compatibility.
I would love to share thoughts with you regarding the matter.


On 9/29/2016 4:29 PM, Darren wrote:
Hi

What I am trying to do is simplify everything and remove the external re-writers from the workflow, because they are either old with sporadic development or wrap their own lists into the solution.

I am also producing my own ACL lists for this project so third party blacklists will not work for me. 

Squid has a lot more smarts and is very active in development, so I think it would be a more complete, robust solution if I can get a handle on how it behaves when parsing large ACL files.

My ACLs will be stored on a RAM-based drive, so speed there should not be an issue.

Looking at the config samples at squidblacklist.org, you seem to pump massive ACL lists through the dstdomain ACL, so maybe that is anecdotal evidence that this will work OK.
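For reference, the basic pattern for loading a domain list straight from a file looks like this (the path and ACL name here are illustrative, not taken from any particular sample):

```
# squid.conf sketch -- load one dstdomain entry per line from a file.
# A leading dot (e.g. .example.com) matches the domain and all subdomains.
acl blacklist dstdomain "/etc/squid/blacklist.acl"
http_access deny blacklist
```

After editing the list, running `squid -k parse` makes Squid re-read the configuration and report parse problems before you reload it in production.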

Darren B.

Sent from Mailbird

On 30/09/2016 1:43:33 AM, Benjamin E. Nichols <webmaster@xxxxxxxxxxxxxxxxxx> wrote:

The other issue is that Shalla and URLBlacklist produce garbage blacklists, and neither of them is actively developing or improving the backend technology required to produce high-quality blacklists.

We are the leading publisher of blacklists tailored for web filtering purposes.

We are also the only commercial source for Squid-native ACL lists. Yes, we have it.


On 9/29/2016 4:44 AM, Darren wrote:
Hi All

I have been tinkering with Squidguard for a while, using it to manage ACL lists and time limits etc.

While it works OK, it's not in active development and has its issues.

What are the limitations of pumping ACL lists directly into Squid and letting it do all the work internally, without running a team of squidGuards?

How efficient is Squid now at parsing the text files directly? Will I need more RAM as the list grows? Is it slower, or are there optimizations that I can do?

thanks all

Darren Breeze





Sent from Mailbird


_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users

-- 

Signed,

Benjamin E. Nichols
http://www.squidblacklist.org

1-405-397-1360 - Call Anytime.

