Re: what are the Pros and cons filtering urls using squid.conf?

On 6/9/2013 6:59 PM, Alex Rousskov wrote:
> On 06/09/2013 03:29 AM, Eliezer Croitoru wrote:
>
>> Would you prefer a filtering based on a reload or a persistent DB like
>> mongoDB or tokyo tyrant?
>
> I would prefer to improve Squid so that reconfiguration has no
> disrupting effects on traffic, eliminating the "reload is disruptive for
> Squid but not for my ICAP service" difference.
>
> There are many important differences between ACL lists, eCAP adapters,
> and ICAP services. Reconfiguration handling should not be one of them.
>
>
> Cheers,
>
> Alex.

So our aim is to improve the squid reload!
Perfect. That is exactly what I want.
The main issue is that a static squid.conf cannot meet the demand for updating the filtering DB on the fly.

If that became possible, Squid would move forward to a very good point in its development.
The same would be a good point in any software.
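For reference, Squid does already offer one way around a static ACL list: an external_acl_type helper that consults a live database per request, so the DB contents can change without a reconfigure. A minimal squid.conf sketch (the helper path, cache TTLs, and ACL names here are my own illustrative choices, not part of any existing setup):

```
# Look up each request URL in a live rating DB via an external helper,
# so DB updates take effect without reloading Squid.
# The helper reads "%URI" lines on stdin and answers OK (match) or ERR.
external_acl_type filter_db ttl=60 negative_ttl=10 children-max=10 %URI /usr/local/bin/filter_helper

acl blocked_by_db external filter_db
http_access deny blocked_by_db
```

The ttl/negative_ttl options trade lookup load against how quickly a DB change is visible; a short ttl gets closer to the on-the-fly behavior discussed above.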

Let me give you a concrete scenario:
- A filtering solution (not 100% sure it should be based on Squid)
- A human-curated filtering DB of picture domains and pages
- Very strict clients that want on-the-fly filtering (one client allows first and blocks later; the other blocks first and allows later)

In this scenario we have ratings from -128 to +128 (int32).
Light filtering would be -51, which allows first and disallows later; 0 should block first and then allow only after human or computer inspection.
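Purely as illustration, the two client policies and the thresholds above could be sketched like this. The function name, the exact meaning of the -51 and 0 thresholds, and the strict/lenient split are my assumptions about the intended design, not an existing API:

```python
# Sketch of the rating-based decision described above (assumed semantics).
# Ratings run from -128 (clearly clean) to +128 (clearly objectionable).

ALLOW_FIRST_THRESHOLD = -51  # "light" filtering: allow now, block on later review
BLOCK_FIRST_THRESHOLD = 0    # strict filtering: block until inspection clears it


def decide(rating: int, strict: bool) -> str:
    """Return 'allow' or 'block' for a URL with the given rating.

    strict=True  models the block-first client: anything rated at or
                 above 0 stays blocked until a human or computer inspects it.
    strict=False models the allow-first client: only content already
                 rated above the light threshold (-51) is blocked.
    """
    if strict:
        return "block" if rating >= BLOCK_FIRST_THRESHOLD else "allow"
    return "block" if rating > ALLOW_FIRST_THRESHOLD else "allow"
```

A DB update that raises a page's rating would flip future `decide()` calls without any reload, which is the behavior the scenario asks for.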

The above is a real-world scenario: a friend of mine designed and developed a proxy and other helpers to make the web a cleaner place for our children. I am not really a fan of "kids shouldn't see" policies myself, but I do understand why people want them and make big efforts to make them happen.

What do you think about the idea?

Eliezer
