
Re: what are the pros and cons of filtering URLs using squid.conf?

On 06/09/2013 10:06 AM, Eliezer Croitoru wrote:
On 6/9/2013 3:52 PM, Marcus Kool wrote:
I do not understand the performance figure. Can you give more details?

Best regards,
Marcus
Yes indeed.
The performance of an ICAP service is on another level from a helper's, since it has concurrency built in:
one request does not affect another;
each request is handled while others are being handled at the same time.
My small ICAP service proved it could HANDLE 8k requests per second (filtering, i.e. actually blocking, is another story) on an Intel Atom CPU, which is supposed to take a much lower load.

Do I understand it correctly that this is a "dummy" configuration replying only "204 allow"?
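
For reference, a dummy "204 allow" responder is tiny. Below is a minimal sketch in Python (a hypothetical stand-in, not Eliezer's actual service): it answers every ICAP transaction with 204 No Content, and because asyncio gives each connection its own task, many requests are served concurrently:

    import asyncio

    async def handle(reader, writer):
        # Read the ICAP request headers up to the blank line; any
        # encapsulated HTTP body is ignored in this sketch.
        while True:
            line = await reader.readline()
            if line in (b"\r\n", b"\n", b""):
                break
        # 204 No Content means "allow, unmodified" (RFC 3507).
        writer.write(b"ICAP/1.0 204 No Content\r\n"
                     b'ISTag: "dummy-1"\r\n'
                     b"Encapsulated: null-body=0\r\n"
                     b"\r\n")
        await writer.drain()
        writer.close()   # one transaction per connection, for simplicity

    async def main():
        # 1344 is the registered ICAP port.
        server = await asyncio.start_server(handle, "0.0.0.0", 1344)
        async with server:
            await server.serve_forever()

    asyncio.run(main())

A real service must also answer OPTIONS requests and honour Preview before Squid will use it; the sketch only illustrates the concurrency model under discussion.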

The above proved that the Intel Atom is a capable system, and also that Squid can handle a lot of requests per second on a very slow CPU, which suggests that a faster CPU with SMP capabilities can take a much higher load than people actually state.

ufdbGuard is a nice product, but if it reloads the database, the filter is not real-time and cannot serve a strict system that must be secured against malicious software, although it can be OK for more failure-tolerant users.
In a mental institution, for example, the basic rule must be "first block all" and only then allow, even while the DB is being reloaded.
The same applies to the army or any other highly secured institution.

What does "filter is not real time" mean to you?
What is the requirement you have in mind that the perfect URL filter should comply with?
ufdbGuard can easily be configured to "first block all".
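
For the squid.conf side of the thread subject, "first block all" is simply a deny-by-default ACL setup. A minimal sketch, with a hypothetical whitelist file:

    # Allow only the domains listed in the whitelist file; deny everything else.
    acl whitelist dstdomain "/etc/squid/whitelist.txt"
    http_access allow whitelist
    http_access deny all

Because http_access rules are evaluated first-match, the final "deny all" catches anything the whitelist does not explicitly allow.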

Do you want to block malicious software with a URL filter?
Isn't that a job for antivirus software?

As a side note: there is no perfect URL filter that blocks everything you want it to block, since there is always a race between new filter-circumvention techniques and their countermeasures. Even with a first-block-all policy that allows only whitelisted URLs, one cannot safely assume that a malicious user has no access to web sites that you want blocked.

When talking about kids, you don't want them to see pictures of naked girls, even by accident.

When you can handle concurrent requests, you don't need to think about the load per second but about the concurrent load on the service.
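
To make "concurrent" concrete for the helper case: when squid.conf declares a helper with concurrency=N (for example, a hypothetical external_acl_type filter concurrency=100 %URI /usr/local/bin/filter-helper), Squid prefixes every request line with a channel ID and accepts the replies in any order, so one slow lookup does not have to stall the other channels. A minimal sketch of such a helper in Python (the blocked list is made up for illustration):

    #!/usr/bin/env python3
    import sys

    BLOCKED = ("badsite.example",)   # hypothetical blacklist

    # Each request line is "<channel-ID> <URL>"; the reply must carry the
    # same channel ID, which is what lets a helper answer out of order.
    for line in sys.stdin:
        parts = line.split(None, 1)
        if len(parts) < 2:
            continue
        chan, url = parts[0], parts[1].strip()
        verdict = "ERR" if any(b in url for b in BLOCKED) else "OK"
        sys.stdout.write("%s %s\n" % (chan, verdict))
        sys.stdout.flush()           # Squid reads replies line by line

This sketch still answers in arrival order; the channel IDs are what would allow a threaded or async variant to reply out of order, which is the concurrency property being claimed for the ICAP service.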

Please explain "concurrent load on the service"?

Marcus

I will write more later.
Ask me anything.

Eliezer





