Re: what are the Pros and cons filtering urls using squid.conf?


On 6/9/2013 3:52 PM, Marcus Kool wrote:
> I do not understand the performance figure. Can you give more details?
>
> Best regards,
> Marcus
Yes indeed.
The performance of an ICAP service is on another level compared to a helper, since it has concurrency built in:
one request does not affect another, and
each request is handled while others are being handled at the same time.
My small ICAP service has proven to HANDLE 8k requests per second (handled, that is; whether each one gets blocked by the filter is another story) on an Intel Atom CPU, which is supposed to take a much lower load.

The above shows that the Intel Atom is a capable platform, and also that Squid can handle a lot of requests per second even on a very slow CPU, which suggests that a faster CPU with SMP capabilities can take a much higher load than people usually claim.
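For reference, wiring such an ICAP service into Squid takes only a few lines of squid.conf. This is just a sketch; the service name, port and ICAP URI (icap://127.0.0.1:1344/url_check) are placeholders for whatever the filtering service actually exposes:

  # enable ICAP and register a request-modification filtering service
  icap_enable on
  icap_service url_filter reqmod_precache icap://127.0.0.1:1344/url_check
  # send every request through the filter
  adaptation_access url_filter allow all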

ufdbGuard is a nice product, but if it reloads its database it means the filter is not real-time, so it cannot serve a strict system that must be secured against malicious software, although it can be OK for more failure-tolerant users. The basic rule for a mental institution, for example, or for the army or any other highly secured institution, must be "first block everything" and only then allow, even while the DB is being reloaded.
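In squid.conf terms, that "first block everything" behaviour maps to running the ICAP service fail-closed, so that while the filter is down or reloading, requests are denied rather than let through unfiltered. A hedged example, reusing the hypothetical service above:

  # bypass=off: if the ICAP service fails, Squid returns an error to the client
  # instead of silently skipping the filter
  icap_service url_filter reqmod_precache bypass=off icap://127.0.0.1:1344/url_check

With bypass=on you get the opposite, more failure-tolerant behaviour mentioned above.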

When talking about kids, you don't want them to see pictures of naked girls even by accident.

When you can handle concurrent requests, you don't need to think about the load per second but about the concurrent load on the service.
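For comparison, a classic url_rewrite helper is serialized per child process unless it implements Squid's helper concurrency protocol; the number of children and the concurrency channels are what bound its concurrent load. A rough sketch, with a hypothetical helper path:

  # hypothetical helper that speaks the concurrency protocol (channel-ID prefixed lines)
  url_rewrite_program /usr/local/bin/filter-helper
  url_rewrite_children 20 startup=5 idle=1 concurrency=10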

I will write more, but later.
Ask me anything.

Eliezer



