Re: Squid Processes

On 22.02.2012 03:15, Steve Tatlow wrote:
Hi,

We are running squid as a transparent proxy, with dansguardian doing the
content filtering. All traffic will be coming from localhost and no
authentication is required. Can someone tell me how to ensure there are
enough Squid processes to support a large number of users (maybe 250
concurrent users)?

None of us can tell you specific numbers. It is dependent on your hardware and client traffic.

The thing to be aware of is that measuring in users is meaningless. One user can flood the proxy, or some thousands could leave it idle waiting for more work. Capacities are reliably measured only in requests per second.


To get the details you seek, measure: get some idea of how many requests per second those users make at peak times, and how many the whole setup is capable of handling. Each Squid series has a theoretical limit which is hardware dependent (3.1 can do about 800 req/sec on a dual-core 2.2GHz CPU, for example). Your specific configuration and the type of requests the clients make will reduce the capacity from that limit.
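As a hypothetical sketch of that measurement: Squid's native access.log puts a UNIX timestamp in the first field, so awk can bucket requests by second and report the busiest one. The sample log lines below are made up for illustration; point the awk command at your real /var/log/squid/access.log instead.

```shell
# Made-up sample of Squid native access.log lines (first field: epoch.millis).
cat > /tmp/sample_access.log <<'EOF'
1329858900.101     23 127.0.0.1 TCP_MISS/200 4512 GET http://example.com/a - DIRECT/93.184.216.34 text/html
1329858900.455     12 127.0.0.1 TCP_HIT/200 1024 GET http://example.com/b - NONE/- text/css
1329858901.002     30 127.0.0.1 TCP_MISS/200 2048 GET http://example.com/c - DIRECT/93.184.216.34 image/png
EOF

# Bucket requests by whole second, then print the busiest second's count.
awk '{ count[int($1)]++ }
     END { for (s in count) if (count[s] > max) max = count[s];
           print "peak req/sec:", max }' /tmp/sample_access.log
```

Run against a full day's log at peak times, this gives the req/sec figure to compare against the capacity numbers above.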

With content filtering you can usually expect to reach only about 30% of Squid's regular throughput due to the content-processing overheads (so a box capable of 800 req/sec unfiltered might sustain only around 240 req/sec).

250 users is not large for Squid. Any of the production releases should be able to handle that many without causing much of a CPU bump on modern hardware. I think you can start with one Squid process and expand to more if you find it stressing the machines. More likely you will need more DansGuardian proxy processes, though; that is where the heavy CPU consumption will occur.
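If you do later need more than one Squid process, note that built-in SMP support via the `workers` directive arrived in Squid 3.2; on 3.1 and earlier the usual approach is running separate instances on different ports. A minimal squid.conf sketch for 3.2+:

```
# squid.conf sketch (Squid 3.2 or later only; 3.1 has no SMP workers)
workers 2
```

Each worker is a separate process sharing the listening ports, so this scales across CPU cores without extra instance management.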

Amos


