Squid sizing for URL filtering and lots of users

Hi all.

Maybe this is the usual already-asked question about Squid sizing. I had a look
at the FAQ and searched the list archives (and Google), but I did not find a
satisfying answer.
I have some experience as a Linux admin, with a few Squid installations, but
only for small sites. Now I have been asked to propose a Squid-based solution
for URL filtering of the web traffic of 12,000 users. The Internet connection
is 54 Mbit/s. Unfortunately, at the moment I have not been given the number of
HTTP requests per second, but I assume that web surfing is not the main
business of these users, so they are not going to use all of that bandwidth
for HTTP.
I was asked whether all of this can be done with a cluster of just 2 machines
for availability (which would be appreciated, since the main point seems to be
URL filtering, not necessarily saving bandwidth), or whether it is mandatory
to implement a cache hierarchy.
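
To be clear about what I mean by URL filtering, I am thinking of something
along these lines in squid.conf on each of the two machines; the blocklist
file names and the local network range are just placeholders I made up:

    # Sketch of URL filtering with destination ACLs; file paths are examples only.
    acl localnet src 10.0.0.0/8
    acl blocked_domains dstdomain "/etc/squid/blocked_domains.txt"
    acl blocked_urls url_regex -i "/etc/squid/blocked_urls.txt"

    http_access deny blocked_domains
    http_access deny blocked_urls
    http_access allow localnet
    http_access deny all

An external redirector such as squidGuard would be an alternative to plain
ACLs, but the sizing question stays the same either way.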
I have thought about some scenarios. In the worst one I assumed I would need
400 GB of storage for the cache and about 10 GB of RAM. I would like to know
whether it is possible (and safe) to run such a Squid machine. In particular,
I wonder if I am going to run out of file descriptors or available TCP ports,
or whether there are other constraints I should think about, or if I should
instead consider splitting the load across a set of different machines.
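
To make that worst case concrete, this is roughly what I have in mind per box.
The paths and sizes below are just my assumptions, and, as far as I understand
from the FAQ, the in-memory index needs on the order of 10 MB of RAM per GB of
cache_dir on top of cache_mem, which is where my 10 GB estimate comes from:

    # Rough worst-case sizing sketch for squid.conf; sizes and paths are assumptions.
    cache_mem 2048 MB                      # memory cache only; index RAM comes on top of this
    cache_dir aufs /cache1 200000 64 256   # ~200 GB spool, split over two disks
    cache_dir aufs /cache2 200000 64 256
    maximum_object_size 50 MB

    # File descriptors would be raised at the OS level before starting Squid,
    # e.g. "ulimit -n 16384" in the init script, plus whatever limit Squid was built with.

I picked aufs rather than plain ufs since asynchronous disk I/O seems advisable
at this size, but that choice is part of what I am asking about.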

Thank you

Luigi

