
Re: Squid on steroids


 



If you're not caching at all and using reasonably modern hardware (e.g., dual-core, ~3 GHz), you should be able to get somewhere between 2,000 and 4,000 requests a second out of a single Squid process, depending on the average response size. YMMV, of course, and that doesn't count the overhead of the filtering, etc.
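A quick back-of-envelope along those lines; the per-user request rate and the over-provisioning factor below are placeholder assumptions, not measurements, so plug in your own numbers:

```python
import math

def proxies_needed(concurrent_users, req_per_user_per_sec,
                   req_per_sec_per_proxy, headroom=2.0):
    """Estimate how many Squid processes cover peak load.

    headroom is an over-provisioning factor (2.0 = double the
    estimated peak), since under-provisioning hurts more.
    """
    peak_rps = concurrent_users * req_per_user_per_sec
    return math.ceil(peak_rps * headroom / req_per_sec_per_proxy)

# e.g., 50,000 concurrent users at a guessed 0.1 req/s each,
# 2,000 req/s per process at the low end of the range above:
print(proxies_needed(50_000, 0.1, 2_000))  # 5
```

That's per-process; with filtering overhead in the mix, benchmark one box with the real filter chain before settling on a count.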

By 50,000 users, do you mean total (i.e., you have 50,000 customers), or 50,000 a day, or 50,000 concurrently, or...? Figuring out how much capacity you need is an inexact science, of course, but it's usually best to over-provision.

The hard part is going to be directing requests to the proxies, and handling failure well. I haven't done ISP proxy deployments in a long time, so I'll leave it to others to give you advice on that part. I'm assuming you'll want it to be transparent (e.g., use WCCP)?
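If you do go the WCCP route, the Squid side is only a few directives. A minimal sketch (directive names are from Squid 2.6+; the router address is an example):

```
# squid.conf fragment for transparent interception via WCCPv2
http_port 3128 transparent
wccp2_router 192.0.2.1        # your WCCP-capable router's IP
wccp2_forwarding_method 1     # 1 = GRE encapsulation
wccp2_return_method 1
wccp2_service standard 0      # standard service group 0 = HTTP
```

You'd also need a GRE tunnel interface and a firewall REDIRECT rule on the Squid host to deliver the intercepted packets to port 3128; the details depend on your OS and router.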




On 18/06/2008, at 9:05 AM, ffredrixson@xxxxxxxxxxx wrote:

More broadband connections than anything else.

Possibly as many as 50,000 users.

No accelerator, maybe not even caching. Mostly to filter downloads, record websites, etc., maybe with something like urldb or DansGuardian.

Do you have ideas???

Thank you.


-------------- Original message ----------------------
From: Mark Nottingham <mnot@xxxxxxxxxxxxx>
What's your workload? E.g., is it going to be used as a proxy farm for
dialup users? Broadband? If so, how many? Or, is it for an
accelerator, and if so, how much content is there?

Cheers,


On 18/06/2008, at 5:07 AM, ffredrixson@xxxxxxxxxxx wrote:

I've been given a directive to build a squid farm on steroids.

Load balanced, multiple servers, etc.

I've been googling around and found some documentation, but does
anyone have any direct experience with this?

Any suggestions?

Thank you in advance.

--
Mark Nottingham       mnot@xxxxxxxxxxxxx






--
Mark Nottingham       mnot@xxxxxxxxxxxxx


