
Re: Distributed High Performance Squid


 



Squid doesn't share memory or disk cache at the moment, and it won't
share/slice file descriptors the way you want it to.

I could probably write a "unified" logging hack so multiple Squid
processes log to the same file via a single helper that handles
multiple pipes, one from each Squid. There's no atomic "append a line"
IO method in UNIX, so having each Squid append to the shared file
directly won't work.

You could try hacking things up to lock/unlock the file around each
logfile write, but I have no idea what the impact would be.

Adrian


2009/8/20 Joel Ebrahimi <jebrahimi@xxxxxxxxx>:
> Hi,
>
> I'm trying to build a high-performance Squid. The performance actually
> comes from the hardware, without changes to the code base. I'm a
> beginning user of Squid, so I figured I would ask the list for the
> best/different ways of setting up this configuration.
>
> The architecture is set up like this: there are 12 CPU cores that each
> run an instance of Squid. Each of these 12 cores has access to the same
> disk space but not the same memory, each runs its own instance of an
> OS, and they can communicate over an internal network. A network
> processor slices up sessions and can hand them off to whichever of the
> 12 cores is available. There is a single conf file and a single
> logging directory.
>
> The current problem I can see with this setup is that each of the 12
> instances of Squid acts individually, so any one of them could try to
> access the same log file at the same time. I'm not sure what impact
> this could have in terms of overwriting data.
>
> I actually have it set up this way now and it works well, though it's
> a very small test environment, and I'm concerned issues may only pop
> up in larger environments where the logs are accessed very frequently.
>
> I was looking through some online materials and saw that there are
> other mechanisms for log output. The ones I thought might be of use
> here are either the daemon or udp modules. There is actually a 13th
> core in the system that is used for management. I was wondering
> whether setting up udp logging on this 13th core and having the 12
> instances of Squid send their log info over the internal network
> would work.
>
> Thoughts or better ideas? Problems with either of these scenarios?
>
>
> Thanks in advance,
>
> // Joel
>
>
> Joel Ebrahimi
> Solutions Engineer
> Bivio Networks
> 925.924.8681
> jebrahimi@xxxxxxxxx
>
>
