
Squid Cluster

Hi,

I am in the process of setting up a two-node cluster. Since I am
setting up Squid on the private side, I am looking into how data is
stored. Basically, I am trying to determine what data can be shared
between instances of Squid. Say a host opens a connection to a remote
location, Squid collects all kinds of state information, and then that
Squid instance goes down: is that information preserved? Can a
download that is broken halfway (because the Squid handling it went
down) continue on the other node (which has the same IP)? Are there
any specific things I need to look into?

I intend to set up a cluster consisting of the following:
GlassFish
PostgreSQL
Nagios
Postfix
Squid
Heartbeat
Subversion
Named
DHCPd
TFTPd
Apache HTTPd
DRBD (dual primary with either GFS2 or OCFS2)
ldirectord or keepalived (all traffic is balanced between the two
real servers and both nodes should be active; a rough keepalived
sketch is below)
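
For the shared service IP I am thinking of something along these
lines; the interface name, router id and address below are just
placeholders for my setup:

    # /etc/keepalived/keepalived.conf on node A (sketch)
    # node B would use state BACKUP and a lower priority, so the
    # virtual IP moves over when node A goes down
    vrrp_instance SQUID_VIP {
        state MASTER
        interface eth0
        virtual_router_id 51
        priority 100
        advert_int 1
        virtual_ipaddress {
            # the address the clients point their proxy settings at
            192.168.0.10
        }
    }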

It will probably run on either Gentoo or CentOS x64.

What are the important things with regard to Squid that I need to
pay special attention to?
Things I can imagine:
IP address sharing
Storage (cache) sharing (see the sketch after this list)
Synchronization of configuration
Sharing of logon data between the Squid instances (I intend to use
form-based authentication for Squid)
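
For the cache sharing part: from what I have read, two Squid
instances cannot safely share a single cache_dir (each instance needs
exclusive access to its own cache index), so I am currently thinking
of letting each node keep its own cache and having them query each
other as siblings over ICP, roughly like this (the hostname and
network are just placeholders):

    # squid.conf fragment on node A; node B would mirror this,
    # pointing its cache_peer line at node A
    acl localnet src 192.168.0.0/24

    # each node keeps its own on-disk cache
    cache_dir ufs /var/spool/squid 10240 16 256

    # ask the other node for a hit before going to the origin server
    icp_port 3130
    icp_access allow localnet
    cache_peer nodeb.example.local sibling 3128 3130 proxy-only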

I am aware that the majority of this setup is in no way relevant to
Squid, but it may still affect it (I cannot yet determine that). I am
especially interested in anything that affects availability,
performance and load balancing.

Any help is greatly appreciated!!

Regards,

Serge Fonville
