Re: Peering caches (squid and 3rd parties) - How to

Hey Amos,

I am unsure about one thing.
In the case of a CARP array, the relevant documents are:
- http://etutorials.org/Server+Administration/Squid.+The+definitive+guide/Chapter+10.+Talking+to+Other+Squids/10.9+Cache+Array+Routing+Protocol/
- http://docs.huihoo.com/gnu_linux/squid/html/x2398.html
- http://wiki.squid-cache.org/Features/LoadBalance#CARP_:_Cache_Array_Routing_Protocol

His case is dynamic URLs that point to the same content.
Say there are 10 URLs for the same YouTube video: they are not guaranteed to hash to the same cache.
This is where ICP or HTCP comes in handy.
We don't need to know the request hash in order to get a cached object; instead, the whole array can rely on each other's ICP capabilities.
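As a minimal sketch of what such sibling ICP relations could look like in squid.conf (the hostnames and ports here are hypothetical, not from the actual setup):

```
# Hypothetical fragment for one array member: before fetching from
# the origin, ask the other members via ICP whether they have the
# object cached. "proxy-only" means hits fetched from a sibling are
# not re-cached locally, so each object stays on one member.
icp_port 3130
cache_peer cache2.example.local sibling 3128 3130 proxy-only
cache_peer cache3.example.local sibling 3128 3130 proxy-only
```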
What do you think?

Eliezer

On 6/12/2013 9:33 AM, Amos Jeffries wrote:
The simplest thing you can do with your existing proxies is to set them
up into a CARP installation.

With the CARP design you have two layers of proxies:
- layer 1 is the Squid acting as gateways between clients and wherever
the data comes from.
- layer 2 is the HAARPCACHE proxies acting as caches for that specific
content.


To change your current configuration into a CARP system, all you need to
do is:

1) make all HAARP proxies listen on an IP:port which is accessible from
any of the Squid instances.

2) add a cache_peer line to each squid.conf pointing at each HAARP proxy.
+ Use the "carp" option on every one of these cache_peer lines.
+ Use the same cache_peer_access ACL setup that you have now, but for
every one of those new cache_peer lines as well.
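For example, step 2 on each Squid might look something like this (the hostnames, ports, and the ACL name are placeholders, not taken from the original setup):

```
# Hypothetical squid.conf fragment: one cache_peer line per HAARP
# proxy, all marked with the "carp" load-balancing option.
# "no-query" disables ICP to these parents; CARP hashing decides.
cache_peer haarp1.example.local parent 8080 0 carp no-query
cache_peer haarp2.example.local parent 8080 0 carp no-query

# Reuse the existing ACL setup so only the HAARP URLs go to these
# peers (the ACL below is illustrative only).
acl haarp_urls url_regex -i youtube
cache_peer_access haarp1.example.local allow haarp_urls
cache_peer_access haarp1.example.local deny all
cache_peer_access haarp2.example.local allow haarp_urls
cache_peer_access haarp2.example.local deny all
```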

3) ensure each of your Squid always has identical cache_peer settings.
- you can do that by writing all the HAARP-related settings into a
separate file which is mirrored between the Squid, and using the
squid.conf include directive to load it.
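One way to sketch step 3, assuming the shared peer settings are mirrored to a file at a path of your choosing (the path below is just an example):

```
# In every squid.conf, load the mirrored CARP peer definitions so
# all frontends hash against an identical peer list.
include /etc/squid/carp-peers.conf
```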

After this, no matter which Squid receives the client request, it will
hash the URL and forward the request to the HAARP proxy most likely to
have cached it already.

Done.


Extra notes:

* in a textbook CARP array all requests go to the parent caches - this
is optional. In your case, only the HAARP URLs will go there.

* in a textbook CARP array the frontend does not cache - this is optional.

Amos




