Definitive way to aggregate bandwidth using multiple links

Linux Advanced Routing and Traffic Control

	I have always used multiple links from different ISPs, and in my
opinion the best way to really aggregate bandwidth is to use some kind
of proxy which the client connects to and which distributes multiple
connections across the links.
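
	To make the trick concrete, here is a minimal sketch of the
per-link part (my own illustration, not code from Netsplitter or
eqlplus): each uplink gets its own local address, and binding the
outgoing socket to that address picks the link. The addresses are
placeholders, and it assumes source-based policy routing ("ip rule
... lookup ...") is already set up so traffic from each address really
leaves through its own ISP:

# Minimal sketch: pin an outgoing TCP connection to one uplink by
# binding to that link's local address before connecting.  Addresses
# are made up; source-based policy routing is assumed to be in place.
import socket

LINK_A = "192.0.2.10"     # hypothetical local address on ISP A's interface
LINK_B = "198.51.100.10"  # hypothetical local address on ISP B's interface

def connect_via(local_addr, host, port):
    """Open a TCP connection that should leave through the link owning local_addr."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind((local_addr, 0))   # source address selects the uplink (ephemeral port)
    s.connect((host, port))
    return s

# Two connections to the same server, one per link: together they can
# use the combined bandwidth even though a single flow cannot.
c1 = connect_via(LINK_A, "example.com", 80)
c2 = connect_via(LINK_B, "example.com", 80)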

	Years ago, a friend of mine wrote Netsplitter:

http://www.hostname.org/netsplitter/

	but it's outdated and abandoned (the last version is from 2002).
It was mainly written for FreeBSD, but it could run on Linux too.

	Another project that was supposed to aggregate bandwidth was
eqlplus, which is outdated too:

http://www.technetra.com/solutions/eqlplus/

	Main Netsplitter advantages over eqlplus:

1) it doesn't require kernel patches; it runs completely in user space
2) it isn't restricted to serial lines (SLIP, uncompressed PPP).
Finally we can use our Ethernet links :)
3) simpler configuration

	Anyway, I'd like to ask if somebody knows about some other
project similar to these. With Netsplitter everything was so simple: I
redirected the connections to the netsplitter daemon, which acted like
a proxy; it opened multiple connections to an ftp/http/whatever server
and distributed them over the links... very nice. This way we don't
have to mess with the kernel. The method is elegant and transparent.
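
	Here is a rough sketch of that whole proxy idea, again just my
own illustration and not netsplitter's actual code: a small daemon
that accepts the redirected client connections and gives each one an
outgoing socket bound to the next link's local address, round-robin.
The listen port, target server and addresses are made up, and the same
policy-routing assumption as in the previous snippet applies:

# Rough sketch of a user-space link-spreading proxy (illustration only).
# Each accepted client connection is paired with an outgoing socket bound
# to the next uplink address in round-robin order, then bytes are relayed
# both ways.  All addresses and ports below are hypothetical.
import itertools
import socket
import threading

LINK_ADDRS = ["192.0.2.10", "198.51.100.10"]  # one local address per uplink
next_link = itertools.cycle(LINK_ADDRS)
LISTEN = ("127.0.0.1", 3128)                  # where client connections are redirected
TARGET = ("example.com", 80)                  # fixed upstream, for simplicity

def pump(src, dst):
    """Copy bytes one way; half-close the other side when the source is done."""
    try:
        while True:
            data = src.recv(65536)
            if not data:
                break
            dst.sendall(data)
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

def handle(client):
    out = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    out.bind((next(next_link), 0))   # this connection goes out the next uplink
    out.connect(TARGET)
    threading.Thread(target=pump, args=(client, out), daemon=True).start()
    threading.Thread(target=pump, args=(out, client), daemon=True).start()

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(LISTEN)
srv.listen(64)
while True:
    conn, _ = srv.accept()
    handle(conn)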

	Thanks!

-- 
Linux 2.6.22: Holy Dancing Manatees, Batman!
http://www.lastfm.pt/user/danielfraga
http://u-br.net
Alphaville - "Big in Japan" (First Harvest 1984-92)
