Re: Definitive way to aggregate bandwidth using multiple links

Linux Advanced Routing and Traffic Control


 



On Thu, 26 Jul 2007 09:30:51 -0500
Grant Taylor <gtaylor@xxxxxxxxxxxxxxxxx> wrote:

> The only thing that comes to mind that would facilitate true
> aggregation of multiple links would be to have a server on very high
> bandwidth that you could create multiple tunnels (IPIP / IPSec / GRE)
> to and have it aggregate the multiple tunnels together and then use
> the aggregated tunnel as your larger pipe to the world and do all
> your NATing at that end so the world would see you from one largish
> connection.

	But this is if you want to aggregate bandwidth towards the
world. Yes, in theory it should work. But I want to aggregate
bandwidth just for me (although it would be nice if the world could
see one huge link).
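
	For reference, here is roughly what that tunnel setup could
look like with iproute2 (the addresses, interface names and the
aggregation server are all invented; the remote end would need the
mirror-image tunnels plus SNAT so the world only sees its address):

  # GRE tunnel over the first ISP link (example addresses only)
  ip tunnel add tun1 mode gre local 192.0.2.10 remote 198.51.100.1 ttl 255
  ip link set tun1 up
  ip addr add 10.0.1.2/30 dev tun1

  # same again over the second link, e.g. local 203.0.113.10 -> tun2
  # with 10.0.2.2/30

  # multipath default route: connections get spread over both tunnels
  ip route replace default scope global \
          nexthop via 10.0.1.1 dev tun1 weight 1 \
          nexthop via 10.0.2.1 dev tun2 weight 1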

	For external access, it would be nice (and simpler) if we had a
universal protocol which just told the client "you can access this
server through IPs x, y and z" and the client would then open
connections to x, y and z. There are programs that do this, but you
have to tell them which URLs to connect to yourself:

http://aria2.sourceforge.net/

"aria2 has  segmented downloading engine in its core. It can download
one file from multiple URLs"
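
	For example (mirror URLs and file name invented), something
like this should split one download across two servers, so with
per-connection load balancing the pieces can come in over different
uplinks:

  aria2c -s 4 \
    http://mirror1.example.org/pub/file.iso \
    http://mirror2.example.org/pub/file.iso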

http://wilmer.gaast.net/main.php/axel.html

	This supports multiple servers too and can download from a list
of mirrors...
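
	Something along these lines (again, invented mirrors; -n sets
the number of connections):

  axel -n 4 \
    http://mirror1.example.org/pub/file.iso \
    http://mirror2.example.org/pub/file.iso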

-- 
Linux 2.6.22: Holy Dancing Manatees, Batman!
http://www.lastfm.pt/user/danielfraga
http://u-br.net

_______________________________________________
LARTC mailing list
LARTC@xxxxxxxxxxxxxxx
http://mailman.ds9a.nl/cgi-bin/mailman/listinfo/lartc
