Patrick O'Callaghan wrote:
On Mon, 2008-06-16 at 08:08 +0800, John Summerfield wrote:
I don't think either downloads in parallel, and if your internet is
running at its rated speed, that is likely the bottleneck, so running
two, three or more downloads in parallel will serve only to choke
yourself. And waste server resources.
apt-get does run several downloads in parallel. This makes sense when
some servers can only serve data at a rate lower than the connection
bandwidth, which does happen particularly with high-traffic sites.
I use apt-get regularly on some Debian systems, and the
progress meter doesn't reflect parallel downloads.
If a remote (free!) server is already overloaded, adding to its stress
doesn't seem very sensible. It doesn't take a very large increase in
requests for a service to go from "very busy but coping" to "thrashing."
Just take a look at supermarket queues and think how well they are run,
and how they might be run better (from the customer's POV). I've rarely
seen an idle checkout operator.
If an operator can (on average) serve one customer per minute (and the
times don't vary much), and customers arrive at one per minute, there
won't be much of a queue. However, if customers arrive every 55 seconds,
it won't take long for the queue to go out the door, so to speak.
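A rough back-of-envelope sketch of that queue, in Python (deterministic arrivals and service, ignoring the variability a real queue would have; the 55 s and 60 s figures are just the example numbers above):

```python
# Deterministic queue model: one customer served every 60 s,
# one arriving every 55 s. The backlog grows without bound.
def queue_length_after(minutes, arrival_interval=55, service_time=60):
    """Approximate number of customers waiting after `minutes` minutes."""
    seconds = minutes * 60
    arrivals = seconds // arrival_interval + 1  # first customer at t = 0
    served = seconds // service_time
    return max(0, arrivals - served)

print(queue_length_after(60))   # after one hour
print(queue_length_after(480))  # after an eight-hour day
```

After an hour the queue holds about 6 people; after a full day, over 40 — a 9% excess in arrival rate is enough to send the queue "out the door".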
If the bottleneck isn't the server, but the network, then IP is designed
to discard packets when overloaded, and TCP manages this by detecting
discarded packets and requesting they be resent. An IP network is quite
resilient, but it can be flooded, and that is what parallel
downloads can do.
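A toy model of that retransmission behaviour (not real TCP, just a sketch of the resend-until-acknowledged idea; the loss rate and packet count are made-up illustrative numbers):

```python
import random

# Each packet may be dropped by the congested network; the sender
# keeps retransmitting until the packet gets through. Delivery is
# reliable, but loss inflates the total number of transmissions.
def deliver(packets, loss_rate, rng):
    """Return total transmissions needed to deliver all packets."""
    sends = 0
    for _ in range(packets):
        while True:
            sends += 1
            if rng.random() >= loss_rate:  # packet survived the network
                break                       # treat as acknowledged
    return sends

rng = random.Random(42)
total = deliver(1000, 0.2, rng)
# With 20% loss, roughly 1000 / (1 - 0.2) = 1250 sends are expected:
# every dropped packet is extra load on the very network that dropped it.
print(total)
```

Which is the point: the flooding doesn't lose data, it just multiplies the traffic.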
--
Cheers
John
-- spambait
1aaaaaaa@xxxxxxxxxxxxxxxx Z1aaaaaaa@xxxxxxxxxxxxxxxx
-- Advice
http://webfoot.com/advice/email.top.php
http://www.catb.org/~esr/faqs/smart-questions.html
http://support.microsoft.com/kb/555375
You cannot reply off-list:-)
--
fedora-test-list mailing list
fedora-test-list@xxxxxxxxxx
To unsubscribe:
https://www.redhat.com/mailman/listinfo/fedora-test-list