Re: BitTorrent (Was: Re: [Isms] ISMS charter broken- onus should be on WG to fix it)

Paul Hoffman wrote:
At 1:50 PM -0700 9/15/05, Michael Thomas wrote:
Which is pretty much the elephant in the room, I'd say. How
much of the net traffic these days is, essentially, not in
any way standardized, and in fact probably considers ietf
old and in the way?


Not sure why this is an elephant; who cares?

  I'm not sure; maybe it's really a mutual non-admiration
  society, and everybody's happy? But it's an elephant
  insofar as it's pretty darn big traffic-wise, and yet
  the ietf doesn't seem concerned.

I have seen numbers that show that a huge percentage of traffic is P2P of various flavors, but I haven't seen anyone saying that this is having any negative effects.

  I don't think this is _entirely_ true: p2p stuff definitely
  has, um, interesting effects on, say, voip at home, and some
  of the p2p apps -- especially the earlier ones if I recall
  correctly -- had some pretty nasty effects on various networks.
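
  To put a rough number on the voip case: a home uplink is a
  fixed pipe, a torrent client typically holds many TCP
  connections open at once, and a voip stream is a non-adaptive
  trickle that just needs its slice. A back-of-the-envelope
  sketch in Python -- the uplink speed, connection counts, and
  codec rate below are assumptions for illustration, not
  measurements:

# N bulk TCP flows vs. one fixed-rate voip stream on a home
# uplink. TCP's AIMD roughly equalizes per-connection shares,
# so the voip stream's slice shrinks as the torrent client
# opens more connections.

UPLINK_KBPS = 1000   # assumed 1 Mbit/s home uplink
VOIP_KBPS = 87       # approx. G.711 voice + RTP/UDP/IP overhead

for n_p2p in (4, 10, 40):
    share = UPLINK_KBPS / (n_p2p + 1)   # p2p flows + the voip flow
    verdict = "ok" if share >= VOIP_KBPS else "starved"
    print(f"{n_p2p:2d} p2p flows -> ~{share:5.1f} kbit/s each; "
          f"voip needs {VOIP_KBPS} ({verdict})")

  (And in practice the voip stream is UDP and doesn't back off
  at all, so the first casualty is latency as the uplink queue
  fills -- well before the raw share math says "starved".)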

  Are we to believe these are largely self-healing problems --
  that bad p2p apps will eventually correct themselves, since
  it's in their interest? Is it reasonable to believe there is
  enough general clue out there that they could be expected to
  do that, and that the collective clue of the ietf isn't
  really needed to help this along?

I'll note that many protocols -- good and bad -- spring from
somebody's head. Some of them become successful too. Very
successful. And ietf has no say about them at all. Is this
the new reality?


But for layer 7 protocols, file sharing may be the only major market that has wholly ignored the IETF.

  This isn't that unusual, really, but what fascinates me
  is that the reverse seems true as well.

Yes, if one that has bad congestion control becomes popular. But, given the mindshare of BitTorrent these past few years, that seems pretty unlikely.

  But surely BitTorrent isn't the last one that will come
  along. I guess the base question is this: is the net robust
  enough to really allow experimentation with flash crowds of
  millions of alpha testers? So far it has been, but we're
  layering more and more things onto the net too -- like voip --
  that are pretty sensitive to average expectations (I'm thinking
  about things like Vonage, not managed services). Is that
  a danger for the overall internet architecture? That is, is
  there a price for this benign neglect that has yet to surface?
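
  For what "bad congestion control" means concretely, the usual
  yardstick is the TCP-friendly rate: a flow shouldn't take more
  bandwidth than a conforming TCP would see under the same loss
  rate and RTT. The simple form of that rule (Mathis et al.;
  TFRC in RFC 3448 builds on a fuller version of the same idea)
  is easy to sketch -- the segment size, RTT, and loss rates
  below are illustrative assumptions:

import math

# TCP-friendly rate rule of thumb (Mathis et al., 1997):
#   rate ~= (MSS / RTT) * sqrt(3/2) / sqrt(p)
# where p is the packet loss rate. A "friendly" flow slows
# down as p rises; a flow that ignores p does not.

MSS_BYTES = 1460   # assumed segment size
RTT_SEC = 0.1      # assumed 100 ms round trip

def tcp_friendly_kbps(p):
    bps = (MSS_BYTES * 8 / RTT_SEC) * math.sqrt(1.5) / math.sqrt(p)
    return bps / 1000

for p in (0.001, 0.01, 0.05):
    print(f"loss {p:.1%}: TCP-fair rate ~ {tcp_friendly_kbps(p):6.0f} kbit/s")

# With these numbers, a hypothetical app that blasts a constant
# 2 Mbit/s regardless of loss exceeds its TCP-fair share once
# loss passes roughly 0.5%, squeezing every well-behaved flow
# on the path.

  That's the failure mode to worry about if the next BitTorrent
  isn't as well-behaved as BitTorrent turned out to be.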

		Mike

