> Behalf Of Michael Thomas
> This is more or less what I had in mind. Correct me if
> I'm wrong, but http 1.0 wasn't the invention of the ietf,
> but sprang forth outside of its purview. Http 1.1 was a
> response to the many difficulties placed on the net because
> of http 1.0, and there was an active feedback loop between
> the http world and the net (ietf) world to adapt both at
> layer 7 as well as below. Http, after all, was The Big Thing
> for all parties, so it's not surprising that there was active
> cross interest.

True enough, the 1.0 draft was mostly developed before it was submitted to the IETF. But the differences between 1.0 and 1.1 are hardly 'save the net' stuff. The only really significant change is the introduction of the Host header, and that was a unilateral action by Netscape without any external discussion. The rest of the 1.1 changes are mostly incremental. The chunking mechanism is an improvement, and it makes keep-alive possible, but that too was proposed independently of the IETF (there is a short sketch of both points at the end of this note).

The majority of the WG effort was spent perfecting the cache mechanism to work with proxies. But these days we do not use client-side proxies to any real extent, the exception being transparent firewall proxies, and most Web content is dynamic, with very short expiry times. So yes, the WG effort was useful, but it certainly did not 'save the Internet'. Nor was that ever going to be possible: Netscape made it clear from day one that it would act unilaterally to introduce its own standards. It was only after they ceased to be the dominant browser that they began discussing proposed changes to their version of the spec before releasing code.

It is arguable that things could have been improved if we had started earlier. The digest authentication mechanism only exists because of the IETF attempt to eliminate en clair transmission of passwords (second sketch at the end). Unfortunately very few web sites use it; almost all use an HTML form field instead.

> What facinates me about p2p is that it was clearly the
> next Big Thing, but there seems to be no feedback loop
> operating whatsoever. I guess that surprises me. Thomas
> brought up qos/diffserv and operator business models which is
> certainly something ietf clue level could assist on, but it
> seems that we neither know them, nor do they know us and that
> both sets of people seem satisfied with that. I'm not saying
> that it's bad -- it's just a very surprising outcome. Ought
> both sides be that confident that the net as engineered today
> is what it needs to be for this Big Thing and the Big Thing
> after that? Or that our fertilization is really the correct
> mix to prepare the ground for the next Big Thing?

I think the problem you identify here is that the focus of the IAB is inward, looking at what the IETF is doing. It does not look outward at what the wider Internet community is doing. It is simply assumed that they are the same thing.

* We did not have a note from the IAB in 2000 saying 'hey, this spam thing that was predicted as a possibility in 1982 is beginning to get really out of hand and become a criminal nuisance'.

* We have not had architectural guidance of the form 'stop pushing BEEP onto IETF working groups; the Web Services platforms have unanimously adopted SOAP and the WS-* stack'.

* We have not had information of the form 'phishing and social engineering are becoming major engines of Internet crime'.

* We have not had any real analysis of the botnet problem.
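On the Host header and keep-alive point above, a rough sketch of why that one header mattered so much: it is the only thing that lets a single address serve many named sites, and a persistent connection can then carry several such requests in a row. Plain Python sockets; the address and both host names below are made up for illustration, not real servers.

# Illustration only: the Host header plus an HTTP/1.1 keep-alive connection.
# 192.0.2.10 and both host names are placeholders, not real servers.
import socket

SERVER_IP = "192.0.2.10"
NAMES = ["www.example.com", "www.example.org"]   # two sites sharing one IP

with socket.create_connection((SERVER_IP, 80)) as sock:
    for name in NAMES:
        # The Host header is the only thing telling the server which of the
        # co-hosted sites this request is actually for.
        request = (
            "GET / HTTP/1.1\r\n"
            f"Host: {name}\r\n"
            "Connection: keep-alive\r\n"
            "\r\n"
        )
        sock.sendall(request.encode("ascii"))
        # Chunked transfer encoding lets the server delimit each response
        # without closing the connection, so the same TCP session is reused
        # for the next request.
        print(sock.recv(4096).decode("latin-1", errors="replace"))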
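As for digest, the whole point was that the password itself never crosses the wire, only a hash bound to a server nonce. A minimal sketch of the RFC 2617 calculation, with made-up credentials and a made-up nonce (no qop, for brevity):

# Sketch of an HTTP digest response per RFC 2617 (no qop). Every value here
# is invented; a real client takes realm and nonce from the server's
# WWW-Authenticate challenge.
import hashlib

def md5_hex(s):
    return hashlib.md5(s.encode("utf-8")).hexdigest()

username, password = "alice", "secret"            # hypothetical credentials
realm = "example.com"
nonce = "dcd98b7102dd2f0e8b11d0f600bfb0c093"      # issued by the server
method, uri = "GET", "/private/index.html"

ha1 = md5_hex(f"{username}:{realm}:{password}")
ha2 = md5_hex(f"{method}:{uri}")
response = md5_hex(f"{ha1}:{nonce}:{ha2}")

# Only the hashed response is sent; an HTML form login would have sent
# 'secret' en clair instead.
print(f'Authorization: Digest username="{username}", realm="{realm}", '
      f'nonce="{nonce}", uri="{uri}", response="{response}"')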