I'd like to come back to this point, and try a slightly different direction:
Fred Baker wrote: "The purpose of the IETF is to create high quality, relevant, and timely standards for the Internet."
I think I would state it in these words: "The Internet Engineering Task Force provides a forum for the discussion and development of white papers and specifications for the engineering issues of the Internet."
This seems like a reasonable characterization of the output of the IETF.
However, it doesn't seem to capture some of the scoping/delimiting that the original text did - does the IETF discuss any and all such issues? Is it trying to achieve anything in particular by documenting things? (How) can we detect that there are issues we should be discussing but cannot?
(How) would you add to your text to provide some boundaries or guidelines?
Thinking out loud here; there is plenty of room for all to chime in. The key differences, if there are any, between the IETF and NANOG and her sisters, and between the IETF and the IRTF, are:
Operationally, IETF discussions offer advice to operators (service provider and enterprise, the latter being a group that our operational friends sometimes seem to forget) from the individuals who participated in the discussion, as opposed to being discussions among operators. The latter are perfectly welcome and do happen (ptomaine, v6ops), but I would characterize them as more often the domain of NANOG and her sisters.
From a research/innovation perspective, I would characterize the IETF as creating solutions that we in some sense know how to build, as opposed to playing with and learning about possible solutions that we are unsure how to properly build. When we built OSPF, to name one example, it was not built from whole cloth; the algorithms were already well defined - we were simply figuring out how to use them. You can say quite accurately that we make our share of mistakes even in what we supposedly know how to do, but at least we can recognize when we do so. In research, 90% of ideas are truly bad ideas, and the other 10% are testing grounds for bits and pieces that will some day contribute to the solution. That is *normal*; research is *supposed* to be risky, to be "out of the box". When the researchers have done their job and the engineers come to build a solution for the Internet, the engineers rarely if ever simply adopt research proposals. But they are guided by the wisdom learned in that community, and if they lack that compass, they are quickly lost.
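(As an aside, purely to illustrate what "already well defined" means here: below is a minimal sketch, in Python, of the shortest-path-first computation - Dijkstra's algorithm - that OSPF's route calculation rests on. The tiny topology and the names in it are made up for the example; this is not OSPF, just the sort of textbook algorithm the engineering work got to reuse.)

    # Illustrative sketch only: Dijkstra's shortest-path-first algorithm,
    # the well-defined computation that OSPF's route calculation builds on.
    # The topology below is a made-up example, not anything from OSPF itself.
    import heapq

    def spf(graph, source):
        """Return the lowest-cost distance from 'source' to every reachable node."""
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            cost, node = heapq.heappop(heap)
            if cost > dist.get(node, float('inf')):
                continue  # stale heap entry; a shorter path was already found
            for neighbor, link_cost in graph[node]:
                new_cost = cost + link_cost
                if new_cost < dist.get(neighbor, float('inf')):
                    dist[neighbor] = new_cost
                    heapq.heappush(heap, (new_cost, neighbor))
        return dist

    # Hypothetical topology: router -> list of (neighbor, link cost) pairs
    topology = {
        'A': [('B', 1), ('C', 4)],
        'B': [('A', 1), ('C', 2)],
        'C': [('A', 4), ('B', 2)],
    }
    print(spf(topology, 'A'))   # -> {'A': 0, 'B': 1, 'C': 3}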
I am reminded of an academic researcher who once complained to me that "you write too many RFCs. You write the RFC, and we start our research on that RFC. We get part way into it, and you publish a new RFC." I replied that the idea is that he is supposed to do the research *before* I write the RFC, so that when I write the RFC I write it once and it is right. If he thinks he is doing the research afterwards, then in reality I am doing the research and the engineering together, in my customer's networks, and he is doing Quality Assurance.
We need the researchers, desperately, but not to do the engineering. We need them to tell us how, to be the pathfinders. We need the operational folks equally desperately. If nothing else, they are the canaries in the mine shaft, and they often can tell us what the mother lode looks like when we see it in the rough. And yes, we need the engineers that are paid by the vendors. Their marketing people are IMHO unwelcome, because they set one person against another for their company's gain, and a house divided against itself cannot stand. But if you think for a minute that the operators and the researchers can do this Internet thing without the products the vendors build, and the engineers that build them, you are sadly mistaken.
I am reminded of comments that I have heard in various parts of each of those organizations. Dave Clark, speaking to the Internet2 Joint Techs last Tuesday, said that the IETF had forsaken innovation and had been overtaken by the [evil] vendors. I submit that we - the academics, students, and researchers, the edge network operators that deliver applications running in a network, the transit network operators that deliver bandwidth to interconnect edges, and the vendors, whose products inhabit all of those networks - may not all meet together at the same time or with the same purpose, and we certainly all see things from different perspectives and in different ways. To that extent, perhaps he has a point. But we each play a part in the play.
Our strength and our value is not, however, that those of our particular stripe are somehow better than others, or more needed, or have a better level of understanding - that rock breaks scissors, that scissors cut paper, or that paper covers the rock. Our strength, rather, is that we bring our various perspectives together, as a builder brings together brick and mortar, the flexible strength of the board, the mass of the hammer, and the steel of the nail, and build something that will withstand the test of operational experience.
Let me try to say all that succinctly:
"The Internet Engineering Task Force provides a forum for the discussion and development of white papers and specifications for the engineering issues of the Internet. This discussion builds on hard lessons learned in research and operational environments, and necessarily includes speakers from those communities. Vendors offer wisdom on what can be built and made to work in their products, and may bring customer or market issues whose owners cannot or will not bring themselves.
The intended goal is well characterized as 'community memory' - written observations and wisdom, as well as defined protocols and operational procedures - to enable the datagram internet to scalably deliver relevant services in transit and edge networks."
Is that on target? Is it too many words?