Re: "why I quit writing internet standards"


 



On Apr 17, 2014, at 10:30 AM, Benoit Claise <bclaise@xxxxxxxxx> wrote:

> On 17/04/2014 02:28, Thomas Nadeau wrote:
>> On Apr 14, 2014, at 12:07 PM, Alia Atlas <akatlas@xxxxxxxxx> wrote:
>> 
>>> On Mon, Apr 14, 2014 at 11:57 AM, David Meyer <dmm@xxxxxxxxx> wrote:
>>>> On Mon, Apr 14, 2014 at 8:08 AM, George, Wes <wesley.george@xxxxxxxxxxx> wrote:
>>>>> I’m surprised that no one has sent this out yet:
>>>>> http://gigaom.com/2014/04/12/why-i-quit-writing-internet-standards/
>>>>> 
>>>>> "Summary: After contributing to standards organizations for more than seven
>>>>> years, engineer Vidya Narayanan decided it was time to move on. Although she
>>>>> still believes that these organizations make the Internet a better place,
>>>>> she wonders about the pace of change versus the pace of organizations."
>>>>> 
>>>>> My thoughts-
>>>>> 
>>>>> There are some nuggets of truth in what she says in this article, and in
>>>>> some of the comments. I think that the problems are real, so there’s value
>>>>> in taking the criticism constructively, despite the fact that the author
>>>>> chose to focus on the problems without any suggestions of solutions.
>>>>> 
>>>>> "while the pace at which standards are written hasn’t changed in many years,
>>>>> the pace at which the real world adopts software has become orders of
>>>>> magnitude faster."
>>>>> …
>>>>> "Running code and rough consensus, the motto of the IETF, used to be
>>>>> realizable at some point. … In the name of consensus, we debate frivolous
>>>>> details forever. In the name of patents, we never finish.”
>>>>> …
>>>>> "Unless these standards organizations make radical shifts towards
>>>>> practicality, their relevance will soon be questionable.”
>>>>> 
>>>>> I don’t have too many big ideas about how to fix these problems, but I’ll at least
>>>>> take a crack at it in order to spur discussion. My paraphrase of the problem
>>>>> and some discussion follows.
>>>>> 
>>>>> - We’ve lost sight of consensus and are too often derailed by a vocal
>>>>> minority of those willing to endlessly debate a point.
>>>>> 
>>>>> Part of the solution to that is reiterating what consensus is and is not,
>>>>> such as draft-resnick-on-consensus so that we don’t confuse a need for
>>>>> consensus with a need for unanimity. Part of the solution is IETF leadership
>>>>> helping to identify when we have rough consensus encumbered by a debate that
>>>>> will never resolve itself, without quieting actual disagreement that needs
>>>>> continued discussion in order to find a compromise. I don’t have good
>>>>> suggestions on how to make that second half better.
>>>>> 
>>>>> - We don’t have nearly enough focus on running code as the thing that helps
>>>>> to ensure that we’re using our limited cycles on getting the right things
>>>>> out expediently, and either getting the design right the first time, or
>>>>> failing quickly and iterating to improve.
>>>>> 
>>>>> The solution here may be that we need to be much more aggressive at
>>>>> expecting any standards track documents to have running code much earlier in
>>>>> the process. The other part of that is to renew our focus on actual interop
>>>>> standards work, probably by charter or in-group feedback, and shift focus away
>>>>> from BCP and info documents. Perhaps when considering whether to proceed
>>>>> with a given document, we need a test as to whether it’s actively
>>>>> helpful/needed and ensure that we know what audience would be looking at it,
>>>>> rather than simply ensuring that it is “not harmful” and mostly within the
>>>>> WG’s chartered focus.
>>>> My friend @colin_dixon pointed this out to me yesterday, and I've been
>>>> giving it quite a bit of thought since then (I have a nascent blog on
>>>> the topic of how open source and standards orgs might
>>>> productively/efficiently work together; follow up to
>>>> http://www.sdncentral.com/education/david-meyer-reflections-opendaylight-open-source-project-brocade/2014/03).
>>>> 
>>>> What I can say is that after seeing the kind of progress that several
>>>> open source communities make (they do epitomize the best of the IETF's
>>>> running code/rough consensus ethic), one does have to wonder if
>>>> traditional standards making is either obsolete or in dire need of a
>>>> makeover. What is needed, IMO, is a reimagining of how the standards
>>>> process interacts with the open source movement, specifically focused
>>>> on how they can complement one another.
>>> [Alia] It would be very useful to have a functional model for how the
>>> two can complement each other.  We also tend to talk about open-source
>>> as a single monolith - when it can have very different models for
>>> accepting changes, how and by whom the community is run, who is really
>>> participating (open source doesn't mean non-corporate), etc.   Some of
>>> what the IETF does is the architecture and requirements thinking about
>>> how the solution should fit in - while some of the open-source work is
>>> about getting a solution implemented ASAP.   IMHO, a spiral between the
>>> two is useful, with an easy way of interacting.  With I2RS, as a WG chair, I
>>> suggested having experimental drafts describing solutions that were
>>> being implemented - but I haven't seen any.   A question is what is
>>> needed to encourage the interactions.
>> 	I think this is where things (start to) go wrong with the existing IETF model. The idea that we need drafts of everything is an axiom here that we should challenge. In open source, the code, its associated wiki pages, and perhaps auto-generated docs serve as the draft and the implementation agreement/documentation in question.
>> 
>> 	I would also challenge what has become the common new WG "formula" these days: the notion that requirements, problem statements, or architectures are always needed and must come before the WG can work on any protocols/solutions. These are heavy burdens that carry the price of slowing things down. If we want to pay that price, we'd better get something good in return. However, the road is littered with examples of recent WGs getting nowhere after several years of work. These groups often end up in this state because, frankly, many of the people involved in those efforts are completely disconnected from the reality of building the things they are apparently designing. In many cases they have never built anything either, so much of the effort goes into explaining to them why things need to be a certain way, etc. Unfortunately, their "veto" or influence in the process also dilutes the simplicity that an implementation would reveal (e.g., useless options).  It's clearly the difference between theory and practice, which I think we all understand is a real thing.   The end result is that this impedance is one quantifiable reason why people have looked to other avenues, such as open source, to build interoperable systems/tools/etc. quickly.
>> 
>> 	The solution: join the IETF process to the open source communities *where it makes sense, but not otherwise*. Open source, BTW, is what we used to call rough consensus + running code around here!
>> If we stick to interfaces that need "on the wire" interoperability, we are in the sweet spot. If we wade into areas where there is little value added - such as requirements/problem statements/architectures - let's not waste our time.
>> For example, there are clearly cases where an RFC standard for something does not make sense. OpenStack is a good example: the APIs are there, implemented in code you can see, and documented. There is a well-defined change process that identifies stable release points and also allows for quick iteration on the code/interfaces. Why would anyone ever need an RFC for these things?  Now there really are cases where formal standardization makes sense. As I said, if we stick to interfaces that need "on the wire" interop, we are squarely in a space where both open source and the IETF can add value. For example, there is a lot of code generation in open source based on YANG models these days. Many of the experts on model creation are at the IETF and not necessarily in those communities, so connecting those folks to the model creation can help that running code. In cases where vendors really want interop at the model level, let's bring together the vendors/operators that care, and not only build running code to test/prove that those models make sense, but also run them through the IETF process in parallel.
> If a WG can't agree on which problem(s) it wants to solve, then there is a problem, no?
> And I'm not sure Open Source is the solution, unless everybody says: "here is my view of the problem, let me provide some code for it."

	That is precisely how things work in open source; people huddle together around a problem and write code to solve it. There is not much "management" needed here, nor are there any hard requirements to write problem statements, architectures, frameworks, etc. before that process of code construction can begin.  That is what I am getting at: we need to get back to writing more code around here - code written to build toward some goal. That is ultimately what defines what can be done.  As we iterate through that process, we arrive at the goal.
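
	As a toy illustration of the YANG-driven code generation mentioned above, a minimal model sketch might look like the following. The module name and contents here are invented purely for illustration, and the assumption is that a tool such as pyang can validate the module and render a tree view, with other generators producing language bindings from it:

	    module toy-interfaces {
	      // Hypothetical example module, not a real IETF model.
	      namespace "urn:example:toy-interfaces";
	      prefix toyif;

	      container interfaces {
	        list interface {
	          key "name";
	          // Each interface entry carries a name and an enabled flag.
	          leaf name    { type string; }
	          leaf enabled { type boolean; default true; }
	        }
	      }
	    }

	    // e.g., validate and print the data tree: pyang -f tree toy-interfaces.yang

	The model itself, plus the running code generated from it, then becomes the working artifact that vendors, operators, and a WG can iterate on together.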

> Note: I've been involved in a couple of WGs where defining terminology was a time-consuming but worthwhile effort, in order to produce the requirements/problem statement and architecture documents. We're all engineers, and the fun part is designing protocols or writing code.
> However, I agree that requirements/problem statement/architecture RFCs should be produced much faster within the IETF. And maybe the IESG should be stricter about deadlines for problem statements: if the WG can't agree on the problem statement by date D, the WG is closed.

	It's important to understand, before real work begins, what needs to be done - but not necessarily how, in too much detail, ahead of time; that is what is broken in the process here - we wait literally years before we can do any real work. The IETF's process is analogous to waterfall software design, while open source is more like agile development.  Anyone skilled in the art of modern software engineering knows the trade-offs between these two approaches, as well as which of them is used for modern, large-scale software engineering.

	--Tom


> 
> 
> Regards, Benoit
> 
>> 
>> 	--Tom
>> 
>> 
>> 
>>> [Alia] Diversity of implementation is important, as is stability of a
>>> standard and a shared understanding of how to change/upgrade between
>>> versions.  These don't come automatically via open-source.
>>> 
>>> Regards,
>>> Alia
>>> 
>>> 
> 
> 
> 


