Re: Purpose of IESG Review

On 15/04/13 15:45, Brian E Carpenter wrote:
On 15/04/2013 15:23, Ted Lemon wrote:

...
So in practice, although I feel great sympathy for this position, I think it's mistaken. I want the other ADs to comment on anything that they notice that looks like a problem.
There's an important class of problem that can only be found by someone
who is *not* a specialist - that is to say, wording that's perfectly
clear and unambiguous to someone very familiar with the topic, but
quite unclear to someone who isn't. This matters because we (presumably)
want our specifications to be useful to people who are implementing or
deploying them without already being members of the inner circle.

Just to be specific, here is a piece of text that came out of a WG
not so long ago. I have bowdlerised it:

"The Foobar standards [RFCxxx], [RFCyyy] provide useful generic
functionality like blah, blah and blah for reducing the overhead in
Boofar networks.  This functionality can be partly applied to Bleep."

That was it - a third party implementing Bleep was apparently supposed
to guess which bits of those RFCs applied where.

This led to a DISCUSS and seven months of delay before that "partly"
was disambiguated. Was that inappropriate out-of-area review?

    Brian
On 15/04/13 14:09, Jari Arkko wrote:
[Responding to Dave Crocker]
But what is tiring about this line of justification is its continuing failure to balance the analysis by looking at the costs and problems that come with it.  Does it provide regular and sufficient benefit to justify its considerable costs?
Agreed.
I would say it usually does justify its costs - skimping on the QA is almost always a bad idea. Speaking also from a gen-art perspective, getting a reasonable level of clarity and ease of use into a document may take a while for the reviewers and authors involved, but any balance of costs and benefits also has to include the effort of the implementers, testers and future users who will not appreciate a product that fails to interoperate because the spec was unclear, or because they missed a point that was 'obvious' only to the original authors. The affected population is much bigger after publication (at least if anybody actually cares about the RFC).

I have to say that, in my experience, the seven months of delay that Brian cites is rather unusual. For the vast majority of documents, the sorts of clarification needed can be sorted out in a couple of rounds of email, and the results are unlikely to require recycling the document back to the WG. That said, there certainly are documents where I wonder why anybody seriously thought they were ready for publication; fixing the 'tail heaviness' of the process (as Jari describes it) would help, if we could find a way to do it. It could be argued that the tail of the process is just too polite: once a document embarks on the publication process, we seemingly inevitably send it through all those reviews, resulting in multitudinous politely phrased DISCUSSes and comments, when what it really needs is a blunt refusal followed by some competent editing.

/Elwyn



