The working group concerned adopted a working practice of creating test cases for any significant decision that it was required to make. One of the observations that underpinned this approach was that, given text specifying some awkward technical detail, it was usually possible for reasonable implementers to interpret it differently. Not so for test cases.
The result was a process analogous to test-led software development, in which the specific test cases (concerning which it was far easier to unambiguously determine WG consensus) were used to drive the adoption of new or changed specification text. A side effect of all this was that we ended up with a suite of test cases that could be used to judge the extent to which the specification was consistently implementable.
With these test cases available alongside the (purportedly) complete specification, the process for moving from the [equivalent of] Proposed to the [equivalent of] Draft is relatively short ... typically a few months, it seems ... in marked contrast to the long delay for IETF specs to make the same transition.
With this process, a substantial element of the particular "hardest part" that Keith notes is assembled by the working group as the specification is being developed.
#g --
At 13:26 08/03/04 -0500, Keith Moore wrote:
It's all well and good to try to retire Proposed Standard documents that don't get implemented. But I think it's even more important to make it easier for documents that do meet the criteria to advance to Draft Standard. In my experience the hardest part of getting a document advanced is to collect the implementation report.
Hence this modest proposal:
- For each standards-track document, create a web page that is used to keep track of bug reports, errata, implementation reports, and test reports. (yes, I know about the RFC Editor's errata page - this might be a modification of that or it might be something else entirely)
- Allow implementors to submit reports via a form on that web page:
  - An implementation report would name an implementation and specify what features it implemented
  - A test report would, for a given set of implementations, specify which features were tested and whether they interoperated
- Allow ADs to designate one or more people to review implementation reports (to eliminate duplicates and cull out bogus reports)
- At adoption time + 2 years, every PS document would be Last Called for Draft Standard, for a period of 4 weeks. This would serve as:
  - a final notice to submit implementation reports to the web site
  - a final notice to submit bug reports and errata to the web site
- At the end of the Last Call period the shepherding AD would review the implementation reports and bug reports and make a recommendation to the IESG (similar to the AD writeup) to either:
  - approve document as-is
  - submit to author or WG for updates
  - recommend that the document be reclassified as historic, experimental, or informational
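To make the shape of the proposed reports concrete, here is a minimal sketch of how the two report types might be modeled. All names here (the classes, the `results` mapping, the helper function) are hypothetical illustrations, not anything the proposal actually specifies:

```python
from dataclasses import dataclass

@dataclass
class ImplementationReport:
    """Names an implementation and the spec features it implements."""
    implementation: str
    features: set[str]

@dataclass
class TestReport:
    """For a given set of implementations, records which features were
    tested and whether the implementations interoperated on each."""
    implementations: list[str]
    results: dict[str, bool]  # feature name -> did all parties interoperate?

def interoperable_features(report: TestReport) -> set[str]:
    """Features on which every participating implementation interoperated."""
    return {feature for feature, ok in report.results.items() if ok}
```

A reviewer designated by the ADs could then de-duplicate submitted `ImplementationReport`s by their `implementation` field and use `interoperable_features` across `TestReport`s as raw input to the Draft Standard recommendation.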
Keith
p.s. The hardest part of this (and often, the hardest part of interop testing) is defining exactly what tests are needed, especially when features interact or when there are more than two parties participating in a protocol at the same time. Ideally each PS would specify what implementation tests were needed to move the specification to DS, and these would be published along with the specification. But that will have to wait awhile...
_______________________________________________
This message was passed through ietf_censored@xxxxxxxxxxxxxxxxxxxx, which is a sublist of ietf@xxxxxxxxx Not all messages are passed. Decisions on what to pass are made solely by IETF_CENSORED ML Administrator (ietf_admin@xxxxxxxx).
------------ Graham Klyne For email: http://www.ninebynine.org/#Contact