Re: draft-housley-two-maturity-levels-00

In reply to a number of different threads:

* This proposal, however flawed some might think it is, is certainly a much better description of current practice than the process document.

It is a fact that almost every 'IETF Standard' of any consequence was developed before the first meeting of the IETF. Meanwhile, the most consequential IETF protocol developed since is not an IETF Standard. Except for MIBs, the only protocol documents that have achieved Standard status have done so by being grandfathered.

My principal criticism of the existing process has always been that the IESG has been unwilling either to apply it as written or to change the description to match the practice.


* The reluctance to spend time on progression to DRAFT status is not significant.

There is no way that I could justify the time and expense required to progress a spec from Proposed to Draft as matters stand, because it is a mid-point on the way to a destination I do not expect to reach. There is really no audience for which the distinction between Proposed and Draft is going to encourage adoption of a specification.

I would not expect every Internet spec to go through the full process, though. In fact there would be little point if they did. I would expect most Internet specs to stop at stage 1; only foundational specs would go to stage 2, and then only if they had been successful.

For example, TLS and PKIX are clearly very successful and foundational. Stuff gets built on them both all the time. DKIM, on the other hand, is not foundational in the same way, at least not at present. And many of the design decisions taken in DKIM were made for reasons of expediency and are not necessarily something you would want to see copied.

If people do start using the DKIM approach as a foundation for other specs, then we really should go back, revisit some of the design decisions, and progress the spec to the next level. Otherwise there is no point.

Another similar example is DNSSEC, which is not going to be a real standard until it is deployed and being used to actually reject traffic. Any proposal of real consequence is going to have to change during deployment (if deployment succeeds). So having two stages makes sense, to capture the descriptions before and after.


* Current Proposed is actually the original requirement for draft

If you look at current practice, there is in fact usually a third stage, only it occurs before the Proposed RFC is published. In the old days it was acceptable to throw some ideas together and slap out a 'proposed' RFC at an early stage of development. Back then, RFC stood for Request For Comments.

Today that step usually takes place outside the IETF or in Internet-Drafts. In fact, a group of proposers is usually encouraged to have developed something and produced a write-up before going for a BOF. Such documents frequently end up as 'Informational'.

The current requirements for publishing a Proposed Standard are considerably higher than they once were for a Draft Standard.


* Down References do not cause harm

The only criterion for accepting a reference should be that the description is sufficiently accessible and sufficiently well defined to enable interoperable implementations.

I don't think it helped matters in the slightest to delay the publication of specs depending on PKIX while PKIX was revised to Draft Standard. The parts of S/MIME and TLS that depended on PKIX were not the parts that were blocking the progress of PKIX.

Similarly, there should be some language in there to point out that a reference to a 50-year-old expired patent is not a reason to object to a standards proposal. Referencing a patent does not change the liability incurred in the slightest: if the patent is enforceable, it will apply whether it is cited in the text or not. The real reason to avoid references to patents is that patents do not usually describe a process specifically enough to be implementable without ambiguity.


* Internet Standard status should require periodic review

I think it is pretty obvious that the Internet mail standard is not the one described in RFC 821 and RFC 822. It is even pretty obvious that you need to know and implement more than RFC 2821 and RFC 2822 if you want mail to arrive successfully.

Achieving standards status should not be the end of the matter.

Rather than having a third stage in the standards process, I would like to see a periodic examination of standards to check whether what they describe is still a sufficiently complete description of reality.

Any worthwhile standard is going to evolve or die. Today NNTP and FTP are still in the canon, but further development is clearly impossible; both have been overtaken by events.

PKIX is going to remain relevant for some decades yet. But it will grow and change, and as it does, some parts will need pruning.
_______________________________________________
Ietf mailing list
Ietf@xxxxxxxx
https://www.ietf.org/mailman/listinfo/ietf
