Hi Russ,
At 12:28 PM 8/3/2011, Russ Housley wrote:
> I am well aware of the implementation reports. The premise here is
> that the protocol specification is "good enough" if there are at
> least two interoperable implementations and the protocol is deployed
> widely. The implementation report would become optional.

One of the advantages of an implementation report is that it provides
a statement about interoperability between two or more known
implementations. If there is any dispute about that claim, it can be
resolved in a non-controversial way. Determining whether a protocol
is widely deployed is not always a clear-cut decision.

> People are not doing many implementation reports. As you say above,
> there are only about 75 of them. How many protocols are documented
> in RFCs? That is a very low percentage in my view.

Yes, it's a very low percentage. I don't have the figure for the
number of protocols documented. Given the low barrier for such
reports, I would have expected to see more of them. After all, if
the RFC has been published and the protocol has been widely deployed,
it should simply have been a matter of filing a short report.

From draft-housley-two-maturity-levels-08:
"this document measures interoperability through widespread deployment
of multiple implementations from different code bases, thus condensing
the two separate metrics into one."

This change is expected to solve the problem. I am not convinced
that the metrics are the problem.

> So, I see the cost quite differently. Most protocols are published
> as Proposed Standards, and they are never advanced. I'm seeking a
> process where implementation and deployment experience actually
> improves the protocol specifications. Today, that rarely happens,
> and when it does, the [...]

Agreed.
I didn't find any incentive to inject implementation and deployment
experience into the process.

> This is an argument for the status quo. We have decades of
> experience with that not working. That is essentially an argument
> for a single maturity level; that is how the process is really
> working today.

I am not arguing for a single maturity level (the status quo). I do
not agree with the conclusion that the decades of stagnation are due
to the three maturity levels.

> This document is not about IESG review time, except for the
> elimination of the requirement for annual reviews, which are not
> done anyway. If that is what you get from the document, then I have
> done a very poor job in writing it. That is not the point at all.

I don't think that you did a poor job. A process with three maturity
levels requires three IESG Evaluations; a process with two maturity
levels requires two. If more documents moved from Proposed Standard
to the next level, it would obviously take more IESG review time.

I presume that the IESG will only use the following criteria for advancement:
- two independent interoperating implementations with widespread
deployment and successful operational experience
- no errata against the specification
- no unused features in the specification
And there won't be any DISCUSSes along the lines of:
"I don't think the implementation reports are adequate for me to meet the
requirements of 2026. It does not clearly identify what software
was used or
show support of each of the individual options and features."
"Examples througout the document make use of non-example domains."
"The implementation report is woefully inadequate to document there are
interoperable implementations of all the features from two different
code bases."
"My Discuss was not addressed at all - I believe that the WG ignored the
spirit of the implementation report requirement - my Discuss said that
we should know that there are multiple implementations that have
handled the significant changes in the recycling of this Draft Standard.
The group apparently refused to update its implementation report"
Regards,
-sm