Re: deprecating Postel's principle - considered harmful

Hi Ted,

First, I think Eric’s point about the two main equilibria is a good one.  Please see below.

On 10 May 2019, at 22:32, Ted Hardie <ted.ietf@xxxxxxxxx> wrote:


On Fri, May 10, 2019 at 12:51 PM Eric Rescorla <ekr@xxxxxxxx> wrote:


There seem to be two main equilibria:

1. There's a critical mass of very strict implementations; in this case it's very hard for a non-conformant implementation to enter the ecosystem.
2. There's a critical mass of non-conformant implementations; in this case it's very hard for a strict implementation to enter the ecosystem.

Once you're in one equilibrium or the other, it's very hard to get out.


And from the perspective of the IETF as a standards body, an additional question may be whether an equilibrium has yet been reached.  If it has not, and the implementing groups are actively participating in the standards effort, then I believe Martin's analysis suggests Postel's principle is still correct, though incomplete.  Be conservative in what you send; be liberal in what you accept; be communicative about what went wrong.
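(To make that third clause concrete: a minimal Python sketch of a parser that is liberal but communicative.  The "exactly one space after the colon" rule is invented for the example; the point is the warning, not the grammar.)

    import logging

    log = logging.getLogger("parser")

    def parse_header_line(line: str) -> tuple[str, str]:
        """Parse a 'Name: value' line, liberally but noisily."""
        name, sep, value = line.partition(":")
        if not sep:
            # Be conservative about what we cannot make sense of at all.
            raise ValueError(f"no colon in header line: {line!r}")
        if value.startswith(" "):
            value = value[1:]     # the (hypothetical) conformant form
        else:
            # Liberal acceptance -- plus "be communicative about what
            # went wrong", so the sender's implementers can learn.
            log.warning("missing space after colon in %r; accepting anyway",
                        name)
        return name, value.rstrip("\r\n")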

Yes.  That allows people to learn.  This thread has already noted that the problem may lie with an implementation or with the standard itself.  Again, the latter is considerably more likely with immature standards than with mature ones, for the simple reason that we know what we are doing with the stuff that has been around a while.  We learned how to do TCP congestion management properly well after the TCP standard was written, and we learned how best to handle different forms of email well after RFC 822 came out.  There was a lot of slack in these standards at the beginning.  Things began to tighten up with the publication of RFCs 1122 and 1123, and they continued to tighten with successive revisions of 822.  Pete and friends even found a way to get 8 bits into a 7-bit protocol.  Beat that with a stick.  And you were there, as I recall.
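(For anyone who wasn't there: the trick was MIME's content transfer encodings, which map arbitrary 8-bit data onto the 7-bit-safe ASCII that SMTP already carried.  A minimal Python sketch of the base64 leg:)

    import base64

    payload = "Grüße aus Zürich".encode("utf-8")    # genuinely 8-bit data

    wire = base64.b64encode(payload)                # 7-bit-safe alphabet
    assert all(b < 0x80 for b in wire)              # fits a 7-bit channel

    assert base64.b64decode(wire) == payload        # lossless round trip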



The advantage of that during the development process is that it allows you to test past the error, to see whether other parts of the implementation are correct and behaving as you expect.  Hard failures very early on turn testing into a fully serial process, which may be too slow.  But this approach comes with a very obvious cost, in that it forces implementing groups to participate in the process at least enough to know how to reach out to those whose implementations they are testing.  That tends to favor the well-funded and the well-connected over others, and it may mean that those who cannot participate will find an equilibrium already set before they get to play.
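(A hypothetical sketch of the difference, in Python: a validator that collects diagnostics and keeps testing past each error, against a fail-fast one that serializes the find-fix-rerun loop.)

    def validate(message, checks):
        """Run every check, collecting failures rather than stopping at
        the first, so one test run exercises the whole implementation."""
        problems = []
        for check in checks:
            try:
                check(message)
            except AssertionError as exc:
                problems.append(f"{check.__name__}: {exc}")
        return problems

    def validate_strict(message, checks):
        """Hard-fail on the first problem; every later check waits for a
        fix and a rerun."""
        for check in checks:
            check(message)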

Indeed.  Those engaging in new technology areas need Postel's Law in order to prove the concept.  The trick is catching the shift from PoC to real deployment so that you don't end up in the wrong equilibrium, and there are two bad ones: being stuck with poorly behaving implementations, or being underdeployed because of too much friction.


Lowering that cost is important, as is being honest about whether an effort is currently in an exploratory phase or is documenting an as-built system.  Having chaired working groups that moved between those two without an intervening RFC publication, I can say this is not as easy as it looks, though I believe we are working on some better signposts.

And that’s a good thing.  This thread points out again and again that Postel’s Law was never intended to be the last word on any of this.  It was written in several different contexts, but with an eye toward a particular goal at a particular point in time.

Eliot


