Re: deprecating Postel's principle - considered harmful

Martin:

> On 13 May 2019, at 12:29, Martin Thomson <mt@xxxxxxxxxxxxxx> wrote:
> 
> On Mon, May 13, 2019, at 19:57, Eliot Lear wrote:
>> Indeed. Those engaging in new technology areas need Postel’s Law in
>> order to prove the concept. The trick is catching the shift from PoC to
>> real stuff so that you don’t end up in the wrong equilibrium, of which
>> there are two: stuck with poorly behaving implementations, or
>> under-deployed due to too much friction.
> 
> This might have been a fair analysis in the 80s or 90s, but I suspect that there has been a gradual shift away from deployment of less mature protocols.

It’s a fair analysis today.  We can and should encourage people to use mature protocols, where possible.  And let’s face it: HTTP+REST+JSON+TLS makes for a nice general model.  As a TCP/UDP port reviewer, I can tell you that my phone isn’t exactly ringing off the hook.  But even new uses of those technologies can have new implications that require tinkering.  I say this as someone tinkering with new stuff today.
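To be concrete about why that stack is attractive: a client can speak all four in a handful of lines of stock Python. (A minimal sketch; the endpoint and field names below are invented, just to show the shape of it.)

    # HTTP+REST+JSON+TLS with nothing but the standard library.
    # The endpoint and fields are hypothetical, for illustration only.
    import json
    import urllib.request

    # TLS is implied by the https:// scheme; certificate verification
    # is on by default in modern Python.
    url = "https://api.example.com/v1/devices/42"

    with urllib.request.urlopen(url) as resp:
        device = json.load(resp)   # parse the JSON response body

    print(device.get("status"))

That is a lot of leverage for very little machinery, which is exactly why we should steer people toward it where it fits.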

> Today, I'd say that it is more often the case that standardization is occurring closer to the "mature" end of things.


I’m not talking about a new bell or whistle on an existing protocol.  I’m talking about new concepts.  They don’t come around too often, but they do come around.  This is happening today throughout the IoT space, in many dimensions.  Worse, IoT manufacturers will rarely implement without a standard, simply due to real-estate constraints (memory, NVRAM, CPU).  But that doesn’t mean that the standard is all THAT mature.  Remember: HTTP got redone after 25 years of experience on non-constrained devices.  Think we’re going to get it all right on constrained devices in much less time?


> This is partly because activation energy for new stuff is fairly high, and so high-friction standardization isn't that much of a delta.
> 


That’s your opinion.  One of the biggest problems in the vertical I’m in is competing standards that drive up costs.


> If nothing else, the security requirements the network imposes on new protocols are enough to make careful standardization look relatively cheap.


With new concepts, and even new or aggregated uses of old concepts, security is often the last concern.  The first concern is, “Can we make the function work at all?”  The second is, “Can we make it scale?”  Security chimes in well after that.  That is reality.  You may not like it, but the newspapers are littered with examples.  Don’t believe me?  Open today’s Wall Street Journal and read about the state of healthcare technology.

And so: what are you writing this for?  A new knob or feature, or a new concept?  Maybe both.  That’s okay, but be clear.

Eliot


