Re: adapting IETF in light of github and similar tools

On 20-Apr-21 10:22, Keith Moore wrote:
> On 4/19/21 5:34 PM, Richard Shockey wrote:
> 
>> RS> GitHub.  The paradigm has shifted. It's no longer "rough consensus and running code." It's just running code.  Look at WebRTC as an example.
> 
> Though at least at one time, the emphasis wasn't merely on "running 
> code" so much as on "specifications that permit multiple running code 
> implementations to interoperate".
> 
> These days there's less variation between platforms, so often it's at 
> least possible for different implementations of some applications to 
> share the same code base.   And with automatic software update, 
> applications can be upgraded to support new protocol features without 
> needing to get agreement on a protocol specification.  Rather than get 
> rough consensus on the change, whoever controls the repository 
> effectively wins.   

And if you're the least bit concerned about anti-trust laws, you would
want to stay well away from that control point. That's a fairly strong
argument for why you need open standards as well as open source.

(Though forking does happen sometimes.)
> 
> WebRTC is certainly an example of something, but people might have 
> different opinions about whether it's a Good Thing.   At least the last 
> time I looked, it was extremely complex and difficult to implement from 
> a specification.  So it tends to be tied to these bloated 
> privacy-violating programs called web browsers that have a huge attack 
> surface and require very significant resources.

And it's no coincidence that WebRTC led to the largest and probably
trickiest set of interdependent RFCs in the history of the Internet.
 
> But I do think it's an example of how the paradigm has shifted, at least 
> for some kinds of applications.    How can IETF respond constructively 
> to that shift?    Somehow doing everything the GitHub way doesn't seem 
> like adding value.  But there's even more demand than ever to be nimble, 
> to make huge changes quickly and without much time for review.   And yet 
> in some areas, particularly security, and probably also other 
> operational considerations, the need for careful design and 
> implementation is greater than ever.   Could we restructure our 
> processes to change the emphasis in our reviews to make sure we're 
> adding value in these neglected areas?

I thought we already did that by instituting the area review teams.
It might be time to review whether that's working properly.

> Would it make sense to make 
> running code part of our official work product, produced concurrently 
> with the specification and reviewed for consistency with the 
> specification?

Again I think anti-trust would be a factor, although we'd need legal
advice on that. Also possible liability concerns, ditto. However,
I am disappointed by the low uptake of BCP205/RFC7942. We need to
do better.

> Could we make greater use of protocol specification 
> languages to reduce the difference between specification and running code?

In theory we could, but it's far from easy. Take CBOR-based protocols for
a start. Yes, you can use CDDL for a formal specification of the message
formats, and you can in theory verify that the messages conform; you could
presumably write a CDDL-driven message parser. But none of that models
the protocol's state machine. That would be a whole new level of formalism.
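
To make the gap concrete, here's a rough Python sketch (hypothetical
"hello"/"data" message names, using the cbor2 library) of what a
format check buys you and what it doesn't:

import cbor2

# Format checks in the spirit of CDDL: is this decoded CBOR map shaped
# like a "hello" or a "data" message? (Message names are made up.)
def valid_hello(msg):
    return (isinstance(msg, dict)
            and msg.get("type") == "hello"
            and isinstance(msg.get("version"), int))

def valid_data(msg):
    return (isinstance(msg, dict)
            and msg.get("type") == "data"
            and isinstance(msg.get("payload"), bytes))

# The ordering rule -- "data" is only legal after "hello" -- is the
# state machine, and it has to be written by hand; no message-format
# language checks it for you.
def run(messages):
    state = "START"
    for raw in messages:
        msg = cbor2.loads(raw)
        if state == "START" and valid_hello(msg):
            state = "ESTABLISHED"
        elif state == "ESTABLISHED" and valid_data(msg):
            pass  # accept the payload
        else:
            raise ValueError("protocol violation in state " + state)
    return state

msgs = [cbor2.dumps({"type": "hello", "version": 1}),
        cbor2.dumps({"type": "data", "payload": b"x"})]
print(run(msgs))  # ESTABLISHED; reverse the list and it raises,
                  # even though both messages are individually well-formed

The format checks stand in for what CDDL would give you automatically;
the ordering rule is exactly the part with no formal home today.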

People are trying, however: https://doi.org/10.1145/3341302.3342087

   Brian

> 
> Some of these things are already happening, of course, but could/should 
> we make them a more explicit part of our process?
> 
> Keith