I would agree with Charles; this is what we have in mind for a large open source project such as ONAP.
Thanks,
Ramki
On Sat, Nov 4, 2017 at 7:18 PM, Charles Eckel (eckelcu) <eckelcu@xxxxxxxxx> wrote:
This is true in some cases, but not in others. There has been and continues to be a big push in open source software, and in software in general, to create applications that are composed of many small pieces that interact with each other through well-defined interfaces. This leads to more modular software and better code/component reuse. This practice results in code that is more appropriate as a normative reference. I would expect the normative reference to be to a specific version or release of the software and its corresponding API(s). Those APIs could change with future versions of the software, but as with standards, we tend to try pretty hard to avoid non-backward-compatible API changes after the software has reached a state of maturity where others are building on top of it and relying on those APIs.
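[As a concrete illustration of the versioning convention Charles alludes to (my own sketch, not anything from the thread): many open source projects follow semantic versioning, where only a major-version bump may break backward compatibility, so a normative reference can safely point at "version 1.x" of a component's API.]

```python
# Hypothetical illustration of semantic versioning ("MAJOR.MINOR.PATCH"),
# one widely used convention for signalling API compatibility.
# A change that breaks callers must bump the MAJOR number.

def is_backward_compatible(old, new):
    """True if `new` keeps the API contract that `old` established.

    Versions are (major, minor, patch) tuples. Under semver, only a
    major-version bump may break backward compatibility, and an older
    release cannot satisfy a reference to a newer one.
    """
    return new[0] == old[0] and new >= old
```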
Cheers,
Charles
-----Original Message-----
From: ietf <ietf-bounces@xxxxxxxx> on behalf of Keith Moore <moore@xxxxxxxxxxxxxxxxxxxx>
Date: Wednesday, November 1, 2017 at 4:10 PM
To: Alia Atlas <akatlas@xxxxxxxxx>
Cc: "ietf@xxxxxxxx" <ietf@xxxxxxxx>
Subject: Re: letting IETF build on top of Open Source technology
It seems to me that a fundamental problem with referencing Open Source software is that in much (not all) of the Open Source world, rapid innovation and evolution are valued more than stable protocol interfaces. There is a tendency to rely on users to frequently update their software, which is seen as giving developers freedom to change interfaces on relatively short notice.
Of course that's an oversimplification. In practice good open source developers are careful to avoid abrupt changes to interfaces; instead they create new interfaces and deprecate old ones. But since one of the oft-touted benefits of Open Source is to allow software systems to evolve rapidly, this sometimes also results in new protocol features being deployed rapidly to the point that it impairs interoperability between new and old implementations. This causes a great many problems other than just protocol interoperability failures, including dependency hell.
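[The deprecate-rather-than-break pattern Keith describes might look like this in practice. A hypothetical sketch; the function names are invented:]

```python
import warnings

def fetch_v2(url, timeout=30):
    """New interface: adds an explicit timeout parameter."""
    return f"GET {url} (timeout={timeout}s)"

def fetch(url):
    """Old interface: kept working, but marked deprecated so callers
    get advance warning before it is eventually removed."""
    warnings.warn(
        "fetch() is deprecated; use fetch_v2() instead",
        DeprecationWarning,
        stacklevel=2,
    )
    return fetch_v2(url)
```

[Old callers keep working while the warning nudges them toward the new interface, instead of an abrupt removal breaking them on short notice.]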
So part of the effort to improve cooperation between IETF and Open Source developers - who tend to have similar motivations even if their methods differ - might be in convincing them that they would benefit from more stable interfaces - at least on the wire - or at least encouraging them to define protocols so that they are cleanly extensible, and defining a baseline for interoperability even as they extend protocols to accommodate new features. And actually it might be the case that IETF could learn something about protocol extensibility from people who have struggled to accommodate rapid evolution of their software.
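[A sketch of what "cleanly extensible" can mean on the wire (my own illustration; the field types are invented): a type-length-value encoding lets old implementations skip fields they do not understand, so new features can be added while the baseline still interoperates.]

```python
import struct

# Each field is encoded as type (2 bytes), length (2 bytes), value.
# Decoders skip unknown types instead of failing, which is what lets
# the protocol grow without breaking older implementations.

KNOWN_TYPES = {1: "username", 2: "message"}

def encode(fields):
    """Encode (type, value-bytes) pairs as a TLV byte string."""
    out = b""
    for ftype, value in fields:
        out += struct.pack("!HH", ftype, len(value)) + value
    return out

def decode(data):
    """Decode known fields; silently skip unknown ones."""
    result = {}
    offset = 0
    while offset < len(data):
        ftype, length = struct.unpack_from("!HH", data, offset)
        offset += 4
        value = data[offset:offset + length]
        offset += length
        if ftype in KNOWN_TYPES:
            result[KNOWN_TYPES[ftype]] = value
        # unknown field type: skipped, preserving interoperability
    return result
```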
Keith