Re: conformance testing [was Re: Proposal to revise ISOC's mission statement]


 



--On Saturday, October 28, 2017 07:30 -0400 Russ Housley
<housley@xxxxxxxxxxxx> wrote:

>> On Oct 27, 2017, at 10:00 PM, Brian E Carpenter
>> <brian.e.carpenter@xxxxxxxxx> wrote:
>...
>> Some of us were very badly burned, in one way or another,
>> by formal conformance tests of OSI implementations the
>> best part of 30 years ago. So while I fully support Michael
>> on "facilitating conformance testing" and would even insert
>> the word "rigorous", I would be very cautious about "formal
>> methods" (except for things like MIB modules and YANG, where
>> clearly a formal check is required).

As Carsten pointed out, many people have suggested that formal
methods and the way conformance testing was carried out were
what killed OSI.   Personally, having been involved in the
oversight process at the time, I think there was a very long
list of reasons, any one or two of which might have been
sufficient.  While it has little to do with this discussion, at
least one of the others may be instructive for the IETF today:
complex protocols, with many cases and options, are a bad idea,
especially when they are caused by "you can have yours if I get
mine" compromises.  Those also tend to spawn checklists and
conformity tests.

>> Interoperability remains, IMHO, much more important than
>> formal correctness. Implementations can be formally correct
>> but faulty in practice.
 
>> In any case, I don't think that distinction is relevant to the
>> ISOC mission, which needs to retain some level of abstraction.
> 
> Care must be taken that ISOC does not create an unfortunate
> feedback loop by sponsoring standards development and
> conformance testing of implementations.  Testing should
> provide feedback into the standards process, but testing
> should not drive the standards process.

I agree but would have said something a bit different.  Maybe
they are ultimately the same thing.  

First, if protocols are straightforward, with few options and
clear descriptions, conformity tests are rarely needed, although
implementations (at least of proof-of-concept quality) and
interoperability tests clearly are needed and provide important
feedback (or at least sanity checks).  Conformity tests go with
complex standards, standards with many options, and those
checklists.  The difficulty is that, especially if provided by
the standards body or its oversight body, the conformity tests
tend to become, de facto, the standard.  Things that they leave
out (typically because they are hard to characterize with a
simple test) are often operationally important or critical to
interoperability while small details (often of the variety we
traditionally thought the robustness principle would take care
of) are treated as very important.  

Because I'm ranting already, it is probably also worth keeping
in mind that there is an inherent tension between standards and
one of the original motivations for Open Source software, i.e.,
that progress comes from each of us being able to modify systems
to reflect our ideas and design tastes rather than having
someone else tell us what we need and how it should be dispensed
to us.   That usually works very well for end systems and most
applications running on them but can be the enemy of
interoperability and substitution of components, which have
traditionally been the main motivations for standards (of all
types, including whether a bolt one obtains from one source fits
a nut of the same description obtained from another).

best,
    john



