Re: A proposal for a scientific approach to this question [was Re: I'm struggling with 2219 language again]

Hear. Hear.  Ditto!  Absolutely.  etc.

At 10:12 AM -0500 1/8/13, Donald Eastlake wrote:
Another problem is maintenance. Protocols change. Having to maintain a
formal specification is commonly at least an order of magnitude more
effort than maintaining a prose description. So it doesn't happen and
they very rapidly get out of synch in any living protocol. As an
example, the IEEE 802.11 (Wi-Fi) standard used to have a normative
formal description of the MAC operation (see Annex C of 802.11-1999).
By 802.11-2007 this was out of synch but was still included as
informational material on the theory it might be of some use and the
section began with the words "This clause is no longer maintained...".
Although still present as informational in 802.11-2012, the first
words of that section are now "This annex is obsolete."

And, as has been mentioned before, I'd like to emphasize that the IETF
experience and principle is that, if you want interoperation,
compliance testing is useless. The way to interoperation is
interoperation testing between implementations and, to a first
approximation, the more and the earlier you do interoperation testing,
the better.

Thanks,
Donald
=============================
 Donald E. Eastlake 3rd   +1-508-333-2270 (cell)
 155 Beaver Street, Milford, MA 01757 USA
 d3e3e3@xxxxxxxxx


On Tue, Jan 8, 2013 at 9:45 AM, John Day <jeanjour@xxxxxxxxxxx> wrote:
 The reasons have been discussed or at least alluded to previously in this
 thread.  The short answer is we have been there and done that: 30 years ago.

 All those tools were developed and used successfully in the 80s.   I know of
 cases where doing the formal specification alongside the design phase caught
 lots of errors.   However, there are two central problems:  First, in
 general, programmers are lazy and just want to code. ;-)  Using a formal
 method is a lot more work.  Second, the complexity of the formal statements
 that must be written down is greater than the code.  So there is a higher
 probability of mistakes in the formal description than in the code.
 Admittedly, if those statements are made, one has a far greater
 understanding of what one is doing.

 Once you have both, there is still the problem that if a bug or ambiguity
 shows up, neither the code nor the formal spec nor a prose spec can be taken
 to be the right one.  What is right is still in the heads of the authors.  All
 of these are merely approximations.  One has to go back and look at all of
 them and determine what the right answer is.  Of course, the more things one
 has to look at, the better.  (For a bit more, see Chapter 1 of PNA.)
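 [Editor's sketch, not part of the original email: the point that a formal
 description enumerates more than the equivalent code can be illustrated
 with a toy stop-and-wait sender -- a hypothetical example, not any
 protocol discussed in this thread. The same behavior is written twice,
 once as ordinary code and once as an explicit transition table of the
 kind a formal spec would spell out, then cross-checked exhaustively.]

```python
# Toy illustration: a stop-and-wait ARQ sender, described two ways.

# Plain code: the whole behavior in a few lines.
def sender_step(state, event):
    """state: (mode, seq) with mode in {'idle', 'wait'}, seq in {0, 1};
    event: 'send' or ('ack', seq)."""
    mode, seq = state
    if mode == 'idle' and event == 'send':
        return ('wait', seq)          # transmit frame seq, await its ack
    if mode == 'wait' and event == ('ack', seq):
        return ('idle', 1 - seq)      # matching ack: flip the sequence bit
    return state                      # anything else: no state change

# Spec style: every (state, event) -> next-state triple written out.
# Even for this toy protocol the table is bulkier than the code above,
# which is the maintenance burden the email describes.
TRANSITIONS = {
    (('idle', 0), 'send'): ('wait', 0),
    (('idle', 1), 'send'): ('wait', 1),
    (('wait', 0), ('ack', 0)): ('idle', 1),
    (('wait', 1), ('ack', 1)): ('idle', 0),
}

def spec_step(state, event):
    # Events absent from the table leave the state unchanged.
    return TRANSITIONS.get((state, event), state)

# Exhaustive cross-check over the whole (small) state space: the two
# descriptions agree on every pair -- the conformance question a
# verifier would ask, answerable here only because the space is tiny.
states = [('idle', 0), ('idle', 1), ('wait', 0), ('wait', 1)]
events = ['send', ('ack', 0), ('ack', 1)]
assert all(sender_step(s, e) == spec_step(s, e)
           for s in states for e in events)
```

 [For a real protocol the table grows combinatorially with state
 variables and message types, while the code version often stays
 compact -- hence the divergence the thread describes.]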

 John


 Which raises the obvious question:  Why do we not write protocol specs
 in a formal specification language instead of struggling with the
 ambiguities of natural language?

 Theorem provers and automated verification tools could then be brought
 to bear on both specifications and implementations.



 Dick
 --




