It probably also helped, back in the day, that so many people working on
protocols were doing so under grant funding.
Miles
Simon Pietro Romano wrote:
Hi,
I read the post from Vidya and I have to say I totally agree with her.
As to running code in particular, my impression is that if you are used to
implementing prototypes of ongoing standards, and you are neither a big
company nor a member of the IETF elite of gurus, the best you can hope for
is an Informational "call flows" RFC. Standards Track work is left to those
who (seem to) do the high-level specification work.
This happens because people seem to rush to edit documents as soon as a new
WG is chartered, but then progressively reduce their efforts once that WG
starts to lose momentum and is no longer the latest fashion.
Finally, coming to interoperability, it is hard to work on it if your
implementation is the only one available. In the long run, I admit, you get
tired of doing all that hard work and investing so many cycles in a highly
underestimated engineering activity.
My two cents,
Simon
On 14/apr/2014, at 18:34, Michael Richardson wrote:
George, Wes <wesley.george@xxxxxxxxxxx> wrote:
- We don’t have nearly enough focus on running code as the thing that helps
to ensure that we’re using our limited cycles on getting the right things
out expediently, and either getting the design right the first time, or
failing quickly and iterating to improve.
The solution here may be that we need to be much more aggressive about
expecting any Standards Track document to have running code much earlier in
the process.
For instance, had the DMARC proponents and/or Yahoo spent some time making
sure that there was some running code for mailing-list use, life would be
better.
I'm not entirely clear how it was that we produced/funded (more) running
code in the 1990s. Maybe this is a false idea; it could be that there was
less code then than there is now. I will posit several factors:
1) there was less work occurring, and perhaps over a longer time period
(where time is subject to perception as well as reality), such that code
became mature sooner in the specification process, and/or there were simply
more volunteers willing to produce it.
2) many companies were much smaller, and it was easier to get line managers
to see why they would want to be directly involved in, or even lead, efforts.
3) it wasn't so much the dotcom boom making money available via VCs, but
rather the (ultimately unsustainable) quarter-over-quarter revenue doubling
that made resources available for prototypes.
4) there were some clear institutions (MIT, CMU, Berkeley, LBNL, UW) where
good reference implementations were developed by students, faculty, and
staff. And don't forget WIDE and USAGI!!!
When I founded Xelerance, it was with the idea that multiple large
organizations were shipping IPsec code on Linux, and would rather pay a
company a maintenance fee than attempt to manage the process internally.
We got some work funded, but we never got enough funding to get ahead of
the standardization process and write code while an ID was still young.
Overall, that effort failed.
--
Michael Richardson <mcr+IETF@xxxxxxxxxxxx>, Sandelman Software Works
-= IPv6 IoT consulting =-
Simon Pietro Romano
Universita' di Napoli Federico II
Computer Engineering Department
Phone: +39 081 7683823 -- Fax: +39 081 7683816
e-mail: spromano@xxxxxxxx
<<Many people tell me that discouragement is the alibi of idiots.
I think it over for a moment; and I get discouraged.>> Magritte.
--
In theory, there is no difference between theory and practice.
In practice, there is. .... Yogi Berra