Re: "why I quit writing internet standards"

> I'm surprised that no one has sent this out yet:
> http://gigaom.com/2014/04/12/why-i-quit-writing-internet-standards/
> 
> "Summary: After contributing to standards organizations for more
> than seven years, engineer Vidya Narayanan decided it was time to
> move on. Although she still believes that these organizations make
> the Internet a better place, she wonders about the pace of change
> versus the pace of organizations."

> "while the pace at which standards are written hasn't changed in
> many years, the pace at which the real world adopts software has
> become orders of magnitude faster."

This is key, in a way.  But the pace at which software is adopted
hasn't changed.  Nor has the pace at which the real world adopts
*interoperable protocols* changed -- and that has always been much
slower.  What has happened is that a lot of the focus (and money and
employment) has moved from adopting new protocols to adopting new
software -- software that is under no interoperability constraints.

A simpler comparison is between the compilers for two languages:
Fortran and Perl.  There have been many, many Fortran compilers
written, and in every era there has been an effort to standardize the
language that the compilers accept, so that a program can be compiled
using more than one compiler -- in effect, to ensure that the
compilers interoperate.

Now look at Perl.  There has been only one compiler in all of Perl's
history, so any Perl program that works on one computer works on
every computer that runs Perl -- because it's always run through the
same compiler.

This, I think, is a side effect of open source: open source
eliminates the financial pressure on computer vendors to write their
own compilers for a language.

A similar thing has been happening in other fields.  An internal
feature can be added to Apache and distributed to the world in
months.  But adding a feature to HTTP can take years, because it has
to be designed and implemented to be interoperable -- in this case,
between the HTTP servers and the HTTP clients.
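
To make that constraint concrete, here is a minimal sketch (my own
illustration, not something from the article) of how a new HTTP
feature typically has to be rolled out: the client advertises a
hypothetical extension header -- "X-Example-Feature" is an invented
name -- and only relies on the new behaviour if the server signals
support, falling back to plain HTTP otherwise.  Only the Python
standard library is used:

    import http.client

    EXT = "X-Example-Feature"   # hypothetical extension header

    conn = http.client.HTTPSConnection("www.example.com")
    # A server that has never heard of this header simply ignores it,
    # which is what lets old and new implementations coexist.
    conn.request("GET", "/", headers={EXT: "1"})
    resp = conn.getresponse()
    body = resp.read()

    if resp.getheader(EXT):
        print("server supports the extension; use the new behaviour")
    else:
        print("server ignored the extension; fall back to plain HTTP")

Neither end can ship the feature unilaterally: until the servers and
the clients agree on what the header means, the only safe path is the
fallback -- and pinning down that shared meaning is exactly what the
standards process is for.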

In the current Internet industry, a lot of the new business is at the
application layer, and it is composed of "walled garden" web sites,
which don't have to interoperate with anything else.  In the more
elaborate cases, the user's web browser effectively downloads the
client software at the moment the user accesses the server.  There
are no true interoperability constraints.

In situations where you are going to need "two independently-developed
interoperable implementations", a standards process is still
necessary.  But there are a lot of high-profile situations where it
isn't.

In regard to speeding up the standards process, it's clear that in
most cases the process could, in principle, be carried out much
faster.  In all the cases I've observed, the major problem is that
the participants can only spend a small fraction of their working
hours on the standards discussion.  I assume this is because it's
rare that completing the standard gates the shipping of their
employer's product.

Dale




