>Phill,
>>As a result the IETF is a standards body with 2000 active
>>participants that produces on average less than 3 standards a year
>>and typically takes ten years to produce even a specification.
>
>It is well understood that the Internet mainly runs on Proposed Standards,
>so the appropriate metric is how many Proposed Standards the IETF produces
>a year. I think you will find that is more than three.
For humans, what counts is not reality but their common vision of
reality and of the way to proceed. The Internet has dramatically
amplified this, to the point that we have accepted it as a virtual
and global world, i.e. a conceptual and geographical coverage
equivalent to reality. The IETF is therefore at the core of this,
having to engineer, in reality, the support of what people are to
believe to be their _unique_ virtuality. What Phill actually talks
about is a TTV (time to virtuality): the time between a 00.txt and
its concept being globally accepted as what most perceive as "their
reality".
There are several key factors which give the feeling that the IETF
process is slow and its output sparse.
- deployment. The fact that people are not yet aware of things does
not mean they have not been settled (IESG-approved, appealed,
published, registered with the IANA).
- lack of a clear message. The RFC system is not accompanied by a
network ontology that RFCs would update. There is therefore no
description of the virtuality the IETF develops and that the world is
to believe in.
- validation of the final product by the market, instead of a
concerted effort with all the concerned parties. The users know that
some RFCs will abort. The hysteresis is very long.
- reality is diverse, so the virtuality must be diverse to a higher
power. Yet the IETF virtuality is not. RFC 3935 says the IETF wants
to influence THE way people design, use, and manage the Internet.
This mono-Internet vision is in opposition to the diversity of
reality. Hence the NATs, the opposition to the single outdated IPv6
numbering plan, the tensions created by the single root, and the
globalisation divide built by the way RFC 4646 is disregarded and
therefore not interoperable.
- the increasing number of involved users and entities, their
relational density, and their resulting capacity for
self-standardization.
There are probably others. But these already show that the IETF is
outdated, not when compared with the rest of the world (it would then
be ahead), but only when compared with its own purpose and the system
it produces. IMHO this comes from its decision method (rough
consensus). That method is a major step _ahead_ of "democratic"
votes, but there is still a long cultural way to go to reach the
adequate "concerted consensus" necessary to the subsidiarity of our
networked technical, societal, industrial, and political diversified
world.
A "concerted consensus" means that all the concerned parties are part
of it, in their specific capacities and interests, and that the
consensus is not over a single solution but over the outcome. The
outcome must address all the involved positions through the
interoperable network of all the solutions each of the concerned
parties will adopt. This is more complex, but this is the way we
live, in intergovernance. Rather than fighting NATs, we should have a
comprehensive architectural doctrine integrating them. The same for
the name spaces. The same for the addressing plans. My own
difficulties in getting an interoperable RFC 4646 adopted, now
enforced, and later on protected, show that this will still take some
time to stabilise.
jfc
_______________________________________________
Ietf@xxxxxxxx
https://www1.ietf.org/mailman/listinfo/ietf