> 1. When we start an effort, we do not press for demonstrated community
> need -- but more importantly, demonstrated community interest in /using/
> the output. So the folk who work on a topic tend to have no sense of
> urgency. (Even when there is a claimed sense of urgency, such as for STIR,
> the work often is not pursued in a fashion that matches the claim, with an
> eye towards rapid development and deployment.)

This is certainly true... But I think there is a second reason in this
neighborhood that also relates to this one --

> 2. The folk making IETF approvals feel an unfortunate fear of letting
> flawed specifications through the process, even though the fear does not
> produce obviously superior results. So we impose high barriers to entry
> and high barriers to completion.

We've lost the art of the base spec -- leave other stuff to later. Maybe
I'm just being nostalgic, but I seem to remember a time when we would push
a base protocol with extensibility through the process, and then start
talking about extensions on a case-by-case basis.

Now we seem to see 15-20 drafts proposed in a few months, all with
interlocking bits and pieces, totaling hundreds of pages of text, and
sounding more like a bill being presented before some legislative body
than a technical specification. These large-scale "boil the ocean"
efforts, constructed (apparently) in offline meetings outside the mailing
list and the "normal process," are challenging (to say the least) even to
read, much less to participate in fully.

When someone does try to discuss one of these monstrosities on list, the
reply is either "you're stupid," or "you didn't really read all the
drafts," or some such, shutting the discussion down. Of course no one has
really read the drafts -- they're essentially unreadable, and they
describe a system of massive complexity that few people probably
understand -- even the authors. I'm certain each author understands some
small bit, but the overall system is far too complex to be understood by
anyone who can't dedicate themselves to it full time for several weeks.

This doesn't "improve the speed," as some folks claim -- biting off
smaller chunks would actually be faster, as it would increase community
participation and help us drive simpler specifications that people outside
the IETF could actually read and understand.

If we could get out of the habit of "boiling the ocean," then we could,
possibly, get back to doing simple things quickly, in a serial fashion,
with a lot of participation. What we seem to be doing instead is building
a lot of large-scale, often overlapping systems in parallel, with
specifications so complex and so poorly written that we don't get high
participation rates, which means speed and innovation suffer, and quality
creeps towards atrocious.

At least that's my view of the process at this point. What we have on the
speed front is a culture issue as much as anything else. :-)

Russ