Bernard:
Many of the IESG activities are listed in John's appeal. The DISCUSS
Criteria document is probably the biggest step that was taken. ADs
routinely challenge each other to stay within those guidelines.
At the IESG Retreat we had a discussion on this topic. It is very
hard to measure. During the discussion, we quickly discovered that
there are a number of DISCUSS positions related to some comment that
was raised in Last Call but not addressed in any way. One cannot
call these "late surprises", and most of them are resolved very quickly.
We have a way to count DISCUSS positions, but we do not have a way to
figure out what percentage of them are perceived as "late surprises"
by the community. So, while we are taking action in an attempt to
make things better, we do not have a way to measure our success or
failure beyond community perception. Suggestions on making this more
objective and less subjective are greatly appreciated.
Russ
At 10:41 PM 6/23/2008, Bernard Aboba wrote:
Russ Housley said:
"I agree with this principle. In fact, I think that the IESG has
taken many steps over the last four or more years to reduce the
nearly-end-of-process surprises. Obviously, you do not think these
measures have been sufficient. One lesson from the many attempts to
make updates to RFC 2026 is that such policy documents need to set
expectations without taking away flexibility and judgement."
Can you elaborate on what steps the IESG has taken to reduce the
"nearly-end-of-process surprises" and why effect this has had, if
any? For example, have the delays resulting from IESG reviews
actually *decreased* as a result?
The research by Prof. Simcoe of the Rotman School is not encouraging:
http://www.rotman.utoronto.ca/strategy/research/working%20papers/Simcoe%20-%20Delays.pdf