Re: RANT: posting IDs more often -- more is better -- why are we so shy?

Hi All,

This thread started with a comment that github revisions living in between ID versions did not have plain text representations - hopefully that specific misconception has been put to bed by the counterexample of reasonable tooling mentioned upthread. But the thread has also unearthed a spirited defense of the old way against a new way. So let me espouse the virtues of the new way using John's outline of the paradigms of participation in a wg.
 
(1) Very active participation, including tracking and
understanding all changes, more or less in real time.

 
What tends to happen in this scenario in a github world is that a lot of the native wg email moves into github-issues email (you can still subscribe and reply to these directly from your mailer, or via any number of alternative github interfaces) - with much more disciplined threading than happens traditionally. A busy document benefits greatly from the tagging these issues can carry - an extra level of metadata and state tracking that email alone can't give you. Multiple suggestions can be attached to an issue as pull requests, and each can be discussed independently and in context - again, all of that discussion can be gatewayed to email.

I promise you will still send and receive too much email - I have 18 github-gatewayed comments re QUIC in my inbox from the last 24 hours, each one very specific and substantive. It's just a better structured wg discussion in my inbox. The unstructured list persists for topics that aren't ready for a github issue yet. TBH, if I could only read one of the two, it would be the github issue gateway - the signal to noise ratio is better.

Have a look at https://github.com/quicwg/base-drafts/issues?utf8=%E2%9C%93&q=is%3Aissue .. what I see there is essentially a very well threaded mailing list archive plus some useful metadata (classifications, open vs. resolved, tags for stuff that needs more discussion, issues that are basically decided but where an editor needs to propose some text, etc..). And you can still consume and contribute with a mailer if you are happy with the old way.
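As an illustration of the kind of filters I mean (the label names here are hypothetical - each wg picks its own), the search box on that page takes queries like:

    is:issue is:open label:design        # open design-level questions
    is:issue is:closed label:editorial   # editorial points already resolved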
 
(2) Intermittent review of snapshots, ones that are generally
believed by their editors to be coherent and self-consistent.
Github revision tracking often makes that approach very
burdensome; change logs in documents and diffs between versions
are often more helpful.

This seems to suggest that github-based drafts go from -00 to WGLC with interim changes happening only on github. That's not the way it works in my experience - to me it feels like the same number of drafts gets produced (some data here would be interesting.. http/2 had around 19 revisions at the end iirc even though we used github extensively, and I believe tls 1.3 is on draft 19 or 20 and it is also github-centric), and they continue to have changelogs. Git pushes that are not also uploaded to the datatracker are simply a better (more visible) version of the locally saved drafts editors have always kept in between updates. Periodically, new datatracker releases are made to garner wider visibility, to test interop, and as a signal that things are, as you say, internally consistent.
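A minimal sketch of that checkpoint step, assuming a purely illustrative tag name - an editor might mark the commit that matched a datatracker upload like so:

    git tag -a draft-ietf-quic-transport-04 -m "as uploaded to the datatracker"
    git push origin draft-ietf-quic-transport-04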

Reviewing these new datatracker revs can be done the same as always (diff between -N and -N+1, changelogs, etc..), but it can also be so much more.. you can git blame your way back to a changeset, and probably to an issue number via the commit message.. from there you can read the discussion that went into the change if you weren't able to tune in in real time. If you're not cool with the way things stand, you just open a new issue (or send an email to the list and the editor or chair will open one).
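A minimal sketch of what I mean, with a hypothetical file name, line range, and issue number:

    git blame -L 120,130 draft-transport.md   # which commit last touched this text?
    git log -1 <that-commit-sha>              # its message typically cites the issue, e.g. "fixes #312"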

If you want to review the span between two non-datatracker revisions (e.g. you normally track things in real time but you were out of the office for a week), you can use the same tools, just with git revisions. Version control software is pretty good at that :)
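Something like this, for example (the revision names and file name are placeholders):

    git log --oneline <rev-you-last-read>..HEAD               # the commit summaries you missed
    git diff <rev-you-last-read>..HEAD -- draft-transport.md  # the full text changes since then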


(3) Simply deciding it is all or nothing and waiting until IETF
Last Call.

Only the first really makes effective use of the github style of
doing things.


I disagree with this conclusion - the accumulated repo history is most valuable near the end of the process. We finally have something more than a messy email database as a record of how the document got to where it is.

This is especially powerful during the last calls, when issues from some time ago can be re-raised (hopefully with new information!).. rather than everyone spelunking through the archives to remember how it ended up like that (or simply retorting 'read the archive'), there is some structure to the history of the document - the git blame output and the closed issue threads intersect powerfully for understanding how consensus was reached.
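For instance, if someone re-raises an old design point, the commits that resolved it are one grep away (the issue number here is hypothetical):

    git log --oneline --grep='#312'   # every commit whose message referenced the issue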

There is also the added bonus of having to close all the open issues before a document can be published. Stuff doesn't get lost in the shuffle (though it may be overtaken by other changes and closed without action - but at least that is done with deliberate review). It's better project management.

I freely admit I am describing practices that are still emerging, and I hope ietf-and-github@xxxxxxxx helps codify some of this. This system could certainly be run poorly - but that's not a reason to avoid well-run instances of it. The old email-and-uploaded-XML-snapshot approach no longer matches the Internet ecosystem that we seek to serve, and for good reasons.
