Re: Migrate away from vger to GitHub or (on-premise) GitLab?

On Sun, Feb 04, 2024 at 04:47:14PM +0100, Michal Suchánek wrote:
> On Sun, Feb 04, 2024 at 04:12:02PM +0100, Oswald Buddenhagen wrote:
> > but after working with gerrit code review for over a decade, i find
> > it mind-boggling that people are still voluntarily subjecting
> > themselves to mail-based reviews for serious high-volume work.

> I have yet to see gerrit in action. Very few projects use it, so it's
> difficult to gauge what tradeoffs it provides compared to an e-mail
> based workflow.

from my just slightly biased perspective ;-) i can't see any significant
trade-offs except for some set-up cost (that will quickly pay for
itself).

in fact, my gerrit workflow is still "e-mail based", in that everything
is driven by the notification mails, only that i "branch out" to the
browser whenever something interesting happens.

for the CLI hardliners there are gertty and emacs egerrit, but i see no
point in using either despite being a heavy CLI and TUI user. given how
"much" attention these tools get despite there being literally tens of
thousands of regular gerrit users, i'm inclined to think that there is
indeed very little demand.

gerrit also has an incoming email gateway, but i'm not sure how advanced
it is - at some point it required well-formed html replies as input.

if one really wants to, one can install a webhook or event stream
watcher that posts all activity to a mailing list (and i don't mean
_the_ list, because it would be just noise on top of everyone's
individual notifications).
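such a watcher can be sketched in a few lines of shell. the host name and list address below are placeholders, and a real bridge would use jq for the JSON handling rather than sed; this is just the shape of the thing, not a tested deployment:

```shell
# sketch of an event-stream-to-mailing-list bridge. "review.example.org"
# and the list address are placeholders; gerrit exposes "gerrit
# stream-events" over its ssh port (29418 by default) to accounts with
# the Stream Events capability. each output line is one JSON event.

# crude subject line from the event's "type" field (a real bridge would
# use jq here):
event_subject() {
    printf '%s\n' "$1" | sed -n 's/.*"type" *: *"\([^"]*\)".*/gerrit: \1/p'
}

ssh -o BatchMode=yes -o ConnectTimeout=5 -p 29418 review.example.org \
        gerrit stream-events |
while IFS= read -r event; do
    printf '%s\n' "$event" |
        mail -s "$(event_subject "$event")" gerrit-events@example.org
done
```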

migrating the workflows that are worth keeping isn't such a big deal.

> Have you migrated them to gerrit already, and taught all the git
> contributors how to use them from gerrit?

that challenge is sort of meaningless, because the only workflow within
the git project that i'm aware of that would affect "all the git
contributors" is the interaction with gerrit itself. which has a very
steep, but also extremely short learning curve. and there are tools to
make it more pleasant - https://wiki.qt.io/Git-gpush-scripts (yep,
shameless self-promotion here).
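for a sense of how short that curve is: the entire contributor-facing surface is one push convention. the sketch below simulates it against a plain bare repository standing in for the gerrit server (real gerrit intercepts pushes to refs/for/&lt;branch&gt; and turns them into reviews; a plain git server just stores the ref, which is enough to show the mechanics):

```shell
# the whole gerrit contribution workflow, mechanically: commit locally,
# then push to the magic ref refs/for/<target-branch>. a plain bare
# repository stands in for the server here.
set -e
srv=$(mktemp -d)
git init -q --bare "$srv/fake-gerrit.git"
wrk=$(mktemp -d)
git init -q "$wrk/work"
cd "$wrk/work"
git config user.email contributor@example.org
git config user.name "A Contributor"
git remote add origin "$srv/fake-gerrit.git"
echo change > file.txt
git add file.txt
git commit -q -m "my first change"
git push -q origin HEAD:refs/for/master  # on real gerrit: opens a review targeting master
```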

i'm not aware of any pre-integration build bots, so nothing to do on
that front except for some mirroring adjustments (gerrit insists on
being its own authoritative git server).

gitgitgadget would just become obsolete, to be replaced by the github
integration plugin.

my main concern is with the maintainer workflow:

the way gerrit is usually used, the contributor determines the target
branch, and the changes are merged directly to it after they are
approved. unclean merges must also be eliminated during review. that
works just fine, but it doesn't match the refs and merge commit messages
junio produces. and while aggregating pending changes into `seen` would
be still perfectly possible (each change including its dependencies is
just a ref), it would be somewhat awkward due to the naming and location
of the change refs.

to reproduce the existing merge workflow more faithfully,
- junio would have to monitor incoming changes, manually create an empty
  branch for each topic, and change the target branch of all changes
  in each topic
- the gerrit-side integration would then happen into that branch
- junio would then proceed with manually merging the branch and
  direct-pushing (that is, not creating a review for it) the merge into
  next or maint
this is really just the current workflow, and can be equally automated,
just with slightly different tooling. only it's ... weird for gerrit,
artificially creating a bottleneck. the gerrit integration workflow is
naturally decentralized.
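the three steps above can be simulated against a local bare repository standing in for gerrit; on a real instance, step 1 would additionally retarget the pending changes via gerrit's Move Change REST endpoint, and step 2 would be gerrit itself submitting approved changes. all branch and topic names here are made up:

```shell
# simulate the three-step maintainer workflow against a local bare repo.
set -e
srv=$(mktemp -d)
git init -q --bare "$srv/gerrit.git"
wrk=$(mktemp -d)
git init -q "$wrk/maint"
cd "$wrk/maint"
git config user.email maintainer@example.org
git config user.name Maintainer
git remote add origin "$srv/gerrit.git"
echo base > file.txt; git add file.txt; git commit -q -m "initial"
git push -q origin HEAD:refs/heads/master

# step 1: create the (initially empty) topic branch for the incoming topic
git push -q origin origin/master:refs/heads/ob/fancy-feature

# step 2: gerrit-side integration -- stand-in for a change submitted
# into the topic branch after review
git checkout -q -b ob/fancy-feature origin/master
echo feature > feature.txt; git add feature.txt
git commit -q -m "add fancy feature"
git push -q origin HEAD:refs/heads/ob/fancy-feature

# step 3: merge the integrated topic and direct-push the merge into
# next (refs/heads/*, i.e. bypassing review)
git checkout -q -B next origin/master
git merge -q --no-ff -m "Merge branch 'ob/fancy-feature' into next" ob/fancy-feature
git push -q origin HEAD:refs/heads/next
```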

personally, i would just switch to the usual gerrit workflow, and at
least for `seen` use a merge-free workflow -- with gpick (gpush
complement, see link above) it's absolutely trivial to track all
interesting branches stacked onto each other.

> Somebody has to do it.

yes. and if nobody does, then everybody keeps paying the cost of not
doing it. that might incentivize Somebody (TM) with the resources and a
vested interest.

> Also, can you migrate away from gerrit once it becomes defunct or a
> new, better alternative emerges?
>
> Recently it seems that forges offer a 'download your project data'
> option, probably as a result of GDPR. What use is such a data blob,
> though?

current gerrit keeps the metadata in (yet more) awkwardly named refs
containing plain-text files, so that's no issue. one could render it
into a read-only view, or convert it.
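the mechanics can be illustrated with stock git, since gerrit's NoteDb keeps each change's review metadata (comments, votes, patchset history) as commits on an ordinary ref, e.g. refs/changes/45/12345/meta for change 12345. in this sketch a local bare repository stands in for gerrit, and the change number and commit content are made up:

```shell
# NoteDb metadata lives in plain git refs, so stock git can fetch and
# read it; a local bare repo plays gerrit here.
set -e
srv=$(mktemp -d)
git init -q --bare "$srv/gerrit.git"
wrk=$(mktemp -d)
git init -q "$wrk/reader"
cd "$wrk/reader"
git config user.email reviewer@example.org
git config user.name Reviewer
git remote add origin "$srv/gerrit.git"
echo 'Code-Review+2' > votes.txt
git add votes.txt
git commit -q -m "Update patch set 1"
git push -q origin HEAD:refs/changes/45/12345/meta
# any git client can now pull the review history as ordinary commits:
git fetch -q origin refs/changes/45/12345/meta
git log -1 --format=%s FETCH_HEAD
```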

> An e-mail archive is just that: an archive. It's a medium that you can
> read with a wealth of software today, and 100 years from now. An
> archivable data format.

with a mailing list archiving the event stream, we'd have that.

> Compare that with the 'download your data' blob from a forge. Can it
> even be uploaded to a different instance of the same forge to restore
> your project elsewhere? Interpreted by any tool other than the correct
> vintage of that same forge? Does even more than one instance of the
> forge exist?

a review meta-data standard is being discussed from time to time, but it
hasn't gone anywhere yet.

but looking at it from a practical perspective, with a list-based
archive the situation wouldn't be any worse than it is right now if
gerrit were to suddenly disappear.

during my tenure at the qt project i established a commit policy (and
deployed tooling to help enforce it) that presumes that gerrit could be
replaced at any time, so commit messages are not supposed to refer to
other commits by gerrit change ids or review urls. (in principle, this
works even for pending and abandoned changes, as each patchset (revision
of a change) keeps its commit and therefore sha1 forever.)
of course that's not practical for regular mailing list posts, but one
can't have everything ...

> And even if you do convert to gerrit, it's unlikely to satisfy the
> "Why are you not using github or gitlab" crowd. It's not one of the
> big, popular forges they are familiar with, and the UX is
> significantly different.

i can confirm that in the qt project this is indeed absolutely the case,
and the last related thread isn't even cold yet.
but why should the people with standards care? lowering the barrier to
entry is all dandy, but not when it causes significant detriment to the
workflow of those who do most work.
note that the original request in this thread was for "structured
discussion", and gerrit would absolutely provide that, among other
things.
