On Sun, Feb 18, 2018 at 11:40:47AM +0100, Fabio Valentini wrote:
> On Sat, Feb 17, 2018 at 11:15 PM, Zbigniew Jędrzejewski-Szmek
> <zbyszek@xxxxxxxxx> wrote:
> > Bodhi currently provides "batched updates" [1] which lump updates of
> > packages that are not marked urgent into a single batch, released once
> > per week. This means that after an update has graduated from testing,
> > it may be delayed up to a week before it becomes available to users.
> >
> > Batching is now the default, but maintainers can push their updates
> > to stable, overriding this default, and make the update available the
> > next day.
> >
> > Batching is liked by some maintainers, but hated by others.
> > Unfortunately, the positive effects of batching are strongly
> > decreased when many packages are not batched. Thus, we should settle
> > on a single policy — either batch as much as possible, or turn
> > batching off. Having the middle ground of some batching is not very
> > effective and still annoys people who don't like batching.

(snip)

> > To summarize the ups (+) and downs (-):
> >
> > + batching reduces the number of times repository metadata is updated.
> >   Each metadata update results in dnf downloading about 20-40 MB,
> >   which is expensive and/or slow for users with low bandwidth.
>
> This savings effect is negligible, because metadata has to be updated
> even if only 1 urgent security update is pushed to stable.

[FTR, it's any urgent update, security or not.]

Yes, but we don't have urgent updates every day. Even if we have them
every other day, that'd still be roughly a 50% reduction in metadata
downloads.

> > + a constant stream of metadata updates also puts strain on our mirrors.
> >
> > + a constant stream of updates feels overwhelming to users, and a
> >   predictable once-per-week batch is perceived as easier. In
> >   particular, corporate users might adapt to this and use it to
> >   schedule an update of all machines at fixed times.
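To make the estimate above concrete, here is a toy calculation (illustrative
only — the number of urgent push days per week and the assumption that some
update goes stable every day without batching are mine, not measured Bodhi
data):

```python
# Toy model of how often repo metadata changes per week (and hence how
# often dnf has to re-download it), with and without batching.
# Assumptions (not measured data): without batching, some update goes
# stable every day; with batching, urgent updates still push immediately
# and everything else lands on one weekly batch day.

DAYS_PER_WEEK = 7

def metadata_refresh_days(urgent_days: int, batched: bool) -> int:
    """Days per week on which the repo metadata changes."""
    if not batched:
        # Assumed: a non-urgent update reaches stable every single day.
        return DAYS_PER_WEEK
    # Urgent pushes still happen immediately, plus the one weekly batch
    # day (which may coincide with an urgent push day, hence the min).
    return min(urgent_days + 1, DAYS_PER_WEEK)

# Urgent updates "every other day" ~ about 3 urgent push days per week:
without = metadata_refresh_days(3, batched=False)       # 7 refresh days
with_batching = metadata_refresh_days(3, batched=True)  # 4 refresh days
print(f"reduction: {1 - with_batching / without:.0%}")  # roughly 43%
```

Even under these rough assumptions the saving stays in the 40-50% range,
consistent with the hand-wavy "~50%" figure, and it only disappears once
urgent updates are pushed nearly every day.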
> I'd rather see a small batch of updates more frequently than a
> large batch that I won't care to read through.

Yes, but I think you are in the minority. I'm pretty sure most users
don't bother reading the descriptions. Of course it's hard to gauge
this, but I'd expect that, with the frequency of updates in Fedora,
only very dedicated admins can look at every package and every
changelog. Most people install the whole set and investigate only if
something goes wrong.

> > + a batch of updates may be tested as one, and, at least in principle,
> >   if users then install this batch as one, the QA that was done on the
> >   batch matches the user systems more closely, compared to QA testing
> >   package updates one by one as they come in, and users updating them
> >   on a slightly different schedule.
>
> Well, is any such testing of the "batched state" being done, and if it
> is, does it influence which packages get pushed to stable?

Sorry, I don't think we have any data on this. Maybe adamw and other
QA people can pitch in?

> > - batching delays updates of packages by between 0 and 7 days after
> >   they have reached karma, and makes it hard for people to immediately
> >   install updates when they graduate from testing.
>
> This delay can be circumvented by maintainers by pushing directly to
> stable instead of batched (thereby rendering the batched state
> obsolete, however).

I meant that it is hard for *end users*. Essentially, end users lose
control of the timing, even though individual maintainers can still
control the timing of their own updates.

> > - some users (or maybe it's just maintainers?) actually prefer a
> >   constant stream of small updates, and find it easier to read
> >   changelogs and pinpoint regressions, etc., a few packages at a time.
>
> I certainly belong to this group.

> > - batching (when done on the "server" side) interferes with clients
> >   applying their own batching policy.
> >   This has two aspects:
> >   * clients might want to pick a different day of the week or an
> >     altogether different schedule,
> >   * clients might want to pick a different policy of updates, e.g. to
> >     allow any updates for specific packages to go through, etc.
> >
> > In particular, gnome-software implements its own style of batching,
> > where it will suggest an update only once per week, unless there are
> > security updates.
>
> Which further delays the distribution of stable updates by up to a
> week (depending on the schedule of gnome-software, I didn't check
> that). That makes a total of up to 3 weeks (!).

> > Unfortunately there isn't much data on the effects of batching.
> > Kevin posted some [2], as did the other Kevin [3] ;), but we certainly
> > could use more detailed stats.
> >
> > One of the positive aspects of batching — the reduction in metadata
> > downloads — might be obsoleted by improving download efficiency
> > through delta downloads. A proof-of-concept has been implemented [4].
>
> A simpler approach might be to just flush all batched updates to
> stable if there is at least one update (possibly an urgent security
> update) anyway. That way, the metadata doesn't have to be downloaded
> for just one update, and all packages reach stable sooner.

There are two aspects:
* metadata update frequency — and here you are right: pushing
  everything out whenever at least one package is scheduled reduces
  the delays without causing any additional metadata costs;
* "predictable batches" — if we care about this, keeping non-urgent
  updates for the scheduled batch still makes sense.

[snip]

> -> I'm in favor of dropping the "batched" thing as it is currently
> implemented.

Noted.

Zbyszek
_______________________________________________
devel mailing list -- devel@xxxxxxxxxxxxxxxxxxxxxxx
To unsubscribe send an email to devel-leave@xxxxxxxxxxxxxxxxxxxxxxx