On Mon, Oct 28, 2019 at 1:43 PM Kevin Fenzi <kevin@xxxxxxxxx> wrote:
>
> On Tue, Oct 22, 2019 at 02:09:04PM +0200, jkonecny@xxxxxxxxxx wrote:
> >
> > I guess it will be easier to just think about the branching date when
> > the Flock schedule is being created. However, I'm not familiar with the
> > scheduling, so I'm probably not the right person to answer this.
>
> Perhaps Ben Cotton could chime in here. I think now that we have moved to
> planning the schedules years in advance, it's the Flock schedule that
> moves around a lot based on when facilities are available and other
> things. I guess we just need to take the branching date into account if
> it looks like Flock will be near it?

I can! I had a nice conversation with mboddu last week when I was in
Westford, and the short answer is that there are no easy options. You're
right that Flock will move around some based on facility availability,
cost, etc. Now that the release schedule is more predictable (or at least
more explicitly stated), we can try to accommodate it in the Flock
planning. In the next week or so, I hope to publish a Community Blog post
that lays out a few different options and the impact each would have on
the schedule.

In the meantime, I'm curious about the history here. In my 10-ish years
in the Fedora community prior to taking this job, I never paid much
attention to the branch point. Is this a problem we've had in the past,
or was F31 particularly bad? I know we get failed composes a lot, but my
understanding is that this was a perfect storm.

-- 
Ben Cotton
He / Him / His
Fedora Program Manager
Red Hat
TZ=America/Indiana/Indianapolis