On 11 April 2018 at 10:02, Nico Kadel-Garcia <nkadel@xxxxxxxxx> wrote:
> On Wed, Apr 11, 2018 at 4:43 AM, Alexander Bokovoy <abokovoy@xxxxxxxxxx> wrote:
>
>> I'm not in Ansible engineering or product management, so take this with a
>> grain of salt. My understanding is that the cadence of Ansible releases and
>> its aggressiveness in API changes make it a bit less suitable for a
>> traditional RHEL 7 release cadence. A separate product channel allows
>> them to update packages at their own cadence.
>>
>> I wonder how re-packaging for CentOS targets could happen with this
>> approach; probably moving it back to EPEL7 is indeed something that
>> makes more sense.
>
> Wouldn't a separate RHEL channel for a separate product, such as
> ansible, mean a separate channel for CentOS to avoid precisely this
> confusion? Mixing it into EPEL and having it on a separate RHEL
> channel would be *bad* for anyone who activates that separate channel.
> They'd have to filter it out of EPEL to ensure that the streams don't
> get crossed on any updates from Red Hat. I understand that this is one
> of the main reasons EPEL never carries packages that overlap with
> RHEL-published software.

It is a lot more nuanced than that. EPEL builds packages that do not
overlap with the following channels:

  rhel-7-server-extras-rpms/
  rhel-7-server-optional-rpms/
  rhel-7-server-rpms/
  rhel-ha-for-rhel-7-server-rpms/
  rhel-server-rhscl-7-rpms/

These were chosen because they were the original base set; other channels
that may be available can contain items which conflict with each other.
This means that EPEL can conflict with some things inside of "RHEL", but
so can things that are already in "RHEL".

--
Stephen J Smoogen.
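[Editor's illustration, not part of the thread: Nico's point about filtering
comes down to an exclude= line in the EPEL repo definition for anyone who does
enable the separate Ansible channel. Purely as a sketch, one way to list the
overlapping package names with dnf repoquery is shown below; the repo IDs
"epel" and "rhel-7-server-extras-rpms" are assumptions and should match
whatever `dnf repolist` reports on your system (on a stock EL7 box the
yum-utils repoquery command plays the same role).]

  #!/usr/bin/env python3
  # Sketch only: list package names that appear in both EPEL and a RHEL
  # channel, so they can be placed on an exclude= line in the EPEL .repo file.
  # The repo IDs below are assumptions -- use whatever `dnf repolist` shows.
  import subprocess

  def package_names(repo_id):
      """Return the set of package names available from a single repository."""
      result = subprocess.run(
          ["dnf", "repoquery", "--quiet", "--repo", repo_id,
           "--queryformat", "%{name}"],
          check=True, capture_output=True, text=True)
      return {line.strip() for line in result.stdout.splitlines() if line.strip()}

  if __name__ == "__main__":
      epel = package_names("epel")
      extras = package_names("rhel-7-server-extras-rpms")  # hypothetical repo id
      for name in sorted(epel & extras):
          print(name)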