On Sat, 13 Mar 2004, Xose Vazquez Perez wrote:

>> .. and what is the problem with not having the absolute bleeding edge
>> every day of the week ? Updating packages costs time, and not all
>> changes in new upstream releases warrant spending that time immediately,
>> nor does it always warrant invalidating the testing on the currently
>> stable package.
>
> This is not taroon ml ;-).

Developer time is developer time though, whether it is spent developing
on RHEL or spent developing on Fedora Core.  Updating a package to a new
version does not come at zero developer-time cost.  Depending on the
particular package, the changes it includes, and how those changes might
affect other packages in the distribution or its dependencies, all need
to be investigated closely.  It may also require upgrading other things
that other package owners maintain, which can continue in a domino
effect across the distribution, depending on the package.  That doesn't
happen for every package every time, but it happens often enough that it
has to be taken seriously so we're not wasting our time.

If I spend an hour updating xchat, for example, and then a week from now
a new version comes out and I spend another hour on it, and then I
update it again several more times in the devel cycle, that might be a
total of 8 hours spent working on xchat.  That's one work day out of a
year.  We have to ask ourselves whether that is a good use of Red Hat's
resources or not.  For some packages it is, and the overhead of making
the changes is small.  For other packages it is not, and the overhead is
large - and might not be immediately visible until you start doing the
work - or worse, until after you've updated it and find out the new
version is loaded full of bugs and breaks in ways you might not have
time to fix with your current schedule and priorities.  You then either
ship buggier but newer bits, or downgrade back to the old version and
irritate a lot of people, because yum/up2date/apt don't like package
downgrades.

Another problem, which Arjan touched upon, is that when you upgrade a
package to a new version, you lose the benefit of all of the beta
testing done on that package up to that point.  If it is a library, then
all of the applications that link to it have lost their beta testing as
well, because the library could now be introducing a bug that affects a
large part of the OS.  Again, different packages have different
importance levels and impact in this regard.  As someone pointed out,
updating openssl would probably be bad due to the amount of work
required, and because it is a critical part of the OS for which you want
to maximize beta testing and not take chances.  But updating something
such as procinfo is unlikely to cause major problems.

So you're right, this is not Taroon, and we do ride the edge a little
closer in Fedora Core, but that must be done very responsibly if we want
Fedora Core to also be a reliable and usable OS.  If a large part of the
OS is updated near the end of the cycle and most of the beta testing is
lost, then the release will ship with many major problems, a lot of them
only found after the release goes out the door.  Then nobody will want
to use it, it will get bad press reviews and negative feedback from the
community, and the project suffers.
It is nice to have newer versions of packages, but that has to be
decided on a package-by-package basis, based on the merits of what each
individual package gains from being updated and how much that matters in
the grand scheme of things, balanced against how many man-hours are
needed to do all the work, etc.  It is a gentle balancing game.

That said, I think all of my packages are up to date except for XFree86,
but we know the answer there.  ;o)

--
Mike A. Harris                          ftp://people.redhat.com/mharris
OS Systems Engineer - XFree86 maintainer - Red Hat