> This has been suggested before, and of course you can do it by hand, but
> bandwidth is bandwidth and once the headers have been created/compressed
> all you're really saving is the per-transfer overhead, not the bandwidth
> per se. with keep-alive and http you're not gaining anything, really.
>
> > Another question, would it not be better to just download the
> > headers for the packages installed on a PC that one would be doing an
> > update on?
>
> This won't work. yum checks across ALL packages for dependencies and
> conflicts. This is fairly complex, as a package may need another
> package or be needed BY another package, all recursively, until all
> dependencies are resolved. So it isn't possible to predict ahead of
> time which package headers are needed. One reason that yum functions so
> fast is BECAUSE it has a local copy of all of the headers.

umm - yum only downloads the package headers that are either:

1. not already in the cache, or
2. not already an installed package.

it never downloads what it already has.

-sv
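To make the two points above concrete, here is a minimal sketch (hypothetical names, not yum's actual code): dependency resolution walks requirements recursively, so any package's header might turn out to be needed, while the download step fetches only headers that are neither cached nor attached to an already-installed package.

```python
def dependency_closure(pkg, requires, seen=None):
    """Recursively collect pkg plus everything it requires.

    requires -- mapping of package name -> list of required package names
    """
    if seen is None:
        seen = set()
    if pkg in seen:
        return seen
    seen.add(pkg)
    for dep in requires.get(pkg, []):
        dependency_closure(dep, requires, seen)
    return seen


def headers_to_download(needed, cached, installed):
    """Headers to fetch: needed, but neither cached nor installed."""
    return sorted(needed - cached - installed)


# Toy repository: app -> libfoo -> libc
requires = {"app": ["libfoo"], "libfoo": ["libc"], "libc": []}

# Resolving "app" pulls in the whole chain...
needed = dependency_closure("app", requires)   # {'app', 'libfoo', 'libc'}

# ...but only the header yum doesn't already have is downloaded.
print(headers_to_download(needed, cached={"libc"}, installed={"libfoo"}))
# → ['app']
```

The recursion is why predicting the needed headers ahead of time doesn't work, and the set subtraction is the "never downloads what it already has" behavior described above.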