I'm getting yum going and so far it's keeping a hundred or so machines
updated without any problems at all. I even have a nice setup where I can
stage updates to test machines before I roll them out to everyone. But I
have a few questions:

---

I've seen folks structure their stock RPM repositories as "base" and
"updates". Is there some special reason to do it this way instead of
keeping "base" updated constantly? I install new machines from the same
repository, and I certainly want them to get the proper packages at
install time rather than having to install them and then update them.

---

What's the best way to maintain a set of "default packages" that can
change over time? Occasionally I need to add a package to the default
set, and when I do this I need all of the machines on the network to
install it automatically. I know I can set up yumgroups.xml and do a
"yum groupinstall blah", but that just installs what's in that group at
that time. If I add a package to the group, I don't think the regular
nightly update will grab it.

So far I see two options: modify /etc/cron.daily/yum to do the
groupinstall every night, or make a package that has nothing but
dependencies and install it. Then when I add a package, I can update
this RPM and the nightly update will take care of everything during
dependency resolution. Is there a simpler way?

---

Has anyone tried keeping laptops updated? The problem I see is that they
won't always be on the network, but I don't know how badly things will
blow up when that happens. I also can't do IP restrictions on the
server, and I'm not sure if I can do any kind of secure or authenticated
HTTP to get the packages.

Thanks for yum and any info you might provide,

-- 
Jason L Tibbitts III - tibbs@xxxxxxxxxxx - 713/743-3486 - 660PGH - 94 PC800
   System Manager:  University of Houston Department of Mathematics
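
[For reference, the dependency-only package mentioned above can be sketched
as a minimal spec file. The package name and the Requires: list here are
made-up examples, not anything from this mail.]

```spec
# Hypothetical meta-package: installs no files of its own, only
# pulls in other packages via dependencies.
Name:      site-defaults
Version:   1.0
Release:   1
Summary:   Dependency-only package for the site's default package set
License:   GPL
Group:     System Environment/Base
BuildArch: noarch

# One Requires: line per default package (example names).
Requires:  screen
Requires:  emacs

%description
Installing this package drags in the site's default package set.
To add a package to the set, add a Requires: line, bump Version,
and push the rebuilt RPM to the repository; the regular nightly
"yum update" then installs the new dependency during resolution.

# Empty file list: the package owns nothing on disk.
%files
```

The spec builds a noarch RPM whose only effect is its Requires: lines, so
the nightly update picks up set changes without touching the cron job.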