(crossposted to linux-raid@xxxxxxxxxxxxxxx and redhat-list@xxxxxxxxxx)
(apologies for the cross-post, but this should have been operational last week)
I installed Red Hat EL AS 4 on a HP Proliant DL380, and configured all
system devices in software RAID 1. I added an entry to grub.conf to
fall back to the second disk in case the first entry fails. At boot time,
booting from hd0 works fine. As does booting from hd1.
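For reference, the fallback setup I mean looks roughly like this (stanza
titles, kernel version and md device are illustrative, not copied from my
actual config):

```
default 0
fallback 1

title Red Hat EL AS 4 (first disk)
        root (hd0,0)
        kernel /vmlinuz-2.6.9-5.ELsmp ro root=/dev/md1
        initrd /initrd-2.6.9-5.ELsmp.img

title Red Hat EL AS 4 (second disk)
        root (hd1,0)
        kernel /vmlinuz-2.6.9-5.ELsmp ro root=/dev/md1
        initrd /initrd-2.6.9-5.ELsmp.img
```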
Until I physically remove hd0 from the system.
- I tried manually installing grub on hd1;
- I added hd1 to the device.map and re-installed grub on it;
- I remapped hd0 to /dev/cciss/c0d1 and re-installed grub;
all to no avail.
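To be concrete, the remapping I tried is the usual grub-shell trick: tell
grub that the second disk is (hd0), so the boot sector it writes there
refers to itself by the name the BIOS will give it once the first disk is
gone (device names assume a Smart Array controller with /boot on the
first partition; adjust as needed):

```
# grub
grub> device (hd0) /dev/cciss/c0d1
grub> root (hd0,0)
grub> setup (hd0)
grub> quit
```

My understanding is that without this, the stage1 on the second disk
still references (hd1), which no longer exists after the first disk is
pulled, so even this did not get the machine to boot here.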
I previously installed this while the devices were in slots 2 and 3.
The system wouldn't even boot then. It looks as though booting from sw
RAID 1 only works when there's a valid device in slot 0. That's still
preferable to hw RAID 1, but even better would be if this worked all
the way.
Is this working for anyone? Any idea what I may have overlooked? Any
suggestions on how to debug this?
Kind regards,
Herta