Raid 1 usage in a lab

I have a box with two discs mirrored (RAID 1). Suppose I want to remove the
primary and boot off the secondary alone to test, then roll back to the
primary after the test.

I have been doing this by:

1. Remove sda, boot off sdb alone, and run:
     # mdadm --grow /dev/md[0|1] --raid-devices=1 --force
     Then test away and shut down.

2. Remove sdb, install sda only, and run:
     # mdadm --grow /dev/md[0|1] --raid-devices=1 --force
     Then shut down.

3. Install sdb and reboot, then run:
    # mdadm --manage /dev/md0 --add /dev/sdb1
    # mdadm --grow /dev/md0 --raid-devices=2
 
    # mdadm --manage /dev/md1 --add /dev/sdb2
    # mdadm --grow /dev/md1 --raid-devices=2
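After step 3, I confirm both halves are back in sync from /proc/mdstat. A
minimal sketch of that check, run here against a hypothetical captured
snapshot rather than the live file:

```shell
# Hypothetical /proc/mdstat excerpt after the re-add; on a live box
# you would read /proc/mdstat itself. [UU] means both mirror halves
# are active; [_U] or [U_] means one half is missing or resyncing.
mdstat='md0 : active raid1 sdb1[1] sda1[0]
      524224 blocks [2/2] [UU]'

if printf '%s\n' "$mdstat" | grep -q '\[UU\]'; then
  echo "md0: both mirror halves in sync"
else
  echo "md0: array degraded or resyncing"
fi
```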


Is there a simpler way with fewer reboots? Also, what criteria does md use to
determine which disc is most recent? Even after a solo reboot with only sda,
attaching sdb causes the system to prefer sdb, although the solo boot off sda
should have made the timestamps show sda as the most current.
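For what it's worth, my reading is that md compares the superblock event
counters (the "Events" line from `mdadm --examine /dev/sdX1`) rather than
wall-clock timestamps, with the higher count treated as most recent. A
sketch of that comparison, using made-up counts (not real output from my
box):

```shell
# Hypothetical Events values as they would appear in the
# "Events : N" line of `mdadm --examine` for each mirror half.
events_sda=4102
events_sdb=4117

# On assembly, md treats the member with the higher event count
# as the current copy.
if [ "$events_sda" -gt "$events_sdb" ]; then
  echo "md would treat sda1 as most recent"
else
  echo "md would treat sdb1 as most recent"
fi
```

If that is right, it would explain the preference for sdb here only if sdb's
counter somehow ended up higher, which is exactly what I don't understand.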

Thanks!
jlc
--
To unsubscribe from this list: send the line "unsubscribe linux-raid" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
