Re: How to remove non-existant device

[Date Prev][Date Next][Thread Prev][Thread Next][Date Index][Thread Index]

 



On 23/09/2010 08:30, Benjamin Schieder wrote:
Hi all.

On a SLES 10 SP 3 x86_64 version I'm encountering the following problem:

Two disks sda and sdb are in a RAID 1 configuration. When pulling one
disk and replacing it it becomes sdc instead of sda. The RAID is now
degraded:

kblhbe101:~ # cat /proc/mdstat
Personalities : [raid1] [raid0] [raid5] [raid4] [linear]
md2 : active raid1 sda8[2](F) sdb8[1]
       286125568 blocks [2/1] [_U]

md0 : active raid1 sda5[2](F) sdb5[1]
       529984 blocks [2/1] [_U]

md3 : active raid1 sda7[2](F) sdb7[1]
       4200896 blocks [2/1] [_U]

md1 : active raid1 sda6[2](F) sdb6[1]
       2104384 blocks [2/1] [_U]

unused devices:<none>

But now I can't remove the sda partitions from the RAID:
kblhbe101:~ # mdadm /dev/md0 -r /dev/sda5
mdadm: cannot find /dev/sda5: No such file or directory
kblhbe101:~ # mdadm /dev/md0 -r sda5
mdadm: cannot find sda5: No such file or directory

What am I doing wrong here?
kblhbe101:~ # mdadm --version
mdadm - v2.6 - 21 December 2006

Try `mdadm /dev/md0 -r missing`.

Cheers,

John.

--
To unsubscribe from this list: send the line "unsubscribe linux-raid" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html


[Index of Archives]     [Linux RAID Wiki]     [ATA RAID]     [Linux SCSI Target Infrastructure]     [Linux Block]     [Linux IDE]     [Linux SCSI]     [Linux Hams]     [Device Mapper]     [Device Mapper Cryptographics]     [Kernel]     [Linux Admin]     [Linux Net]     [GFS]     [RPM]     [git]     [Yosemite Forum]


  Powered by Linux