Hello,

I had a handful of servers handed over to me, and I noticed one of them complains on boot about /dev/sdc having bad blocks and various other errors. So I thought I would umount it, fsck it, and so on. Come to find out, the disk is no longer attached to the box: it currently has sda (partitions 1-8) and sdb (partition 1), plus one sdd SAN mount.

I ran fdisk -l to take a peek, and sure enough the drive doesn't exist (though it complains here as well). I've also checked my /etc/fstab, and nothing references sdc there either. I've also grep'd the system to see if there was some sort of script (via cron, etc.) trying to mount /dev/sdc or something at boot.

Q. How do you properly remove a drive 100% from the system? (Obviously it's still trying to use this non-existent device.) I thought maybe just using "rm" on the device file in question would work, but I haven't tried it yet.

Q. Is there a set of tools/scripts/commands for removing non-existent drives?

PS. I've seen a similar situation before, where the system was trying to mount stale (non-existent) drives from an old SAN setup, but I can't remember how I fixed it. :(

I appreciate any advice. Thank you.
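
PS2. For reference, here's roughly what I've checked so far, plus a couple of extra checks I figured couldn't hurt. The cron and rc directories are just where I guessed a mount script might live, so I may well be looking in the wrong places:

    # Confirm the kernel no longer sees the disk
    fdisk -l
    cat /proc/partitions

    # Make sure nothing in fstab references it
    grep sdc /etc/fstab

    # Look for boot/cron scripts that might be touching it
    grep -rl sdc /etc/cron* /etc/rc.d /etc/init.d 2>/dev/null

    # See whether stale device nodes are still lying around
    ls -l /dev/sdc*

    # Check what the kernel is actually complaining about
    dmesg | grep sdc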
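
PS3. The thing I half-remember doing during the old SAN cleanup was telling the SCSI layer to drop the stale device through sysfs and then rescanning the HBA, something along these lines (host0 is just a guess for this box, and I'm not certain this is even the right approach, so please correct me if not):

    # Drop the stale device from the SCSI layer (only if /sys/block/sdc still exists)
    echo 1 > /sys/block/sdc/device/delete

    # Rescan the HBA so the kernel re-detects what's really attached
    echo "- - -" > /sys/class/scsi_host/host0/scan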