> > So, if it took 3 hours to add 50 gb to a 60 gb array, a rough
> > calculation says it will take about 19 hours to add 100 gb
> > to a 380 gb array, which is close to your calculations, yet
> > still lower.
>
> My calculations were a very rough estimate - perhaps you will
> see a performance degradation when moving to larger
> partitions (because average seek time will be slightly
> higher), but I'm not sure that it will be noticeable at all.

I did run raidreconf on a 560Gb array (8x80Gb RAID5), adding another
disk to take it to 640Gb. The machine was an Athlon 1200MHz with
512Mb RAM, all disks on different IDE channels, Samsung 5400rpm
disks.

The estimated runtime was 29 hours; it had been going for about 21h
when it crashed because of bad sectors on the new disk (duh!). At
that point it was 70%+ complete, so the estimate was pretty accurate
in this case.

Regards,
Fredrik Lindgren

-
To unsubscribe from this list: send the line "unsubscribe linux-raid"
in the body of a message to majordomo@vger.kernel.org
More majordomo info at  http://vger.kernel.org/majordomo-info.html
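
[Editor's note: the rough calculation quoted in the thread scales the
observed runtime linearly with the size of the existing array. A minimal
sketch of that assumption, in Python; `estimate_hours` is a hypothetical
helper for illustration, not a raidreconf utility. The 3 h / 60 GB
baseline and the 380 GB and 560 GB array sizes are taken from the thread.]

```python
# Assumption: raidreconf runtime grows roughly linearly with the size
# of the EXISTING array (all of its data must be re-laid-out onto the
# new geometry). Baseline observation from the thread: 3 hours to grow
# a 60 GB array.

def estimate_hours(existing_gb, baseline_gb=60, baseline_hours=3):
    """Scale the observed baseline linearly with existing array size."""
    return baseline_hours * existing_gb / baseline_gb

print(estimate_hours(380))  # 19.0 -- matches the ~19 hours quoted above
print(estimate_hours(560))  # 28.0 -- close to Fredrik's 29-hour estimate
```

That the same linear rule lands within an hour of both the quoted 19-hour
figure and the 29-hour estimate for the 560 GB array suggests seek-time
degradation on larger partitions is, as guessed above, not a big factor.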