pool min_size


 



Hello,
I have an EC pool set up as 6+2. I have noticed that when rebooting servers during system upgrades, I get PGs set to inactive while the OSDs are down. I then discovered that min_size for the pool is set to 7, which makes sense: if I reboot two servers that both host OSDs for the same PG, only 6 of that PG's OSDs are available during the reboot cycle, and with a min_size of 7 the PG goes inactive.
Is it ok for me to set min_size on the pool to 6 to avoid the inactive problem? I know I could do my reboots sequentially to eliminate multiple-server downtime, but I wanted to be sure that min_size 6 is ok. I know this may increase other risks, but I wanted to know whether this min_size change is an option, albeit a more risky one. Thanks.
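
If it is an option, I assume the change would just be something like the following (with "ecpool" standing in for my actual pool name):

    ceph osd pool set ecpool min_size 6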

-Chris
_______________________________________________
ceph-users mailing list -- ceph-users@xxxxxxx
To unsubscribe send an email to ceph-users-leave@xxxxxxx


