Hello Sean,

On Mon, Apr 30, 2018 at 2:32 PM, Sean Sullivan <lookcrabs@xxxxxxxxx> wrote:
> I was creating a new user and mount point. On another hardware node I
> mounted CephFS as admin to mount as root. I created /aufstest and then
> unmounted. From there it seems that both of my MDS nodes crashed for
> some reason, and I can't start them any more.
>
> https://pastebin.com/1ZgkL9fa -- my mds log
>
> I have never had this happen in my tests, so now I have live data here.
> If anyone can lend a hand or point me in the right direction while
> troubleshooting, that would be a godsend!

Thanks for keeping the list apprised of your efforts.

Since this is so easily reproduced for you, I would suggest that you next
capture higher-debug logs (debug_mds=20, debug_ms=1) from the MDS; one way
to do that is sketched below.

And since this is a segmentation fault, a backtrace with debug symbols
from gdb would also be helpful; see the second sketch below.
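
To capture the logs -- a sketch only; it assumes a systemd-managed cluster
and an MDS id of "a", so substitute your own id and paths -- set the debug
levels in ceph.conf so they are in effect from daemon startup, since the
MDS is crashing shortly after it comes up:

    # /etc/ceph/ceph.conf on the MDS host
    [mds]
        debug mds = 20
        debug ms = 1

    # restart the MDS and collect its log
    systemctl restart ceph-mds@a
    less /var/log/ceph/ceph-mds.a.log

If the MDS stays up long enough, you can instead inject the settings into
the running daemon without a restart:

    ceph tell mds.a injectargs '--debug_mds 20 --debug_ms 1'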
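
For the backtrace, install the debug symbols and run the crashing MDS in
the foreground under gdb. Again just a sketch -- the debug package name
varies by distro and release, so check yours:

    apt-get install ceph-mds-dbg      # Debian/Ubuntu
    yum install ceph-debuginfo        # RHEL/CentOS

    # run the MDS in the foreground under gdb (substitute your mds id)
    gdb --args /usr/bin/ceph-mds -f -i a --setuser ceph --setgroup ceph
    (gdb) run
    ... reproduce the crash, then:
    (gdb) thread apply all bt

Posting that backtrace along with the debug log should tell us where in
the MDS this is going wrong.

--
Patrick Donnelly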