I was creating a new user and mount point. On a separate hardware node I mounted CephFS as the admin user so I could create the directory as root, created /aufstest, and then unmounted. At that point both of my MDS nodes crashed, and I can't start them any more.
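For reference, this is roughly the sequence I ran (the monitor address, mount point, and keyring path below are approximations, not my exact values):

# create a client key scoped to the new mount point
# (assuming the filesystem is named "cephfs")
ceph fs authorize cephfs client.aufstest /aufstest rw

# mount the filesystem root as admin, create the directory, then unmount
mount -t ceph mon1:6789:/ /mnt/cephfs -o name=admin,secretfile=/etc/ceph/admin.secret
mkdir /mnt/cephfs/aufstest
umount /mnt/cephfs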
https://pastebin.com/1ZgkL9fa -- my mds log
This never happened in any of my testing, and now I have live data at stake. If anyone can lend a hand or point me in the right direction while troubleshooting, that would be a godsend!
I tried cephfs-journal-tool journal inspect, and it reports that the journal is fine, so I am not sure why the MDS is crashing:
/home/lacadmin# cephfs-journal-tool journal inspect
Overall journal integrity: OK