Hi Jason,

The extract from the debug log file is given below in the first message. It just repeats those lines every so often. I can't find anything else.

Cheers

----- Original Message -----
> From: "Jason Dillaman" <jdillama@xxxxxxxxxx>
> To: "Andrei Mikhailovsky" <andrei@xxxxxxxxxx>
> Cc: "ceph-users" <ceph-users@xxxxxxx>
> Sent: Tuesday, 7 July, 2020 16:33:25
> Subject: Re: Re: Octopus upgrade breaks Ubuntu 18.04 libvirt
>
> On Tue, Jul 7, 2020 at 11:07 AM Andrei Mikhailovsky <andrei@xxxxxxxxxx> wrote:
>>
>> I've left the virsh pool-list command 'hang' for a while and it did eventually get the results back. In about 4 hours!
>
> Perhaps enable the debug logging of libvirt [1] to determine what it's spending its time on?
>
>> root@ais-cloudhost1:/home/andrei# time virsh pool-list
>>  Name                                   State    Autostart
>> -------------------------------------------
>>  12ca033f-e673-4060-8db9-909d79650f39   active   no
>>  bcc753c6-e47a-3b7c-904a-fcc1d0a594c5   active   no
>>  cf771bc7-8998-354d-8e10-5564585a3c20   active   no
>>  d8d5ec36-3cb0-39af-8fc6-084a4abd5d28   active   no
>>
>> real    234m23.877s
>> user    0m0.351s
>> sys     0m0.506s
>>
>> The second attempt took a bit over 2 hours.
>>
>> root@ais-cloudhost1:/home/andrei# time virsh pool-list
>>  Name                                   State    Autostart
>> -------------------------------------------
>>  12ca033f-e673-4060-8db9-909d79650f39   active   no
>>  bcc753c6-e47a-3b7c-904a-fcc1d0a594c5   active   no
>>  cf771bc7-8998-354d-8e10-5564585a3c20   active   no
>>  d8d5ec36-3cb0-39af-8fc6-084a4abd5d28   active   no
>>
>> real    148m54.763s
>> user    0m0.241s
>> sys     0m0.304s
>>
>> Am I the only person having these issues with libvirt and the Octopus release?
>>
>> Cheers
>>
>> ----- Original Message -----
>> > From: "Andrei Mikhailovsky" <andrei@xxxxxxxxxx>
>> > To: "ceph-users" <ceph-users@xxxxxxx>
>> > Sent: Monday, 6 July, 2020 19:27:25
>> > Subject: Re: Octopus upgrade breaks Ubuntu 18.04 libvirt
>>
>> > A quick update.
>> >
>> > I have done a fresh install of the CloudStack host server running Ubuntu 18.04 with the latest updates. I've installed ceph 12.x and connected it to CloudStack, which uses kvm/libvirt/ceph/rbd. The rest of the ceph services (mon, mgr, osd, etc.) are all running 15.2.3. Works like a charm.
>> >
>> > As soon as I updated the host server to version 15.2.3, libvirt stopped working. It just hangs without doing much, it seems. Common commands like 'virsh pool-list' or 'virsh list' just hang. I've straced the process and it doesn't show any activity.
>> >
>> >
>> > 2020-07-06 18:18:36.930+0000: 3273: info : virEventPollUpdateTimeout:265 : EVENT_POLL_UPDATE_TIMEOUT: timer=993 frequency=5000
>> > 2020-07-06 18:18:36.930+0000: 3273: debug : virEventPollUpdateTimeout:282 : Set timer freq=5000 expires=1594059521930
>> > 2020-07-06 18:18:36.930+0000: 3273: debug : virEventPollInterruptLocked:722 : Skip interrupt, 1 140123172218240
>> > 2020-07-06 18:18:36.930+0000: 3273: info : virEventPollUpdateHandle:152 : EVENT_POLL_UPDATE_HANDLE: watch=1004 events=1
>> > 2020-07-06 18:18:36.930+0000: 3273: debug : virEventPollInterruptLocked:722 : Skip interrupt, 1 140123172218240
>> > 2020-07-06 18:18:36.930+0000: 3273: debug : virEventPollCleanupTimeouts:525 : Cleanup 8
>> > 2020-07-06 18:18:36.930+0000: 3273: debug : virEventPollCleanupHandles:574 : Cleanup 22
>> > 2020-07-06 18:18:36.930+0000: 3273: debug : virEventRunDefaultImpl:324 : running default event implementation
>> > 2020-07-06 18:18:36.930+0000: 3273: debug : virEventPollCleanupTimeouts:525 : Cleanup 8
>> > 2020-07-06 18:18:36.930+0000: 3273: debug : virEventPollCleanupHandles:574 : Cleanup 22
>> > 2020-07-06 18:18:36.931+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=0 w=1, f=5 e=1 d=0
>> > 2020-07-06 18:18:36.931+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=1 w=2, f=7 e=1 d=0
>> > 2020-07-06 18:18:36.931+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=2 w=3, f=10 e=1 d=0
>> > 2020-07-06 18:18:36.931+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=3 w=4, f=11 e=1 d=0
>> > 2020-07-06 18:18:36.931+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=4 w=5, f=12 e=1 d=0
>> > 2020-07-06 18:18:36.931+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=5 w=6, f=13 e=1 d=0
>> > 2020-07-06 18:18:36.931+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=6 w=7, f=14 e=1 d=0
>> > 2020-07-06 18:18:36.931+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=7 w=8, f=15 e=1 d=0
>> > 2020-07-06 18:18:36.931+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=8 w=9, f=16 e=1 d=0
>> > 2020-07-06 18:18:36.931+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=9 w=10, f=17 e=1 d=0
>> > 2020-07-06 18:18:36.931+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=10 w=11, f=18 e=1 d=0
>> >
>> > 2020-07-06 18:18:36.932+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=11 w=12, f=19 e=0 d=0
>> > 2020-07-06 18:18:36.932+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=12 w=13, f=19 e=1 d=0
>> > 2020-07-06 18:18:36.932+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=13 w=14, f=24 e=1 d=0
>> > 2020-07-06 18:18:36.932+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=14 w=15, f=25 e=1 d=0
>> > 2020-07-06 18:18:36.932+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=15 w=19, f=26 e=1 d=0
>> > 2020-07-06 18:18:36.932+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=16 w=82, f=79 e=1 d=0
>> > 2020-07-06 18:18:36.932+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=17 w=944, f=22 e=1 d=0
>> > 2020-07-06 18:18:36.932+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=18 w=993, f=82 e=1 d=0
>> > 2020-07-06 18:18:36.932+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=19 w=1001, f=30 e=1 d=0
>> > 2020-07-06 18:18:36.932+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=20 w=1002, f=33 e=1 d=0
>> > 2020-07-06 18:18:36.932+0000: 3273: debug : virEventPollMakePollFDs:401 : Prepare n=21 w=1004, f=83 e=1 d=0
>> > 2020-07-06 18:18:36.933+0000: 3273: debug : virEventPollCalculateTimeout:338 : Calculate expiry of 8 timers
>> > 2020-07-06 18:18:36.933+0000: 3273: debug : virEventPollCalculateTimeout:346 : Got a timeout scheduled for 1594059521930
>> > 2020-07-06 18:18:36.933+0000: 3273: debug : virEventPollCalculateTimeout:359 : Schedule timeout then=1594059521930 now=1594059516933
>> > 2020-07-06 18:18:36.933+0000: 3273: debug : virEventPollCalculateTimeout:369 : Timeout at 1594059521930 due in 4997 ms
>> > 2020-07-06 18:18:36.933+0000: 3273: info : virEventPollRunOnce:640 : EVENT_POLL_RUN: nhandles=21 timeout=4997
>> >
>> >
>> > Ceph itself seems to work, i.e. I can execute ceph -s / rbd -p <pool> ls -l, etc., and they produce output. It's just libvirt that has no joy.
>> >
>> > The version of libvirt installed is:
>> >
>> > libvirt-bin 4.0.0-1ubuntu8.1
>> >
>> >
>> > Any idea how I can make Ceph Octopus play nicely with libvirt?
>> >
>> > Cheers
>> >
>> > Andrei
>> >
>> > ----- Original Message -----
>> >> From: "Andrei Mikhailovsky" <andrei@xxxxxxxxxx>
>> >> To: "ceph-users" <ceph-users@xxxxxxx>
>> >> Sent: Monday, 29 June, 2020 20:40:01
>> >> Subject: Octopus upgrade breaks Ubuntu 18.04 libvirt
>> >
>> >> Hello,
>> >>
>> >> I've upgraded ceph to Octopus (15.2.3 from repo) on one of the Ubuntu 18.04 host servers. The update caused a problem with libvirtd, which hangs when it tries to access the storage pools. The problem doesn't exist on Nautilus. The libvirtd process simply hangs; nothing seems to happen. The libvirtd log file shows:
>> >>
>> >> 2020-06-29 19:30:51.556+0000: 12040: debug : virNetlinkEventCallback:707 : dispatching to max 0 clients, called from event watch 11
>> >> 2020-06-29 19:30:51.556+0000: 12040: debug : virNetlinkEventCallback:720 : event not handled.
>> >> 2020-06-29 19:30:51.556+0000: 12040: debug : virNetlinkEventCallback:707 : dispatching to max 0 clients, called from event watch 11
>> >> 2020-06-29 19:30:51.556+0000: 12040: debug : virNetlinkEventCallback:720 : event not handled.
>> >> 2020-06-29 19:30:51.557+0000: 12040: debug : virNetlinkEventCallback:707 : dispatching to max 0 clients, called from event watch 11
>> >> 2020-06-29 19:30:51.557+0000: 12040: debug : virNetlinkEventCallback:720 : event not handled.
>> >> 2020-06-29 19:30:51.591+0000: 12040: debug : virNetlinkEventCallback:707 : dispatching to max 0 clients, called from event watch 11
>> >> 2020-06-29 19:30:51.591+0000: 12040: debug : virNetlinkEventCallback:720 : event not handled.
>> >>
>> >> Running strace on the libvirtd process shows:
>> >>
>> >> root@ais-cloudhost1:/home/andrei# strace -p 12040
>> >> strace: Process 12040 attached
>> >> restart_syscall(<... resuming interrupted poll ...>
>> >>
>> >> Nothing happens after that point.
>> >>
>> >> The same host server can access the ceph cluster and the pools by running, for example, ceph -s or rbd -p <pool> ls -l.
>> >>
>> >> I need some help to get the host servers working again with Octopus.
>> >>
>> >> Cheers
>>
>
> [1] https://wiki.libvirt.org/page/DebugLogs
>
> --
> Jason
_______________________________________________
ceph-users mailing list -- ceph-users@xxxxxxx
To unsubscribe send an email to ceph-users-leave@xxxxxxx
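
For anyone following this thread later: the debug logging Jason refers to in [1] is normally turned on in /etc/libvirt/libvirtd.conf. A minimal sketch, assuming a stock Ubuntu 18.04 libvirt 4.0.0 install; the exact filter string below is only an illustration taken in the spirit of the wiki page, adjust the components and log path to your setup:

# /etc/libvirt/libvirtd.conf -- sketch, assumed filter set (tune to taste)
log_filters="1:libvirt 1:qemu 1:storage 4:object 4:json 4:event 1:util"
log_outputs="1:file:/var/log/libvirt/libvirtd.log"

# restart the daemon, then reproduce the slow command while watching the log
systemctl restart libvirtd
time virsh pool-list

With debug output enabled, the last entries written before virsh pool-list stalls should show which storage pool refresh (and which RBD call) libvirtd is blocked on.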