Image corrupt after restoring snapshot via Proxmox

Hi,

We have had a situation three times where rbd images seem to be corrupt after restoring a snapshot, and I'm looking for advice on how to investigate this.

We're running Proxmox 7 with Ceph Octopus (Proxmox build, 15.2.17-pve1). Every time the problem has occurred, it has been after the following actions were performed on the VM (my understanding of the underlying rbd operations is sketched after the list):

(Yesterday)
- VM stopped
- Snapshot created
- VM started
- VM stopped
- Snapshot restored
- VM started (OK)
- Nightly backup with vzdump to Proxmox Backup Server
(Today)
- VM stopped
- Snapshot restored
- VM does not start
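
For context, my understanding is that for RBD-backed disks these Proxmox actions boil down to plain rbd snapshot operations, roughly as sketched below. The pool, image, and snapshot names are only examples in the style of our setup, and I'm not certain this is exactly what the Proxmox storage plugin invokes:

  # Example names only; ours look like <pool>/vm-<vmid>-disk-<n>
  # "Snapshot created" (VM stopped)
  rbd snap create rbd/vm-100-disk-0@testsnap

  # "Snapshot restored" (VM stopped): roll the image back to the snapshot contents
  rbd snap rollback rbd/vm-100-disk-0@testsnap

  # List the snapshots that exist on the image
  rbd snap ls rbd/vm-100-disk-0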

On previous occasions we tried to find a solution and, when we couldn't, we restored the VM from backup, which solved the problem. This time it happened on a test system, so we've left the situation as is in the hope of getting to the root cause.

Some observations:

  *   We're using krbd
  *   The PBS backups don't allow file restore if the backup was made from a "broken" image
  *   After mapping the current image, it doesn't seem to contain a partition table (the commands I used are sketched below)
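
For reference, this is roughly how I inspected the image after the failed start; device paths and image/snapshot names below are only examples, and I mapped the current image read-only so nothing gets written:

  # Map the current image read-only (example names)
  rbd map --read-only rbd/vm-100-disk-0
  rbd showmapped

  # Image size, features and snapshot list
  rbd info rbd/vm-100-disk-0
  rbd snap ls rbd/vm-100-disk-0

  # Look for a partition table / filesystem signature on the mapped device
  fdisk -l /dev/rbd0
  blkid /dev/rbd0
  hexdump -C -n 512 /dev/rbd0

  # Map the snapshot as well (snapshots are read-only) and compare its first sector
  rbd map rbd/vm-100-disk-0@testsnap
  hexdump -C -n 512 /dev/rbd1

  # Clean up
  rbd unmap /dev/rbd0
  rbd unmap /dev/rbd1

The comparison against the snapshot is mainly to see whether the data the snapshot still references looks sane, as opposed to what the current image head shows.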

There's a thread on the Proxmox forum about this issue as well [1].

If anyone could give some advice about how to proceed from here, I'd be very grateful.

Best regards,

Roel

PS: An upgrade to Pacific has already been planned.

[1] https://forum.proxmox.com/threads/vm-disks-corrupt-after-reverting-to-snapshot.94698/

--
We are ISO 27001 certified

1A First Alternative BV
T: +31 (0)88 0016405
W: https://www.1afa.com

_______________________________________________
ceph-users mailing list -- ceph-users@xxxxxxx
To unsubscribe send an email to ceph-users-leave@xxxxxxx


