/cdrom mounted from initrd is stopped on boot, possibly confused about device-bound

Ubuntu installer images use an initrd, which runs udevd but not systemd.

The initrd mounts /dev/sr0 at /root/cdrom, then pivots to /root (so
/root/cdrom becomes just /cdrom) and execs systemd as pid 1.

At this point cdrom.mount is stopped as it's bound to an inactive
dev-sr0.device. Then sometime later dev-sr0.device becomes active, but
nothing remounts /cdrom back in.

My question is why, on startup, systemd decides while processing
cdrom.mount that dev-sr0.device is inactive, when the device is
clearly fully operational: it contains media, the media is locked, it
is mounted, and it is serving content.

I notice that SYSTEMD_MOUNT_DEVICE_BOUND is set to 1 on the udev
device, and it seems impossible to undo that via the mount unit.
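The only thing I could think of on the udev side is a local drop-in rule along these lines (an untested sketch with a hypothetical file name; I am not sure whether systemd treats an empty value as unset):

```
# /etc/udev/rules.d/99-cdrom-not-bound.rules (hypothetical)
# Try to clear the property so cdrom.mount is not stopped together
# with the (briefly inactive) device unit:
SUBSYSTEM=="block", KERNEL=="sr0", ENV{SYSTEMD_MOUNT_DEVICE_BOUND}=""
```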

I also wonder why, initially, /dev/sr0 is inactive, but later becomes
active - as in what causes it to become active, and what is missing in
the initrd.

Things appear to work if I set SYSTEMD_READY=1 in 60-cdrom_id.rules;
with that in place, boot produces no warning messages about
cdrom.mount being bound to an inactive device.

Shouldn't 60-cdrom_id.rules set SYSTEMD_READY=1 if, after importing
the cdrom_id variables, ID_CDROM_MEDIA is non-empty? That way the
initial state of dev-sr0.device would be correct when booting with
cdrom media in place.
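Concretely, the change I have in mind is roughly the following (a sketch; the exact placement within the rules file and the match syntax may need adjusting):

```
# In 60-cdrom_id.rules, after the IMPORT{program}="cdrom_id ..." line:
# if cdrom_id reported media present, mark the device ready for systemd
ENV{ID_CDROM_MEDIA}=="?*", ENV{SYSTEMD_READY}="1"
```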

-- 
Regards,

Dimitri.
_______________________________________________
systemd-devel mailing list
systemd-devel@xxxxxxxxxxxxxxxxxxxxx
https://lists.freedesktop.org/mailman/listinfo/systemd-devel
