Hello spice developers,
I looked further and found the function get_min_playback_delay
(https://gitlab.freedesktop.org/spice/spice/-/blob/fe1c25f530b95d32cc81bc1a395d80ace631d2dd/server/gstreamer-encoder.c#L507),
which calculates the delay I was looking for. However, some of the
calculations do not make sense to me. net_latency +
get_average_encoding_time seems reasonable, but send_time looks strange
to me. I assume it is meant to account for how long a large frame (an
I-frame) and a normal frame take to be encoded and sent. I think this
calculation is not optimal, all the more so because the average
encoding time has already been added and the GStreamer target bitrate
is not accurate (especially with variable bitrates), which makes the
estimate less accurate. As a result, the delay is increased
unnecessarily when the frame size changes rapidly over a short period,
e.g. due to rapidly changing image content, even though playback would
also work without increasing the delay. The client's decoding time is
not taken into account here either. I am also not sure how useful it is
to calculate the client's display delay on the server. But perhaps I am
mistaken and you can enlighten me.
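For reference, here is roughly how I read that computation. This is a
simplified sketch in my own words; the names, parameters and the exact
formula are my assumptions for illustration, not the actual code from
gstreamer-encoder.c:

#include <stdint.h>

/* Simplified sketch of my reading of get_min_playback_delay(); all
 * names and the exact formula are assumptions, not the real SPICE
 * code. */
static uint32_t sketch_min_playback_delay(uint32_t net_latency_ms,
                                          uint32_t avg_encoding_time_ms,
                                          uint64_t large_frame_bytes, /* e.g. an I-frame */
                                          uint64_t avg_frame_bytes,
                                          uint64_t target_bit_rate)   /* GStreamer target, bits/s */
{
    if (target_bit_rate == 0)
        return net_latency_ms + avg_encoding_time_ms;

    /* Time to push one large frame plus one average frame through the
     * link at the target bit rate. This is the part that looks fragile
     * to me: a VBR encoder can be far away from target_bit_rate, and
     * the average encoding time has already been accounted for above. */
    uint64_t send_time_ms =
        1000u * 8u * (large_frame_bytes + avg_frame_bytes) / target_bit_rate;

    return net_latency_ms + avg_encoding_time_ms + (uint32_t)send_time_ms;
}

If that reading is correct, a burst of large frames after a content
change inflates the estimate even when the link could absorb it without
any extra buffering.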
On 15.03.24 14:08, Michael Scherle wrote:
Hello spice developers,
we are trying to develop an Open Source virtual desktop infrastructure
to be deployed at multiple German universities, as described by my
colleagues in the paper attached to this mail. The solution is based on
OpenStack, QEMU, SPICE, etc. Our plan is also to have VM instances with
virtual GPUs (SR-IOV). Due to the resulting requirements, it is
necessary to transmit the image data as a video stream.
We have seen Vivek Kasireddy's recent work on SPICE, which solves
exactly this problem. However, when we tested it, we noticed a very
high input-to-display delay (400 ms+, but only when the image data is
transferred as a video stream). Is this a more general SPICE problem,
is there something wrong with our setup, or are there special
parameters that we are missing?
Our setup:
QEMU: https://gitlab.freedesktop.org/Vivek/qemu/-/commits/spice_gl_on_v2
Spice: https://gitlab.freedesktop.org/Vivek/spice/-/commits/encode_dmabuf_v6
virt-viewer
Intel HW decoder/encoder (the same happens with the SW codec)
I have looked into what is causing the delay and have noticed that
encoding only takes about 3-4 ms. In general, the image seems to reach
the client in less than 15 ms.
The main problem seems to be that GStreamer gets a very high margin
(https://gitlab.freedesktop.org/spice/spice-gtk/-/blob/master/src/channel-display.c?ref_type=heads#L1773)
and therefore waits a long time before starting to decode. The reason
for the high margin seems to be the poor mm_time_offset
(https://gitlab.freedesktop.org/spice/spice-gtk/-/blob/master/src/spice-session.c?ref_type=heads#L2418),
which is used to offset the server time to the client time (with some
margin). This variable is initially set by the spice server to 400 ms
(https://gitlab.freedesktop.org/spice/spice/-/blob/master/server/reds.cpp?ref_type=heads#L3062)
and gets updated with the measured latency
(https://gitlab.freedesktop.org/spice/spice/-/blob/master/server/reds.cpp?ref_type=heads#L2614),
but it is only ever increased. I still need to look at how this latency
is calculated.
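To check that I understand the mechanism, here is my simplified mental
model. The function name and the exact arithmetic below are mine, not
the spice-gtk code:

#include <stdint.h>

/* Simplified illustration of my understanding, not the actual spice-gtk
 * code: the server stamps each frame with its multimedia time, the
 * client keeps its own mm clock roughly mm_time_offset behind the
 * server, and the difference decides how long a frame sits in the
 * queue before it is decoded. */
static int32_t sketch_frame_margin(uint32_t frame_mm_time,  /* server stamp on the frame */
                                   uint32_t client_mm_time) /* client mm clock, lagging the
                                                               server by mm_time_offset */
{
    /* A positive result is how long the client waits before decoding. */
    return (int32_t)(frame_mm_time - client_mm_time);
}

If that model is right, then with an initial offset of 400 ms that is
never decreased, every frame arrives with a large positive margin, and
the decode latency is dominated by mm_time_offset rather than by the
actual network, encode and decode times.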
Am I missing something, or is this design not intended for transmitting
interactive content via a video stream?
Temporarily overriding the margin and tweaking parameter settings on
the msdkh264dec brought the delay down to about 80-100 ms, which is not
yet optimal but usable. To see what is technically possible on my
setup, I made a comparison using moonlight/sunshine, which resulted in
a delay of 20-40 ms.
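For completeness, the margin override was nothing more sophisticated
than a hard cap on the value computed in channel-display.c, roughly
like the following. This is purely an illustration of the test hack,
not a proposed fix, and the 20 ms cap is an arbitrary value I picked:

#include <stdint.h>

/* Test hack only: cap the per-frame margin so decoding starts almost
 * immediately. The limit is arbitrary and chosen just for measuring. */
static int32_t clamp_test_margin(int32_t margin_ms)
{
    const int32_t max_test_margin_ms = 20;
    return margin_ms > max_test_margin_ms ? max_test_margin_ms : margin_ms;
}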
Our goal is a round-trip time similar to the moonlight/sunshine
scenario, so that we get a properly usable desktop experience.
Greetings
Michael