When a frame is late we schedule its display right away with
g_timeout_add(0, ...). This can result in display_frame() being called
before g_timeout_add() returns. This would cause timer_id to be reset
before schedule_frame() had set it, so it would then never be reset.
From that point on schedule_frame() would always think a frame was being
displayed and thus would not schedule any more frames, resulting in a
video freeze.

display_frame() now takes the queues mutex before resetting timer_id,
eliminating the race.

Signed-off-by: Francois Gouget <fgouget@xxxxxxxxxxxxxxx>
---
 src/channel-display-gst.c | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/src/channel-display-gst.c b/src/channel-display-gst.c
index 2b4ef7f9..8be95997 100644
--- a/src/channel-display-gst.c
+++ b/src/channel-display-gst.c
@@ -127,9 +127,8 @@ static gboolean display_frame(gpointer video_decoder)
     GstBuffer *buffer;
     GstMapInfo mapinfo;
 
-    decoder->timer_id = 0;
-
     g_mutex_lock(&decoder->queues_mutex);
+    decoder->timer_id = 0;
     frame = g_queue_pop_head(decoder->display_queue);
     g_mutex_unlock(&decoder->queues_mutex);
     /* If the queue is empty we don't even need to reschedule */
-- 
2.11.0
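
For readers following along, here is a minimal standalone sketch of the
interaction described above; it is not part of the patch. The Decoder
struct and the schedule_frame() body are simplified assumptions made for
illustration only. display_frame()'s locking order reflects the patched
code, but the real decoder in channel-display-gst.c is more involved.

#include <glib.h>

/* Hypothetical, simplified decoder state for this sketch; the real
 * decoder in channel-display-gst.c has many more fields. */
typedef struct {
    GMutex  queues_mutex;
    GQueue *display_queue;
    guint   timer_id;   /* 0 means no display_frame() call is pending */
} Decoder;

static gboolean display_frame(gpointer video_decoder)
{
    Decoder *decoder = video_decoder;
    gpointer frame;

    /* The fix: clear timer_id under the queues mutex so the clear cannot
     * run before, and then be overwritten by, schedule_frame()'s
     * assignment of the new source id. */
    g_mutex_lock(&decoder->queues_mutex);
    decoder->timer_id = 0;
    frame = g_queue_pop_head(decoder->display_queue);
    g_mutex_unlock(&decoder->queues_mutex);

    /* ... display the frame ... */
    (void)frame;
    return G_SOURCE_REMOVE;
}

/* Assumed caller side, per the commit message: a late frame is scheduled
 * for immediate display with g_timeout_add(0, ...). */
static void schedule_frame(Decoder *decoder, gpointer frame)
{
    g_mutex_lock(&decoder->queues_mutex);
    g_queue_push_tail(decoder->display_queue, frame);
    if (decoder->timer_id == 0) {
        /* With a 0 ms timeout the main loop thread may run display_frame()
         * before g_timeout_add() even returns here. Before the fix,
         * display_frame() could clear timer_id first and this assignment
         * would then overwrite the clear, leaving timer_id stuck non-zero
         * so no further frames would ever be scheduled. */
        decoder->timer_id = g_timeout_add(0, display_frame, decoder);
    }
    g_mutex_unlock(&decoder->queues_mutex);
}

Holding queues_mutex across the g_timeout_add() call means display_frame(),
running in the main loop thread, cannot clear timer_id until schedule_frame()
has stored the new source id, so the clear can no longer be lost.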