Re: RFC for a render API to support adaptive sync and VRR

On 10.04.2018 19:25, Cyr, Aric wrote:
> -----Original Message-----
> From: Michel Dänzer [mailto:michel@xxxxxxxxxxx]
> Sent: Tuesday, April 10, 2018 13:16
>
>> On 2018-04-10 07:13 PM, Cyr, Aric wrote:
>>> -----Original Message-----
>>> From: Michel Dänzer [mailto:michel@xxxxxxxxxxx]
>>> Sent: Tuesday, April 10, 2018 13:06
>>>
>>>> On 2018-04-10 06:26 PM, Cyr, Aric wrote:
>>>>> From: Koenig, Christian
>>>>> Sent: Tuesday, April 10, 2018 11:43
>>>>>
>>>>>> For video games we have a similar situation where a frame is rendered
>>>>>> for a certain world time and in the ideal case we would actually
>>>>>> display the frame at this world time.
>>>>>
>>>>> That seems like it would be a poorly written game that flips like
>>>>> that, unless they are explicitly trying to throttle the framerate for
>>>>> some reason.  When a game presents a completed frame, they'd like
>>>>> that to happen as soon as possible.
>>>>
>>>> What you're describing is what most games have been doing traditionally.
>>>> Croteam's research shows that this results in micro-stuttering, because
>>>> frames may be presented too early. To avoid that, they want to
>>>> explicitly time each presentation as described by Christian.

>>> Yes, I agree completely.  However, that's only truly relevant for
>>> fixed refresh rate displays.

>> No, it also affects variable refresh; possibly even more in some cases,
>> because the presentation time is less predictable.
>
> Yes, and that's why you don't want to do it when you have variable
> refresh.  The hardware in the monitor and GPU will do it for you, so
> why bother?

I think Michel's point is that the monitor and GPU hardware *cannot*
really do this, because there's synchronization with audio to take into
account, which neither the GPU nor the monitor knows anything about.

Also, as I wrote separately, there's the case of synchronizing multiple
monitors.
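
To illustrate the audio case with a rough sketch of mine (not code from
anywhere real; audio_clock_now_ns() and AUDIO_SINK_LATENCY_NS are
made-up stand-ins for whatever the audio stack actually reports), a
video player would compute its flip target from the audio clock:

#include <stdint.h>
#include <time.h>

#define AUDIO_SINK_LATENCY_NS 40000000ll /* assumed: 40 ms to the speakers */

extern int64_t audio_clock_now_ns(void); /* hypothetical: PTS audible now */

static int64_t monotonic_now_ns(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (int64_t)ts.tv_sec * 1000000000ll + ts.tv_nsec;
}

/* When should the frame with timestamp frame_pts_ns hit the screen so
 * that it lines up with the audio?  Only the application can compute
 * this; the GPU and monitor see neither the audio clock nor the sink
 * latency. */
static int64_t target_present_time_ns(int64_t frame_pts_ns)
{
    int64_t audible_pts = audio_clock_now_ns();

    return monotonic_now_ns()
         + (frame_pts_ns - audible_pts)
         + AUDIO_SINK_LATENCY_NS;
}

That target is exactly the kind of absolute time only the application
can know, and exactly what an explicit presentation-time API would have
to accept.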


> The input to their algorithms will be noisy, causing worse estimates.
> If you just present as fast as you can, it'll just work (within
> reason).
> The majority of gamers want maximum FPS for their games, and there's
> quite frequently outrage at a particular game when it is limited to
> something lower than what the monitor could otherwise support (i.e. I
> don't want my game limited to 30Hz if I have a shiny 144Hz gaming
> display I paid good money for).  Of course, there are always
> exceptions... but in our experience those are few and far between.
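
For reference, the explicitly timed presentation Michel describes above
amounts to something like the following, as I understand it (present_at()
is a hypothetical timed-flip call, now_ns() a CLOCK_MONOTONIC read):

#include <stdint.h>

extern int64_t now_ns(void);                    /* CLOCK_MONOTONIC, in ns */
extern void render_frame(int64_t world_time_ns);
extern void present_at(int64_t target_ns);      /* hypothetical timed flip */

/* Pick the world time each frame is rendered for, then ask for it to be
 * shown at exactly that time, instead of flipping whenever rendering
 * happens to finish. */
static void game_loop(int64_t frame_duration_ns)
{
    int64_t target = now_ns() + frame_duration_ns;

    for (;;) {
        render_frame(target);   /* simulate the world for time `target` */
        present_at(target);     /* flip no earlier (and no later) */
        target += frame_duration_ns;
    }
}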

I agree that games most likely shouldn't try to be smart. I'm curious
about the Croteam findings, but even if they did a really clever thing
that works better than just telling the display driver "display ASAP
please", chances are that *most* developers won't do that, and those who
try will most likely get it wrong. So our guidance should really be:
"games should ask for ASAP presentation, and nothing else".

However, there *are* legitimate use cases for requesting a specific
presentation time, and there *is* precedent, in the form of APIs that
expose such features.

Are there any real problems with exposing an absolute target present
time?
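
To make the question concrete, here's roughly what it could look like as
a per-CRTC atomic property. This is just a sketch of mine:
"TARGET_PRESENT_TIME_NS" doesn't exist anywhere, and the property IDs
would have to be looked up via drmModeObjectGetProperties():

#include <errno.h>
#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Flip to fb_id and ask for it to become visible at the absolute
 * CLOCK_MONOTONIC time target_ns (0 = "as soon as possible").
 * The plane/CRTC/property IDs are whatever the caller looked up. */
static int flip_at(int fd, uint32_t plane_id, uint32_t crtc_id,
                   uint32_t fb_prop_id, uint32_t time_prop_id,
                   uint32_t fb_id, uint64_t target_ns)
{
    drmModeAtomicReq *req = drmModeAtomicAlloc();
    int ret;

    if (!req)
        return -ENOMEM;

    /* New buffer for the primary plane ... */
    drmModeAtomicAddProperty(req, plane_id, fb_prop_id, fb_id);

    /* ... and the hypothetical target present time on the CRTC. */
    drmModeAtomicAddProperty(req, crtc_id, time_prop_id, target_ns);

    ret = drmModeAtomicCommit(fd, req,
                              DRM_MODE_ATOMIC_NONBLOCK |
                              DRM_MODE_PAGE_FLIP_EVENT, NULL);
    drmModeAtomicFree(req);
    return ret;
}

Userspace that wants ASAP behaviour would simply never set the property,
so the default path stays exactly as it is today.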

Cheers,
Nicolai

_______________________________________________
dri-devel mailing list
dri-devel@xxxxxxxxxxxxxxxxxxxxx
https://lists.freedesktop.org/mailman/listinfo/dri-devel



