glxgears frame rate when DPMS is in "off" state

Hi everyone,

With relatively recent versions of the AMD/ATI DDX (the xf86-video-ati driver), I have noticed DPMS-related behavior that looks incorrect to me.

Namely, if I run glxgears, the reported frame rate equals the monitor refresh rate, which is correct. But if I enter the DPMS "off" state, wait a while, and then exit it (back to DPMS "on"), I see that while in "off" mode the frame rate was up in the thousands. Consequently, CPU utilization went up to 100% (split roughly 50%/50% between the X and glxgears processes).

I have traced the problem to the DDX; my findings are below. Thinking about how to fix it (elaborated later), I realized that I am not sure what the conceptually correct fix would be.

Here is how the problem happens:

* Screen is put into DPMS "off" mode.

* The application requests a buffer swap, and X eventually ends up in radeon_dri2_schedule_swap.

* radeon_dri2_schedule_swap tries to determine the CRTC by calling radeon_dri2_drawable_crtc, which in turn calls radeon_pick_best_crtc.

* In radeon_pick_best_crtc, no CRTC is found, because each CRTC is either unused by the affected drawable or is the only suitable candidate but gets skipped by this check (radeon_crtc_is_enabled looks explicitly at the DPMS state):

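	/* skips any CRTC whose DPMS state is not "on" */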
	if (!radeon_crtc_is_enabled(crtc))
	    continue;

* Consequently, radeon_pick_best_crtc returns -1 to radeon_dri2_schedule_swap, which decides that it cannot do the vblank wait and jumps to the blit_fallback label.

* blit_fallback does its thing, achieving the effect of the swap, but now there is no pacing: it returns immediately and the application proceeds to render the next frame without any pause.

* As a consequence, glxgears and X run at the maximum speed allowed by the CPU and GPU combined (a compilable toy model of this decision path follows).
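
To make the failure mode concrete, here is a small compilable toy model of the decision path above. Everything named *_sketch is a stand-in I made up for illustration; only the radeon_* names and the blit_fallback label above are real:

	#include <stdbool.h>
	#include <stdio.h>

	/* Toy stand-in for radeon_pick_best_crtc: with the drawable's only
	 * candidate CRTC in DPMS "off", nothing passes the
	 * radeon_crtc_is_enabled() check and -1 is returned. */
	static int pick_best_crtc_sketch(bool crtc_dpms_on)
	{
	    return crtc_dpms_on ? 0 : -1;
	}

	int main(void)
	{
	    int crtc = pick_best_crtc_sketch(false); /* monitor in DPMS "off" */

	    if (crtc >= 0)
	        printf("wait for vblank on CRTC %d, then swap (paced)\n", crtc);
	    else
	        /* the blit_fallback path: the swap completes immediately,
	         * so the client starts its next frame with no pause */
	        printf("blit_fallback: unpaced swap\n");
	    return 0;
	}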

Now, the reason DPMS exists is to conserve power, but it makes little sense to save power by shutting off the monitor if we then burn far more power thrashing the CPU and the GPU.

One quick fix that came to mind is to replace the 'if' in radeon_pick_best_crtc with something like this:

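	/* unlike radeon_crtc_is_enabled(), this flag stays set in DPMS "off" */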
	if (!crtc->enabled)
	    continue;

(whether by design or by accident, a crtc in the DPMS "off" state is still enabled as far as that flag is concerned; the sketch after the bug link below illustrates the distinction). However, that would reintroduce the regression tracked by this bug:

https://bugs.freedesktop.org/show_bug.cgi?id=49761

(which is the reason the above check was originally added).
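
To make the difference between the two checks explicit, here is a standalone sketch. The struct and both helpers are stand-ins I made up (the real code operates on xf86CrtcPtr and its driver-private state); the DPMS constants follow dpms.h, where DPMSModeOn is 0 and DPMSModeOff is 3:

	#include <stdbool.h>

	/* Stand-in for the driver's CRTC state, for illustration only. */
	struct crtc_sketch {
	    bool enabled;   /* has a mode set; stays true in DPMS "off" */
	    int dpms_mode;  /* DPMSModeOn == 0, DPMSModeOff == 3 */
	};

	/* The current check: a DPMS-off CRTC is treated as unusable, so
	 * schedule_swap falls back to an unpaced blit. */
	static bool crtc_is_enabled_sketch(const struct crtc_sketch *c)
	{
	    return c->dpms_mode == 0; /* DPMSModeOn */
	}

	/* The proposed quick fix: keep vblank pacing even in DPMS "off",
	 * at the cost of reintroducing bug 49761. */
	static bool crtc_is_usable_sketch(const struct crtc_sketch *c)
	{
	    return c->enabled;
	}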

Another possibility would be to enforce some maximum swap rate per drawable (using sleep, for example) when radeon_dri2_schedule_swap decides to take the blit_fallback path; a rough sketch of what I mean follows. However, I don't personally like it, and I have a gut feeling that sleeping in schedule_swap would probably do harm somewhere else. Also, there may be applications that want to render an animated scene off-screen at maximum speed (e.g., off-line rendering), and radeon_dri2_schedule_swap has no way of telling whether crtc is -1 because the application wants it that way or because the associated crtc is in power-savings mode.
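
For concreteness, here is roughly the kind of throttle I mean. All names here are hypothetical; nothing like this exists in the driver today, and 'last' would have to be per-drawable state:

	#define _POSIX_C_SOURCE 199309L  /* for clock_gettime */
	#include <time.h>
	#include <unistd.h>

	/* Hypothetical throttle for the blit_fallback path: sleep just long
	 * enough to cap fallback swaps at max_hz per drawable. */
	static void throttle_fallback_swap(struct timespec *last, long max_hz)
	{
	    struct timespec now;
	    long period_us = 1000000L / max_hz;
	    long elapsed_us;

	    clock_gettime(CLOCK_MONOTONIC, &now);
	    elapsed_us = (now.tv_sec - last->tv_sec) * 1000000L +
	                 (now.tv_nsec - last->tv_nsec) / 1000L;
	    if (elapsed_us >= 0 && elapsed_us < period_us)
	        usleep(period_us - elapsed_us);
	    clock_gettime(CLOCK_MONOTONIC, last);
	}

Note that even this sketch sleeps on the server's request path, which would presumably stall other clients as well; that is exactly the kind of collateral harm I am worried about.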

Clearly, the behavior we have now is wrong from the power-savings perspective (it completely defeats the purpose of DPMS), but before I try to hack up a fix, I would like to hear some suggestions on what the "right" thing to do would be.

thanks,

Ilija


