On Fri, 14 May 2021 17:04:51 -0400
Harry Wentland <harry.wentland@xxxxxxx> wrote:

> On 2021-04-30 8:53 p.m., Sebastian Wick wrote:
> > On 2021-04-26 20:56, Harry Wentland wrote:

...

> >> Another reason I'm proposing to define the color space (and gamma) of
> >> a plane is to make this explicit. Up until now, the color space and
> >> gamma of a plane or framebuffer are not well defined, which leads to
> >> drivers assuming the color space and gamma of a buffer (for blending
> >> and other purposes) and might lead to sub-optimal outcomes.
> >
> > Blending is only "correct" with linear light, so that property of the
> > color space is important. However, why does the kernel have to be
> > involved here? As long as user space knows that for correct blending
> > the data must represent linear light and knows when in the pipeline
> > blending happens, it can make sure that the data at that point in the
> > pipeline contains linear light.
> >
>
> The only reason the kernel needs to be involved is to take full
> advantage of the available HW without requiring KMS clients to be
> aware of the differences in display HW.

Can you explain with more tangible examples why you think so, please?

Is it because hardware does not fit the KMS UAPI model of the abstract
pixel pipeline? Or is it because you have fixed-function hardware
elements that you can only make use of when userspace uses an enum-based
UAPI?

I would totally agree that the driver does not want to be analysing LUT
entries to decipher whether it could use a fixed-function element or
not. It would introduce uncertainty in the UAPI. So fixed-function
elements would need their own properties, but I don't know if that is
feasible as generic UAPI or if it should be driver-specific (and so left
unused by generic userspace).


Thanks,
pq
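
To illustrate the enum-vs-LUT distinction discussed above, here is a
minimal userspace sketch using libdrm's atomic API. It is only an
illustration, not anything proposed in this thread: it assumes the
plane_id, crtc_id and property IDs were already looked up via
drmModeObjectGetProperties(), and it reuses the existing upstream
COLOR_ENCODING plane property and CRTC GAMMA_LUT property merely as
stand-ins for an enum-based versus a blob-based UAPI.

#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/*
 * Hypothetical helper. bt709_value is the enum value the driver
 * advertises for "ITU-R BT.709 YCbCr" on COLOR_ENCODING; the property
 * and object IDs are assumed to have been discovered beforehand.
 */
static int set_color_props(int fd, uint32_t plane_id, uint32_t crtc_id,
                           uint32_t color_encoding_prop, uint64_t bt709_value,
                           uint32_t gamma_lut_prop)
{
        /* 256 entries only for brevity; a real client must use the size
         * advertised by the CRTC's GAMMA_LUT_SIZE property. */
        struct drm_color_lut lut[256];
        drmModeAtomicReqPtr req;
        uint32_t blob_id = 0;
        int ret;

        req = drmModeAtomicAlloc();
        if (!req)
                return -1;

        /* Enum-based UAPI: the requested conversion is named explicitly,
         * so the driver knows which fixed-function block to program. */
        drmModeAtomicAddProperty(req, plane_id, color_encoding_prop,
                                 bt709_value);

        /* Blob-based UAPI: an arbitrary curve. To map this onto a
         * fixed-function gamma block, the driver would have to analyse
         * the entries and guess the intended transfer function. */
        for (unsigned int i = 0; i < 256; i++) {
                uint16_t v = (uint16_t)(i * 0xffff / 255);

                lut[i].red = lut[i].green = lut[i].blue = v;
                lut[i].reserved = 0;
        }
        ret = drmModeCreatePropertyBlob(fd, lut, sizeof(lut), &blob_id);
        if (ret == 0)
                drmModeAtomicAddProperty(req, crtc_id, gamma_lut_prop,
                                         blob_id);

        ret = drmModeAtomicCommit(fd, req, 0, NULL);
        drmModeAtomicFree(req);
        return ret;
}

The contrast is the point: with the enum property the mapping to a
fixed-function element is part of the UAPI contract, whereas the blob
leaves it to per-driver heuristics, which is exactly the uncertainty
argued against above.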