Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

On Tue, 18 May 2021 10:19:25 -0400
Harry Wentland <harry.wentland@xxxxxxx> wrote:

> On 2021-05-18 3:56 a.m., Pekka Paalanen wrote:
> > On Mon, 17 May 2021 15:39:03 -0400
> > Vitaly Prosyak <vitaly.prosyak@xxxxxxx> wrote:
> >   
> >> On 2021-05-17 12:48 p.m., Sebastian Wick wrote:  

...

> >>> I suspect that this is not about tone mapping at all. The use cases
> >>> listed always have the display in PQ mode and just assume that no
> >>> content exceeds the PQ limitations. Then you can simply bring all
> >>> content to the color space with a matrix multiplication and then map the
> >>> linear light content somewhere into the PQ range. Tone mapping is
> >>> performed in the display only.  
> > 
> > The use cases do use the word "desktop" though. Harry, could you expand
> > on this, are you seeking a design that is good for generic desktop
> > compositors too, or one that is more tailored to "embedded" video
> > player systems taking the most advantage of (potentially
> > fixed-function) hardware?
> >   
> 
> The goal is to enable this on a generic desktop, such as generic Wayland
> implementations or ChromeOS. We're not looking for a custom solution for
> some embedded systems, though the solution we end up with should obviously
> not prevent an implementation on embedded video players.

(There is a TL;DR: at the end.)

Echoing a little bit what Sebastian already said, I believe there are
two sides to this again:
- color management in the traditional sense
- modern standardised display technology

It was perhaps too harsh to say that generic Wayland compositors cannot
use enum-based color-related UAPI. Sometimes they could, and sometimes
it would not be good enough.

Traditional color management assumes that no two monitors are the same,
even if they are the same make, model, and manufacturing batch, and are
driven exactly the same way. Hence, all monitors may require
calibration (adjusting monitor knobs) and/or profiling (measuring the
light emission with a special hardware device designed for that). The
viewing environment also has an effect.

For profiling to be at all meaningful, calibration must be fixed. This
means that there must be no dynamic on-the-fly adaptation done in the
monitor, in the display hardware, or in the kernel. That is a tall
order that I guess is going to be less and less achievable, especially
with HDR monitors.

The other side is where the end user trusts the standards, and trusts
that the drivers and the hardware do what they are specified to do.
This is where you can trust that the monitor does the tone-mapping magic
right.

Weston needs to support both approaches, because we want to prove our
new approach to traditional color management, but we also want to
support HDR, and if possible, do both at the same time. Doing both at
the same time is what we think about foremost, because it is also the
hardest thing to achieve. If that can be done, then everything else
works out too.

However, this should not exclude the possibility to trust standards and
monitor magic, when the end user wants it.

It's also possible that a monitor simply doesn't support a mode that
would enable fully color managed HDR, so Weston will need to be able to
drive monitors with e.g. BT.2020/PQ data eventually. It's just not the
first goal we have.
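
(For reference, the BT.2020/PQ output case boils down to something like
the sketch below. The ST 2084 constants are the published ones, but the
helper and its matrix argument are made up for this mail; a real
implementation also has to decide on ranges, clamping, and where the
matrix actually comes from.)

#include <math.h>

/*
 * Sketch only: SMPTE ST 2084 (PQ) inverse EOTF. 'linear' is
 * display-referred linear light normalized so that 1.0 equals
 * 10000 cd/m^2; the return value is the PQ-encoded signal in
 * [0.0, 1.0].
 */
static double pq_inv_eotf(double linear)
{
	const double m1 = 2610.0 / 16384.0;
	const double m2 = 2523.0 / 4096.0 * 128.0;
	const double c1 = 3424.0 / 4096.0;
	const double c2 = 2413.0 / 4096.0 * 32.0;
	const double c3 = 2392.0 / 4096.0 * 32.0;
	double lp = pow(linear, m1);

	return pow((c1 + c2 * lp) / (1.0 + c3 * lp), m2);
}

/*
 * Made-up helper: take one linear-light RGB pixel, convert it to
 * BT.2020 primaries with a caller-supplied 3x3 matrix, and PQ-encode
 * the result. What happens to values the monitor cannot show is
 * exactly the policy part left out here.
 */
static void encode_bt2020_pq(const double src[3],
			     const double to_bt2020[3][3], double out[3])
{
	for (int i = 0; i < 3; i++) {
		double v = to_bt2020[i][0] * src[0] +
			   to_bt2020[i][1] * src[1] +
			   to_bt2020[i][2] * src[2];

		out[i] = pq_inv_eotf(fmax(v, 0.0));
	}
}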

This debate is a little bit ironic. The Wayland approach to traditional
color management is that end users should trust the display server to
do the right thing, where before people only trusted the individual
apps using a specific CMS implementation. The display server was the
untrusted one that should just get out of the way and not touch
anything. Now I'm arguing that I don't want to trust monitor magic;
who knows what atrocities it does to my picture! But take the next
logical step, and one would be arguing that end users should also
trust monitors to do the right thing. :-)

The above has two catches:

- Do you actually trust hardware manufacturers and marketers and EDID?
  Monitors have secret sauce you can't inspect nor change.

- You feed a single video stream to a monitor, in a single format,
  encoding and color space. The display server OTOH gets an arbitrary
  number of input video streams in arbitrary formats, encodings, and
  color spaces, and it needs to composite them into one.

Composition is hard. It's not enough to know what kind of signals you
take in and what kind of signal you must output. You also need to know
what the end user wants from the result: the render intent.

Even if we trust the monitor magic to do the right thing in
interpreting and displaying our output signal, we still need to know
what the end user wants from the composition, and we need to control
the composition formula to achieve that.
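
To make that concrete, the blending step roughly looks like the toy
code below. All of the names are made up for this mail; the interesting
part is what the comments leave open, because those gaps are exactly
the render intent.

/*
 * Toy sketch of the compositing problem, not Weston code.
 */
struct pixel {
	double rgb[3];	/* linear light, common blending color space */
	double alpha;
};

/*
 * Per-input-stream description: how to get from the stored encoding
 * into the common linear blending space.
 */
struct stream_transform {
	double (*eotf)(double);		/* e.g. sRGB, PQ, HLG decode */
	double to_blend_space[3][3];	/* primaries conversion matrix */
};

static struct pixel decode_to_blend_space(const struct stream_transform *t,
					  const double encoded[3],
					  double alpha)
{
	struct pixel p = { .alpha = alpha };
	double lin[3];

	for (int i = 0; i < 3; i++)
		lin[i] = t->eotf(encoded[i]);

	for (int i = 0; i < 3; i++)
		p.rgb[i] = t->to_blend_space[i][0] * lin[0] +
			   t->to_blend_space[i][1] * lin[1] +
			   t->to_blend_space[i][2] * lin[2];

	return p;
}

/*
 * Simple "over" blending in linear light. The hard questions (how
 * bright is SDR white next to HDR content, how the result gets
 * squeezed into what the monitor accepts) are not answered here; they
 * are the render intent decisions discussed above.
 */
static struct pixel blend_over(struct pixel top, struct pixel bottom)
{
	struct pixel out;

	for (int i = 0; i < 3; i++)
		out.rgb[i] = top.rgb[i] * top.alpha +
			     bottom.rgb[i] * (1.0 - top.alpha);
	out.alpha = top.alpha + bottom.alpha * (1.0 - top.alpha);

	return out;
}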

TL;DR:

I would summarise my comments so far into these:

- Telling the kernel the color spaces and letting it come up with
  whatever color transformation formula it likes from those is not
  enough, because it puts the render intent policy decision in the
  kernel.

- Telling the kernel what color transformations need to be done is
  good, if it is clearly defined.

- Using an enum-based UAPI to tell the kernel what color
  transformations need to be done (e.g. which EOTF or EOTF^-1 to apply
  at a step in the abstract pipeline; see the hypothetical sketch after
  this list) is very likely ok for many Wayland compositors in most
  cases, but may not be sufficient for all use cases. Of course, one is
  always bound by what hardware can do, so not a big deal.

- You may need to define mutually exclusive KMS properties (referring
  to my email in another branch of this email tree).

- I'm not sure I (we?) can meaningfully review things like an "SDR
  boost" property until we know ourselves how to composite different
  types of content together. Maybe someone else could.
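
To illustrate the enum-based idea from the third point above, here is a
purely hypothetical userspace-side sketch. None of the enum or property
names below exist in KMS today; they only show the shape of "name the
operation, not the color space".

/*
 * Purely hypothetical, none of these names exist in today's UAPI.
 * Each value names a specific, well-defined mathematical operation.
 */
enum hypothetical_plane_eotf {
	PLANE_EOTF_BYPASS,	/* pass samples through unchanged */
	PLANE_EOTF_SRGB,	/* apply the sRGB EOTF (decode) */
	PLANE_EOTF_PQ,		/* apply the ST 2084 EOTF (decode) */
	PLANE_EOTF_SRGB_INV,	/* apply the inverse sRGB EOTF (encode) */
	PLANE_EOTF_PQ_INV,	/* apply the inverse ST 2084 EOTF (encode) */
};

/*
 * The compositor would pick the operation it wants at each step of an
 * abstract per-plane pipeline, e.g.
 *
 *   FB -> "degamma" (enum above) -> CTM -> "gamma" (enum above) -> blend
 *
 * and set it through the usual atomic property mechanism, something
 * like (degamma_prop_id being the id of such a hypothetical property):
 *
 *   drmModeAtomicAddProperty(req, plane_id, degamma_prop_id,
 *                            PLANE_EOTF_PQ);
 *
 * The contrast is with a UAPI that only names the input and output
 * color spaces and leaves the actual transformation, including the
 * render intent, up to the driver.
 */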

Does this help or raise thoughts?

The work on Weston CM&HDR right now is aiming to get it to a point
where we can start nicely testing different compositing approaches,
methods, and parameters, and I expect that to feed back into the
Wayland CM&HDR protocol design as well.


Thanks,
pq
