Re: [RFC PATCH v3 1/6] drm/doc: Color Management and HDR10 RFC

On 2021-09-15 10:36, Pekka Paalanen wrote:
> On Mon, 16 Aug 2021 15:37:23 +0200
> sebastian@xxxxxxxxxxxxxxxxx wrote:
> 
>> On 2021-08-16 14:40, Harry Wentland wrote:
>>> On 2021-08-16 7:10 a.m., Brian Starkey wrote:  
>>>> On Fri, Aug 13, 2021 at 10:42:12AM +0530, Sharma, Shashank wrote:  
>>>>> Hello Brian,
>>>>> (+Uma in cc)
>>>>>
>>>>> Thanks for your comments. Let me try to fill in for Harry to keep the
>>>>> design discussion going. Please find my comments inline.
>>>>>   
>>>
>>> Thanks, Shashank. I'm back at work now. Had to cut my trip short
>>> due to rising Covid cases and concern for my kids.
>>>   
>>>>> On 8/2/2021 10:00 PM, Brian Starkey wrote:  
>>>>>>   
>>>>
>>>> -- snip --
>>>>   
>>>>>>
>>>>>> Android doesn't blend in linear space, so any API shouldn't be built
>>>>>> around an assumption of linear blending.
>>>>>>   
>>>
>>> This seems incorrect, but I guess ultimately the OS is in control of
>>> this. If we want to allow blending in non-linear space with the new
>>> API, we would either need to describe the blending space or the
>>> pre/post-blending gamma/de-gamma curves.
>>>
>>> Any idea if this blending behavior in Android might get changed in
>>> the future?  
>>
>> There is lots of software that blends in sRGB space, and designers have
>> adjusted to the incorrect blending so that the result looks right.
>> Blending in linear space would make those images look wrong.
> 
> Hi,
> 
> yes, and I'm guilty of that too, at least by negligence. ;-)
> 
> All Wayland compositors do it, since that's what everyone has always
> been doing, more or less. It's still physically wrong, but when all you
> have is sRGB and black window shadows and rounded corners as the only
> use case, you don't mind.
> 
> When you start blending with colors other than black (gradients!), when
> you go to wide gamut, or especially with HDR, I believe the problems
> start to become painfully obvious.
> 
> But as long as you're stuck with sRGB only, people expect the "wrong"
> result and deviating from that is a regression.
> 
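
To make that concrete, here's a minimal, purely illustrative C sketch
(not taken from any real compositor) blending black over white at 50%
alpha, once on the sRGB-encoded values and once in linear light:

    /* Illustrative only: compare blending on sRGB-encoded values
     * (what most compositors do today) with blending in linear light. */
    #include <math.h>
    #include <stdio.h>

    static double srgb_eotf(double e)      /* encoded -> linear */
    {
        return e <= 0.04045 ? e / 12.92 : pow((e + 0.055) / 1.055, 2.4);
    }

    static double srgb_inv_eotf(double l)  /* linear -> encoded */
    {
        return l <= 0.0031308 ? l * 12.92
                              : 1.055 * pow(l, 1.0 / 2.4) - 0.055;
    }

    int main(void)
    {
        double a = 0.0, b = 1.0;  /* black over white, 50% alpha */

        double nonlinear = 0.5 * a + 0.5 * b;              /* = 0.5    */
        double linear = srgb_inv_eotf(0.5 * srgb_eotf(a) +
                                      0.5 * srgb_eotf(b)); /* ~= 0.735 */

        printf("encoded-domain blend: %.3f, linear-light blend: %.3f\n",
               nonlinear, linear);
        return 0;
    }

The encoded-domain blend comes out at 0.5, while the physically correct
linear-light blend encodes to about 0.735, i.e. noticeably lighter. Any
content authored against the first behavior looks different under the
second, which is exactly why switching reads as a regression.
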
> Similarly, once Weston starts doing color management and people turn it
> on and install monitor profiles, I expect to get reports saying "all
> old apps look really dull now". That's how sRGB is defined to look;
> they've just been looking at something else all that time.
> :-)
> 
> Maybe we need a sRGB "gamut boost" similar to SDR luminance boost. ;-)
> 

I wonder how other OSes deal with this change in expectations.

I also have a Chromebook with a nice HDR OLED panel but an OS that
doesn't really do HDR and seems to output to the full gamut and
luminance range of the display (I could be wrong on this). It makes
content look really vibrant, but I'm equally worried about how users
will perceive it if proper color management ever arrives.

>>>> I still think that directly exposing the HW blocks and their
>>>> capabilities is the right approach, rather than a "magic" tonemapping
>>>> property.
>>>>
>>>> Yes, userspace would need to have a good understanding of how to use
>>>> that hardware, but if the pipeline model is standardised that's the
>>>> kind of thing a cross-vendor library could handle.
>>>>   
>>>
>>> One problem with cross-vendor libraries is that they might struggle
>>> to really be cross-vendor when it comes to unique HW behavior. Or
>>> they might pick sub-optimal configurations because they're not aware
>>> of the power impact of a given configuration; what's optimal can
>>> differ greatly between different HW.
>>>
>>> We're seeing this problem with "universal" planes as well.  
>>
>> I'm repeating what has been said before, but apparently it has to be
>> said again: if a property can't be replicated exactly in a shader, the
>> property is useless. If your hardware is so unique that it can't give
>> us the exact formula we expect, you cannot expose the property.
> 
> From a desktop perspective, yes, but I'm less adamant about it nowadays.
> If kernel developers are happy to maintain multiple alternative UAPIs,
> then I'm not going to try to NAK that - I'll just say when I can and
> cannot make use of them. Also, everything is always up to some
> precision, and ultimately it's a question of whether people can see
> the difference.
> 
> The entertainment end-user audience is also much more forgiving than
> the professional color-management audience. For the latter, I'd
> hesitate to use non-primary KMS planes at all.
> 
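
To make the "replicable exactly" requirement concrete, here's a rough
sketch (error handling and LUT-size discovery elided) of a compositor
filling the existing DEGAMMA_LUT blob from the very same reference
function its shader fallback implements:

    #include <math.h>
    #include <stdint.h>
    #include <stdlib.h>
    #include <xf86drmMode.h>     /* drmModeCreatePropertyBlob() */
    #include <drm/drm_mode.h>    /* struct drm_color_lut        */

    static double srgb_eotf(double e)
    {
        return e <= 0.04045 ? e / 12.92 : pow((e + 0.055) / 1.055, 2.4);
    }

    /* Fill the KMS LUT from the same function the shader fallback
     * uses. If the hardware then interpolates or quantizes the LUT in
     * some undocumented way, offload and fallback no longer match and
     * the plane can't be used transparently. */
    static uint32_t create_degamma_blob(int fd, uint32_t lut_size)
    {
        struct drm_color_lut *lut = calloc(lut_size, sizeof(*lut));
        uint32_t blob_id = 0;

        if (!lut)
            return 0;
        for (uint32_t i = 0; i < lut_size; i++) {
            uint16_t q = (uint16_t)lround(
                srgb_eotf((double)i / (lut_size - 1)) * 0xffff);
            lut[i].red = lut[i].green = lut[i].blue = q;
        }
        if (drmModeCreatePropertyBlob(fd, lut,
                                      lut_size * sizeof(*lut), &blob_id))
            blob_id = 0;
        free(lut);
        return blob_id;
    }
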
>> Either way, if the fixed KMS pixel pipeline is not sufficient to expose
>> the intricacies of real hardware, the right move would be to make the
>> KMS pixel pipeline more dynamic, expose more hardware specifics, and
>> create a hardware-specific user space layer like mesa. Moving the whole
>> compositing stack, with all its policies and decision making, into the
>> kernel is exactly the wrong way to go.
>>
>> Laurent Pinchart put this very well:
>> https://lists.freedesktop.org/archives/dri-devel/2021-June/311689.html
> 
> Thanks for digging that up, saved me the trouble. :-)
> 

Really good summary. I can see the parallel to the camera subsystem. Maybe
now is a good time for libdisplay, or a "mesa" for display HW.
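
To sketch what I mean - purely hypothetical, none of these names exist
anywhere today - such a library could let drivers describe their
fixed-function blocks and map a requested color transform onto them,
falling back to shaders for anything that doesn't fit:

    /* Hypothetical libdisplay-style descriptors, invented for this
     * discussion only. Drivers would enumerate their blocks in fixed
     * hardware order; the library, not the kernel, decides how to
     * program them for a given color transform. */
    enum libdisplay_block_type {
        LIBDISPLAY_BLOCK_LUT_1D,      /* per-channel curve, N entries */
        LIBDISPLAY_BLOCK_MATRIX_3X3,  /* gamut / color space matrix   */
        LIBDISPLAY_BLOCK_LUT_3D,      /* full 3D LUT, N^3 entries     */
    };

    struct libdisplay_block {
        enum libdisplay_block_type type;
        unsigned int entries;   /* LUT size; 0 for matrices        */
        unsigned int bits;      /* internal precision of the block */
    };

    struct libdisplay_pipeline {
        /* Blocks in hardware order, e.g. degamma -> matrix -> gamma. */
        const struct libdisplay_block *blocks;
        unsigned int num_blocks;
    };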

Btw, I fully agree on the need to have clear ground rules (like the newly
formalized requirement for driver properties) to keep this from becoming an
unmaintainable mess.

Harry

> 
> Thanks,
> pq
> 



