Re: [EXTERN] HW de/encoding & Chroma subsampling


On 29.07.24 13:56, Michael Scherle wrote:
Hello,

While working with the hopefully soon-to-be-merged pull request “[PATCH v8 0/6] dcc: Create a stream for non-gl/remote clients that want to use dmabuf (v8)” (which we urgently need for our project), I noticed the problem of chroma subsampling. The artifacts usually occur with colored text and thin lines, which is of course bad in a remote desktop scenario.

The attached screenshot shows the rtings.com chroma subsampling test image; note that it should be viewed either at exactly 100% or greatly enlarged, as otherwise scaling artifacts may appear that are not present in the original.

The left third shows the current hardware encoding implementation, which uses 4:2:0 chroma subsampling, and the resulting artifacts, which appear especially in blue and red. The right third shows a prototype implementation of 4:4:4 hardware de/encoding, which produces a perfect image.

The center shows a prototype of 4:2:0 encoding where the image is upsampled to twice its size before encoding and downsampled again after decoding, which removes the artifacts. This version shows no artifacts, but it is a bit blurry due to the interpolation used by the GStreamer plugins for the up/downsampling. I have modified the GStreamer plugins so that the interpolation method can be set to nearest neighbor; with that, the blurring is gone and the picture matches 4:4:4 (though of course at the cost of higher bandwidth and en/decoder load). Nevertheless, in some cases this could be a fallback for hardware that cannot do 4:4:4. You can find the original test image here: https://www.rtings.com/images/test-materials/2017/chroma-444.png if you want to try it out yourself.


Since I would like to submit my implementation after the above merge request has been merged (it depends on it), I have a few questions about a reasonable implementation.

At the moment I have implemented everything as a separate format, e.g. for h265:

  SPICE_VIDEO_CODEC_TYPE_H265,
  SPICE_VIDEO_CODEC_TYPE_H265_444,
  SPICE_VIDEO_CODEC_TYPE_H265_U,

and for caps:

  SPICE_DISPLAY_CAP_CODEC_H265,
  SPICE_DISPLAY_CAP_CODEC_H265_444,
  SPICE_DISPLAY_CAP_CODEC_H265_U,

For the caps I think this makes sense, since an HW de/encoder could, for example, only support 4:2:0. But what do you think: does it also make sense for the formats, or should that be done via extra parameters?

Where should I place my fork? Should it be on the freedesktop GitLab? Then I would have to see how to get the appropriate permissions.

Does anyone know of any other methods of avoiding artifacts on hardware that can only do 4:2:0 chroma subsampling?


Greetings
Michael


Hello,

I got the nearest-neighbor patches merged into the GStreamer va and msdk plugins. With them, the upscaling looks flawless.

Regarding merge requests: are these still accepted at all? For example,
>dcc: Create a stream for non-gl/remote clients that want to use dmabuf (v8)
doesn't seem to be handled anymore. Can you still make merge requests, or are only security vulnerabilities being fixed?

Greetings
Michael


