Re: Overlay support in the i.MX7 display

[Date Prev][Date Next][Thread Prev][Thread Next][Date Index][Thread Index]

 



Le lundi 04 novembre 2019 à 10:09 +0200, Pekka Paalanen a écrit :
> On Sun, 03 Nov 2019 19:15:49 +0100
> Stefan Agner <stefan@xxxxxxxx> wrote:
> 
> > Hi Laurent,
> > 
> > On 2019-11-01 09:43, Laurent Pinchart wrote:
> > > Hello,
> > > 
> > > I'm looking at the available options to support overlays in the display
> > > pipeline of the i.MX7. The LCDIF itself unfortunaltey doesn't support
> > > overlays, the feature being implemented in the PXP. A driver for the PXP
> > > is available but only supports older SoCs whose PXP doesn't support
> > > overlays. This driver is implemented as a V4L2 mem2mem driver, which
> > > makes support of additional input channels impossible.  
> > 
> > Thanks for bringing this up, it is a topic I have wondered too:
> > Interaction between PXP and mxsfb.
> > 
> > I am not very familiar with the V4L2 subsystem so take my opinions with
> > a grain of salt.
> > 
> > > Here are the options I can envision:
> > > 
> > > - Extend the existing PXP driver to support multiple channels. This is
> > >   technically feasible, but will require moving away from the V4L2
> > >   mem2mem framework, which would break userspace. I don't think this
> > >   path could lead anywhere.
> > > 
> > > - Write a new PXP driver for the i.MX7, still using V4L2, but with
> > >   multiple video nodes. This would allow blending multiple layers, but
> > >   would require writing the output to memory, while the PXP has support
> > >   for direct connections to the LCDIF (through small SRAM buffers).
> > >   Performances would thus be suboptimal. The API would also be awkward,
> > >   as using the PXP for display would require usage of V4L2 in
> > >   applications.  
> > 
> > So the video nodes would be sinks? I would expect overlays to be usable
> > through KMS, I guess that would then not work, correct?
> > 
> > > - Extend the mxsfb driver with PXP support, and expose the PXP inputs as
> > >   KMS planes. The PXP would only be used when available, and would be
> > >   transparent to applications. This would however prevent using it
> > >   separately from the display (to perform multi-pass alpha blending for
> > >   instance).  
> > 
> > KMS planes are well defined and are well integrated with the KMS API, so
> > I prefer this option. But is this compatible with the currently
> > supported video use-case? E.g. could we make PXP available through V4L2
> > and through DRM/mxsfb?
> > 
> > Not sure what your use case is exactly, but when playing a video I
> > wonder where is the higher value using PXP: Color conversion and scaling
> > or compositing...? I would expect higher value in the former use case.
> 
> Hi,
> 
> mind, with Wayland architecture, color conversion and scaling could be
> at the same level/step as compositing, in the display server instead of
> an application. Hence if the PXP capabilities were advertised as KMS
> planes, there should be nothing to patch in Wayland-designed
> applications to make use of them, assuming the applications did not
> already rely on V4L2 M2M devices.

The PXP can already be used with GStreamer v4l2convert element, for CSC
and scaling.

> 
> Would it not be possible to expose PXP through both uAPI interfaces? At
> least KMS atomic's TEST_ONLY feature would make it easy to say "no" to
> userspace if another bit of userspace already reserved the device via
> e.g. V4L2.

Same exist for decoders with fixed number of streams/instances I think.

> 
> 
> Thanks,
> pq

_______________________________________________
dri-devel mailing list
dri-devel@xxxxxxxxxxxxxxxxxxxxx
https://lists.freedesktop.org/mailman/listinfo/dri-devel




[Index of Archives]     [Linux DRI Users]     [Linux Intel Graphics]     [Linux USB Devel]     [Video for Linux]     [Linux Audio Users]     [Yosemite News]     [Linux Kernel]     [Linux SCSI]     [XFree86]     [Linux USB Devel]     [Video for Linux]     [Linux Audio Users]     [Linux Kernel]     [Linux SCSI]     [XFree86]
  Powered by Linux