On 2/16/19 10:42 AM, Hans Verkuil wrote:
> On 2/16/19 1:16 AM, Tim Harvey wrote:
>> Greetings,
>>
>> What is needed to be able to take advantage of hardware video
>> composing capabilities and make them available in something like
>> GStreamer?
>
> Are you talking about what is needed in a driver or what is needed in
> gstreamer? Or both?
>
> In any case, the driver needs to support the V4L2 selection API, specifically
> the compose target rectangle for the video capture.

I forgot to mention that the driver should allow the compose rectangle to be
anywhere within the bounding rectangle as set by S_FMT(CAPTURE).

In addition, this also means that the DMA has to be able to do scatter-gather,
which I believe is not the case for the imx m2m hardware.

Regards,

	Hans

>
> Regards,
>
> 	Hans
>
>>
>> Philipp's mem2mem driver [1] exposes the IMX IC, and GStreamer's
>> v4l2convert element uses this nicely for hardware-accelerated
>> scaling/csc/flip/rotate, but what I'm looking for is something that
>> extends that concept and allows for composing frames from multiple
>> video capture devices into a single memory buffer which could then be
>> encoded as a single stream.
>>
>> This was made possible by Carlo's gstreamer-imx [2] GStreamer plugins
>> paired with the Freescale kernel, which had some non-mainlined APIs for
>> the IMX IPU and GPU. We have used this to take, for example, 8x analog
>> capture inputs, compose them into a single frame, then H264-encode and
>> stream it. The gstreamer-imx elements used properties fairly compatible
>> with those of the GstCompositorPad element to provide a destination
>> rect within the compose output buffer as well as rotation/flip, alpha
>> blending and the ability to specify a background fill.
>>
>> Is it possible that some of this capability might be available today
>> with the opengl GStreamer elements?
>>
>> Best Regards,
>>
>> Tim
>>
>> [1] https://patchwork.kernel.org/patch/10768463/
>> [2] https://github.com/Freescale/gstreamer-imx
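
For reference, a minimal sketch of the selection API usage Hans describes:
userspace sets the capture format (the bounding rectangle) with S_FMT and then
positions the compose rectangle inside it with VIDIOC_S_SELECTION and
V4L2_SEL_TGT_COMPOSE. The device path and the 1920x1080 / 960x540 geometry
below are illustrative assumptions, not values from this thread, and error
handling is kept minimal.

/* Sketch: place the captured image inside a larger capture buffer
 * using the V4L2 selection API.  Device path and geometry are
 * illustrative only. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
	int fd = open("/dev/video0", O_RDWR);	/* hypothetical capture device */
	if (fd < 0) {
		perror("open");
		return 1;
	}

	/* Bounding rectangle: the full 1920x1080 destination buffer,
	 * set via S_FMT(CAPTURE). */
	struct v4l2_format fmt;
	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	fmt.fmt.pix.width = 1920;
	fmt.fmt.pix.height = 1080;
	fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
	fmt.fmt.pix.field = V4L2_FIELD_NONE;
	if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0)
		perror("VIDIOC_S_FMT");

	/* Compose rectangle: place the captured frame in the top-left
	 * quadrant of that buffer. */
	struct v4l2_selection sel;
	memset(&sel, 0, sizeof(sel));
	sel.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	sel.target = V4L2_SEL_TGT_COMPOSE;
	sel.r.left = 0;
	sel.r.top = 0;
	sel.r.width = 960;
	sel.r.height = 540;
	if (ioctl(fd, VIDIOC_S_SELECTION, &sel) < 0)
		perror("VIDIOC_S_SELECTION");

	close(fd);
	return 0;
}

Supporting an arbitrary left/top offset for the compose rectangle anywhere
within the bounding buffer is what requires the scatter-gather DMA capability
Hans mentions above.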