Re: [RFC 0/7] Generic line based metadata support, internal pads

Hi Naushir,

On Fri, Jun 02, 2023 at 08:54:35AM +0100, Naushir Patuck wrote:
> Hi Sakari,
> 
> Thank you for working on this. Sensor metadata is something that
> Raspberry Pi do make extensive use of, and our downstream changes to
> support it, although a bit hacky, are not too dissimilar to your proposal
> here.
> 
> On Fri, 5 May 2023 at 22:53, Sakari Ailus <sakari.ailus@xxxxxxxxxxxxxxx> wrote:
> >
> > Hi folks,
> >
> > Here are a few patches to add support for generic, line-based metadata
> > as well as internal source pads. While the amount of code is not very
> > large (quite small, actually, IMO), I presume what this is about and
> > why it is being proposed requires some explaining.
> >
> > Metadata mbus codes and formats have existed for some time in V4L2.
> > However, they have only been used by drivers that produce the data
> > themselves, and effectively this metadata has always been statistics of
> > some sort (at least when it comes to ISPs). What is different here is
> > that we intend to add support for metadata originating from camera
> > sensors.
> >
> > Camera sensors produce different kinds of metadata: embedded data
> > (usually register address/value pairs used to capture the frame, in a
> > more or less sensor-specific format), histograms (in a very
> > sensor-specific format), dark pixels, etc. The number of these formats
> > is probably going to be about as large as that of image data formats,
> > if not larger, as image data formats are much better standardised. Only
> > a smaller subset of them will be supported by V4L2, at least initially,
> > though possibly many more in the long run.
> >
> > Having this many device-specific formats would be a major problem for
> > all the other drivers along that pipeline (not to mention the users of
> > those drivers), including bridge drivers (e.g. CSI-2 to parallel) but
> > especially CSI-2 receiver drivers that have a DMA engine: the poor
> > driver developer would not only need to know the camera sensor specific
> > formats but also to choose the specific packing of each format suitable
> > for the DMA used by the hardware. It is unlikely many of these would
> > ever get tested despite being present in the driver's API. Also, adding
> > new sensors with new embedded data formats would involve updating all
> > bridge and CSI-2 receiver drivers. I don't expect this to be a workable
> > approach.
> >
> > Instead, what I'm proposing is to use the specific metadata formats
> > only on the sensor devices, on internal pads (more about those soon) of
> > the sensors, visible only in the UAPI, and then generic mbus formats
> > along the pipeline and finally generic V4L2 metadata formats on the
> > DMAs (specific to bit depth and packing). This would unsnarl the two
> > aspects, separating what the data is (specific mbus code) from how it
> > is transported and packed (generic mbus codes and V4L2 formats).
> >
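> > As a rough user-space fragment (error handling omitted; the internal
> > pad number and the sensor-specific mbus code name are made up for
> > illustration), the negotiation could look like this:
> >
> >         /* Sensor-specific metadata mbus code on the sensor's internal pad. */
> >         struct v4l2_subdev_format sd_fmt = {
> >                 .which = V4L2_SUBDEV_FORMAT_ACTIVE,
> >                 .pad = SENSOR_EMBEDDED_PAD,
> >                 .format.code = MEDIA_BUS_FMT_SENSOR_EMB_DATA,
> >         };
> >
> >         ioctl(sensor_fd, VIDIOC_SUBDEV_S_FMT, &sd_fmt);
> >
> >         /* Generic, packing-specific format on the metadata video node. */
> >         struct v4l2_format fmt = {
> >                 .type = V4L2_BUF_TYPE_META_CAPTURE,
> >                 .fmt.meta.dataformat = V4L2_META_FMT_GENERIC_8,
> >         };
> >
> >         ioctl(video_fd, VIDIOC_S_FMT, &fmt);
> >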
> > User space would be required to "know" the path of that data from the
> > sensor's internal pad to the V4L2 video node. I do not see this as a
> > problem, as these devices require at least some knowledge of the
> > pipeline, i.e. the hardware at hand. Separating what the data means
> > from how it is packed may even be beneficial: it allows separating the
> > code that interprets the data (sensor-internal mbus code) from the code
> > that accesses it (packing).
> >
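> > For instance, the path can be discovered by querying the routing of
> > each subdev with VIDIOC_SUBDEV_G_ROUTING. A minimal fragment, assuming
> > the internal pad is modelled as the sink end of the route (pad
> > numbering made up):
> >
> >         struct v4l2_subdev_route routes[16];
> >         struct v4l2_subdev_routing routing = {
> >                 .which = V4L2_SUBDEV_FORMAT_ACTIVE,
> >                 .num_routes = 16,
> >                 .routes = (__u64)(uintptr_t)routes,
> >         };
> >         unsigned int i;
> >
> >         ioctl(sensor_fd, VIDIOC_SUBDEV_G_ROUTING, &routing);
> >
> >         for (i = 0; i < routing.num_routes; i++)
> >                 if (routes[i].sink_pad == SENSOR_EMBEDDED_PAD)
> >                         break;  /* The data leaves the sensor on
> >                                    routes[i].source_pad,
> >                                    routes[i].source_stream. */
> >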
> > These formats are in practice line-based, meaning that there may be
> > padding at the end of each line, depending on the bus as well as the
> > DMA. If non-line-based formats are needed, it is always possible to set
> > the "height" field to 1.
> 
> One thing that may be worth considering or clarifying - for the case of
> the BCM2835 Unicam CSI-2 device, we only have two DMA output channels.
> So one will match image data packets, and the other will match
> "everything else". Typically "everything else" would only be CSI-2
> embedded data, but in the case of the Raspberry Pi Camera v3 (IMX708),
> it includes embedded data, PDAF data, and HDR histogram data. Each of
> these outputs can be programmed to use a different packet ID in the
> sensor, but since Unicam only has a single DMA channel for "everything
> else", it all gets dumped into one metadata buffer. But given we know
> the exact structure of the data streams, it's trivial for userland to
> find the right bits in this buffer. Of course, other CSI-2 receivers
> with more DMA channels might allow these streams to end up in their own
> buffers.
> 
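> As an illustration, with the per-stream line counts known, finding each
> part is simple pointer arithmetic (the line counts and layout below are
> made up):
>
>         /* Hypothetical layout: embedded data, then PDAF, then the
>            HDR histogram, each occupying whole lines. */
>         const unsigned int emb_lines = 2, pdaf_lines = 24;
>         uint8_t *emb = buf;
>         uint8_t *pdaf = buf + emb_lines * bytesperline;
>         uint8_t *hist = pdaf + pdaf_lines * bytesperline;
>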
> Nothing in your series seems to stop us from operating Unicam in this
> way, particularly because there is no fixed definition of the data
> format for V4L2_META_FMT_GENERIC_8. So I don't think it's a problem,
> but perhaps it's worth documenting that the metadata buffer might
> include multiple streams from the sensor?

I believe this happens on other hardware, too. Currently the
documentation says:

	Any number of routes from streams on sink pads towards streams on
	source pads is allowed, to the extent supported by drivers. For
	every stream on a source pad, however, only a single route is
	allowed.

	(Documentation/userspace-api/media/v4l/dev-subdev.rst)

This probably needs to be changed to allow what you'd need?
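
To illustrate, a routing table for the IMX708 case on the receiver could
then look like the below, where the three metadata streams share a single
stream on the source pad, which the quoted rule currently forbids (only a
single route per source stream; the pad and stream numbers are made up):

        struct v4l2_subdev_route routes[] = {
                /* Image data. */
                { .sink_pad = 0, .sink_stream = 0,
                  .source_pad = 1, .source_stream = 0,
                  .flags = V4L2_SUBDEV_ROUTE_FL_ACTIVE },
                /* Embedded data, PDAF and HDR histogram, all ending
                   up in the same metadata stream and buffer. */
                { .sink_pad = 0, .sink_stream = 1,
                  .source_pad = 1, .source_stream = 1,
                  .flags = V4L2_SUBDEV_ROUTE_FL_ACTIVE },
                { .sink_pad = 0, .sink_stream = 2,
                  .source_pad = 1, .source_stream = 1,
                  .flags = V4L2_SUBDEV_ROUTE_FL_ACTIVE },
                { .sink_pad = 0, .sink_stream = 3,
                  .source_pad = 1, .source_stream = 1,
                  .flags = V4L2_SUBDEV_ROUTE_FL_ACTIVE },
        };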

-- 
Kind regards,

Sakari Ailus


