Hi Laurent,

On Tue, Jul 04, 2023 at 03:57:20PM +0300, Laurent Pinchart wrote:
> Hi Sakari,
>
> On Fri, Jun 30, 2023 at 11:43:31PM +0300, Sakari Ailus wrote:
> > Hi folks,
> >
> > Here are a few patches to add support for generic, line-based metadata
> > as well as internal source pads. While the amount of code is not very
> > large (quite small, actually, IMO), I presume what this is about and
> > why it is being proposed requires some explaining.
> >
> > Metadata mbus codes and formats have existed for some time in V4L2.
> > However, they have only been used by drivers that produce the data
> > themselves, and effectively this metadata has always been statistics
> > of some sort (at least when it comes to ISPs). What is different here
> > is that we intend to add support for metadata originating from camera
> > sensors.
>
> I've just realized that, unless I'm mistaken, we've never documented
> which of the v4l2_mbus_framefmt fields are valid for metadata formats.
> Should this be fixed as part of this series ?

Good point. I'll add a patch to document this. It's bound to be dependent
on the mbus code, as we don't have (which is a good thing) any indication
of the type of the data there.

> > Camera sensors produce different kinds of metadata: embedded data
> > (usually register address--value pairs used to capture the frame, in
> > a more or less sensor-specific format), histograms (in a very
> > sensor-specific format), dark pixels, etc. The number of these
> > formats is probably going to be about as large as that of image data
> > formats, if not larger, as the image data formats are much better
> > standardised, but a smaller subset of them will be supported by V4L2,
> > at least initially, though possibly many more in the long run.
> >
> > Having this many device-specific formats would be a major problem for
> > all the other drivers along that pipeline (not to mention the users
> > of those drivers), including bridge (e.g. CSI-2 to parallel) but
> > especially CSI-2 receiver drivers that have DMA: the poor driver
> > developer would not only need to know camera sensor-specific formats
> > but also to choose the specific packing of that format suitable for
> > the DMA used by the hardware. It is unlikely many of these would ever
> > get tested while being present in the driver API. Also, adding new
> > sensors with new embedded data formats would involve updating all
> > bridge and CSI-2 receiver drivers. I don't expect this to be a
> > workable approach.
> >
> > Instead, what I'm proposing is to use specific metadata formats on
> > the sensor devices only, on internal pads (more about those soon) of
> > the sensors, only visible in the UAPI, and then generic mbus formats
> > along the pipeline and finally generic V4L2 metadata formats on the
> > DMAs (specific to bit depth and packing). This would unsnarl the two,
> > defining what data there is (specific mbus code) and how that is
> > transported and packed (generic mbus codes and V4L2 formats).
> >
> > The user space would be required to "know" the path of that data from
> > the sensor's internal pad to the V4L2 video node. I do not see this
> > as a problem, as these devices require at least some knowledge of the
> > pipeline, i.e. the hardware at hand. Separating what the data means
> > from how it is packed may even be beneficial: it allows separating
> > code that interprets the data (sensor internal mbus code) from the
> > code that accesses it (packing).
> >
> > These formats are in practice line-based, meaning that there may be
> > padding at the end of the line, depending on the bus as well as the
> > DMA. If non-line-based formats are needed, it is always possible to
> > set the "height" field to 1.
> >
> > The internal (source) pads are an alternative to source routes [1].
> > The source routes were not universally liked, and I do have to say I
> > like re-using existing interface concepts (pads and everything you
> > can do with pads, including accessing formats, selections, etc.)
> > wherever it makes sense, instead of duplicating functionality.
> >
> > Effectively, internal source pads behave mostly just like sink pads,
> > but they describe a flow of data that originates from a sub-device
> > instead of arriving at a sub-device. The SUBDEV_S_ROUTING IOCTLs are
> > used to enable and disable routes from internal source pads to a
> > sub-device's source pads. The subdev format IOCTLs are usable, too,
> > so one can find out which subdev format is available on a given
> > internal source pad.
> >
> > This set depends on these patches:
> >
> > <URL:https://lore.kernel.org/linux-media/20230505205416.55002-1-sakari.ailus@xxxxxxxxxxxxxxx/T/#t>
> >
> > I've also pushed these here and I'll keep updating the branch:
> >
> > <URL:https://git.linuxtv.org/sailus/media_tree.git/log/?h=metadata>
> >
> > Questions and comments are most welcome.
> >
> > Driver support is to be added, as well as an example.
>
> Jacopo and I are working on this, including libcamera support.

Thanks, this is much appreciated!

--
Kind regards,

Sakari Ailus