Re: [PATCH v8 03/38] media: uapi: Add generic serial metadata mbus formats

On Wed, Mar 20, 2024 at 08:48:17AM +0000, Sakari Ailus wrote:
> On Wed, Mar 20, 2024 at 01:00:48AM +0200, Laurent Pinchart wrote:
> > On Wed, Mar 20, 2024 at 12:33:48AM +0200, Laurent Pinchart wrote:
> > > On Tue, Mar 19, 2024 at 04:20:35PM +0200, Tomi Valkeinen wrote:
> > > > On 19/03/2024 15:27, Sakari Ailus wrote:
> > > > > On Thu, Mar 14, 2024 at 09:30:50AM +0200, Tomi Valkeinen wrote:
> > > > >> On 13/03/2024 09:24, Sakari Ailus wrote:
> > > > >>> Add generic serial metadata mbus formats. These formats describe the
> > > > >>> data width and packing but not the content itself. The reason for
> > > > >>> specifying such formats is that the formats themselves are fairly
> > > > >>> device-specific, yet they are still handled by CSI-2 receiver drivers
> > > > >>> that should not need to be aware of device-specific formats. What makes
> > > > >>> generic metadata formats possible is that these formats are parsed by
> > > > >>> software only, after the data has been captured to system memory.
> > > > >>>
> > > > >>> Also add a definition for "Data unit" to cover what is essentially a pixel
> > > > >>> but is not image data.
> > > > >>
> > > > >> The CCS spec talks about legacy packing and optimized packing for 16+ bit
> > > > >> formats. You cover only the "legacy" ones here. Did you look at those?
> > > > > 
> > > > > The reason is that the bus data layout of the new packing at higher bit
> > > > > depths matches the packing at lower bit depths (half the bit depth, to be
> > > > > precise). That's why there's no need to define formats for the new packing
> > > > > methods at higher bit depths: the driver simply uses the packing at half
> > > > > of the bit depth.
> > > > 
> > > > Hmm. If we're capturing a 10-bit raw format, say, with a width of 640
> > > > pixels, we'll configure the video stream format accordingly. For the
> > > > embedded stream, we'll set the format to V4L2_META_FMT_GENERIC_CSI2_10
> > > > and a width of 640, right?
> > > > 
> > > > If we're capturing 20-bit raw, we'll again configure the video stream
> > > > format accordingly, with a width of 640 and a 20-bit fourcc/mbus code. If
> > > > the embedded stream uses the "legacy" packing, we'll set the format to
> > > > V4L2_META_FMT_GENERIC_CSI2_20 with a width of 640, right?
> > > > 
> > > > But if it's using the packed format for the embedded stream, we set the
> > > > format to V4L2_META_FMT_GENERIC_CSI2_10 and the width to 1280?
> > > > 
> > > > Considering that the video and (line-based) embedded data come from the 
> > > > same source, I'd expect the widths to be the same.
> > > 
> > > I don't have a strong objection against multiplying the width, but we
> > > need to figure out the impact on other kernel space components, as well
> > > as on userspace. I suppose the media bus code for the embedded data
> > > stream on the sensor source pad when using optimized packing and
> > > capturing RAW20 images would be MEDIA_BUS_FMT_META_10? In that case I
> > > think the sensor driver should be able to handle the width calculations
> > > on its own, and the value would just be propagated by userspace.
> > 
> > This should be documented somewhere in this series by the way (not in
> > this patch).
> 
> This could go to the CCS driver documentation. I modified the last
> paragraph and added a new one:
> 
> ------8<-----------
> Devices supporting embedded data at bit depths of 16 or more may support denser
> packing, the legacy packing of a single metadata byte per data unit, or both.
> The supported embedded data formats can be enumerated and configured on stream 1
> of the source pad (1) of the CCS source sub-device.
> 
> The use of the denser packing results in embedded data lines that are longer
> than the pixel data lines when measured in data units, since the data units are
> smaller. Measured in bytes, the embedded data lines are still no longer than
> the image data lines.

Please document explicitly that e.g. V4L2_META_FMT_GENERIC_CSI2_10 is
used for the RAW20 denser packing (in a general way that covers the
other formats too). You should also explain more explicitly that the
width is doubled in the relevant uAPI data structures.

This is not limited to CCS but is applicable to other sensors too, so
I'd like that documentation to be in a more generic place.
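To make the width handling concrete, here's a rough, untested sketch of the
kind of subdev configuration I have in mind for the RAW20 + denser packing
case, assuming the MEDIA_BUS_FMT_META_* codes from this series. The pad,
stream and height values are just placeholders borrowed from your CCS example
(source pad 1, stream 1), not something this series mandates:

#include <sys/ioctl.h>
#include <linux/media-bus-format.h>
#include <linux/v4l2-subdev.h>
#include <linux/videodev2.h>

/*
 * Rough sketch only: RAW20 image data, 640 pixels wide, with the denser
 * embedded data packing. The embedded data stream then uses the half-depth
 * metadata code and twice the width in data units. Pad, stream and height
 * are placeholders.
 */
static int set_embedded_data_format(int subdev_fd)
{
	struct v4l2_subdev_format fmt = {
		.which = V4L2_SUBDEV_FORMAT_ACTIVE,
		.pad = 1,			/* CCS source pad */
		.stream = 1,			/* embedded data stream */
		.format = {
			.code = MEDIA_BUS_FMT_META_10,	/* half of 20 bits */
			.width = 1280,			/* 2 * 640 data units */
			.height = 2,			/* embedded data lines */
			.field = V4L2_FIELD_NONE,
		},
	};

	return ioctl(subdev_fd, VIDIOC_SUBDEV_S_FMT, &fmt);
}

The corresponding video node format would then be
V4L2_META_FMT_GENERIC_CSI2_10 with the same doubled width.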

> ------8<-----------
> 
> I believe the reason the specs require embedded data lines not to be longer
> (in bytes) is most likely that some hardware may otherwise have issues
> receiving the data.

-- 
Regards,

Laurent Pinchart



