Re: [PATCH v8 03/38] media: uapi: Add generic serial metadata mbus formats

On 20/03/2024 00:33, Laurent Pinchart wrote:
> On Tue, Mar 19, 2024 at 04:20:35PM +0200, Tomi Valkeinen wrote:
>> On 19/03/2024 15:27, Sakari Ailus wrote:
>>> On Thu, Mar 14, 2024 at 09:30:50AM +0200, Tomi Valkeinen wrote:
>>>> On 13/03/2024 09:24, Sakari Ailus wrote:
>>>>> Add generic serial metadata mbus formats. These formats describe data
>>>>> width and packing but not the content itself. The reason for specifying
>>>>> such formats is that the formats as such are fairly device-specific, but
>>>>> they are still handled by CSI-2 receiver drivers that should not be aware
>>>>> of device-specific formats. What makes generic metadata formats possible
>>>>> is that these formats are parsed by software only, after capturing the
>>>>> data to system memory.
>>>>>
>>>>> Also add a definition for "data unit" to cover what is essentially a pixel
>>>>> but is not image data.
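
For reference, being content-agnostic means a receiver driver can pick the
mbus code from the bits per data unit alone. A minimal sketch (the helper
name is made up; the MEDIA_BUS_FMT_META_* codes are the ones this patch
adds):

#include <linux/types.h>
#include <linux/media-bus-format.h>

/*
 * Hypothetical helper: map bits per data unit to the generic metadata
 * mbus code, knowing nothing about the (device-specific) content.
 * Returns 0 for unsupported widths.
 */
static u32 meta_mbus_code(unsigned int bpp)
{
        switch (bpp) {
        case 8:  return MEDIA_BUS_FMT_META_8;
        case 10: return MEDIA_BUS_FMT_META_10;
        case 12: return MEDIA_BUS_FMT_META_12;
        case 14: return MEDIA_BUS_FMT_META_14;
        case 16: return MEDIA_BUS_FMT_META_16;
        case 20: return MEDIA_BUS_FMT_META_20;
        case 24: return MEDIA_BUS_FMT_META_24;
        default: return 0;
        }
}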

>>>> The CCS spec talks about legacy packing and optimized packing for 16+ bit
>>>> formats. You cover only the "legacy" ones here. Did you look at those?

>>> The reason is that the bus data layout of the new packing at a higher bit
>>> depth matches the packing at a lower bit depth (half the bit depth, to be
>>> precise). That's why there's no need to define formats for the new packing
>>> methods at higher bit depths: the driver simply uses the packing at half
>>> of the bit depth.
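
To make the arithmetic concrete (a sketch; the helper is made up and
assumes units * bpp is a multiple of 8):

#include <linux/types.h>

/*
 * Bytes per line of 'units' data units at 'bpp' bits each, tightly
 * packed as on the CSI-2 bus (e.g. RAW10: four units in five bytes).
 */
static u32 meta_bytes_per_line(u32 units, u32 bpp)
{
        return units * bpp / 8;
}

/*
 * meta_bytes_per_line(640, 20) == meta_bytes_per_line(1280, 10) == 1600:
 * a 20-bit line with the new packing is indistinguishable on the bus from
 * a 10-bit line of twice the width, so no separate formats are needed.
 */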

>> Hmm. If we're capturing a 10-bit raw format, say, with a width of 640
>> pixels, we'll configure the video stream format according to those. For
>> the embedded stream, we'll set it to V4L2_META_FMT_GENERIC_CSI2_10 and
>> a width of 640, right?
>>
>> If we're capturing 20-bit raw, we'll again configure the video stream
>> format accordingly: width to 640, and a 20-bit fourcc/mbus code. If the
>> embedded stream uses the "legacy" packing, we'll set the format to
>> V4L2_META_FMT_GENERIC_CSI2_20 with a width of 640, right?
>>
>> But if it's using the optimized packing for the embedded stream, do we
>> set the format to V4L2_META_FMT_GENERIC_CSI2_10 and the width to 1280?
>>
>> Considering that the video and (line-based) embedded data come from the
>> same source, I'd expect the widths to be the same.
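
In userspace terms the third case would look something like this (a
sketch, assuming the width/height fields this series adds to struct
v4l2_meta_format; the two-line height is just an example value):

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/*
 * RAW20 image of 640 pixels, optimized packing on the embedded stream:
 * the metadata node gets the 10-bit generic format, width doubled.
 */
static int set_meta_fmt(int fd)
{
        struct v4l2_format fmt;

        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_META_CAPTURE;
        fmt.fmt.meta.dataformat = V4L2_META_FMT_GENERIC_CSI2_10;
        fmt.fmt.meta.width = 1280;      /* 2 * 640 image width */
        fmt.fmt.meta.height = 2;        /* example: two lines */

        return ioctl(fd, VIDIOC_S_FMT, &fmt);
}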

> I don't have a strong objection against multiplying the width, but we
> need to figure out the impact on other kernel space components, as well
> as on userspace. I suppose the media bus code for the embedded data
> stream on the sensor source pad when using optimized packing and
> capturing RAW20 images would be MEDIA_BUS_FMT_META_10? In that case I
> think the sensor driver should be able to handle the width calculations
> on its own, and the value would just be propagated by userspace.

Yes, I think it works. I just find it more logical if the widths of both
the video and embedded streams are the same (which is the case for all
other embedded formats).
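
Something along these lines on the sensor side, I suppose (a sketch with
made-up names; only the halved-bit-depth code and the doubled width come
from this discussion, and the two-line height is an example):

#include <linux/media-bus-format.h>
#include <linux/v4l2-mediabus.h>

/*
 * Hypothetical sensor-driver helper: derive the embedded data stream
 * format on the source pad from a RAW20 image format. With the
 * optimized packing the metadata goes out as 10-bit units, so the
 * code drops to MEDIA_BUS_FMT_META_10 and the width doubles.
 */
static void sensor_ed_fmt(const struct v4l2_mbus_framefmt *image,
                          bool optimized_packing,
                          struct v4l2_mbus_framefmt *meta)
{
        meta->width = image->width;
        meta->height = 2;               /* example: two lines */
        meta->code = MEDIA_BUS_FMT_META_20;

        if (optimized_packing) {
                meta->code = MEDIA_BUS_FMT_META_10;
                meta->width *= 2;       /* 10-bit layout, twice the units */
        }
}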

However, even the CCS spec says "for RAW16, RAW20, and/or RAW24 Visible
pixels, top-embedded data may instead be more optimally packed using RAW8,
RAW10, and/or RAW12 pixels", so that's in line with what Sakari suggests.
Although the spec specifically says "top-embedded", so does that mean the
optimized packing is not allowed for bottom-embedded data?

 Tomi
