On Thu, 29 Apr 2021 19:53:33 +0300, Laurent Pinchart
<laurent.pinchart@xxxxxxxxxxxxxxxx> wrote:

> Hi Mauro,
>
> On Thu, Apr 29, 2021 at 10:44:41AM +0200, Mauro Carvalho Chehab wrote:
> > On Thu, 29 Apr 2021 09:49:03 +0200, Marco Felsch wrote:
> > > On 21-04-29 04:51, Laurent Pinchart wrote:
> > > > On Tue, Apr 27, 2021 at 02:06:56PM +0200, Marco Felsch wrote:
> > > > > Add special 8/12bit bayer media bus format for the OnSemi AR0237IR
> > > > > camera sensor [1]. OnSemi calls this format RGB-IR, the pixel array
> > > > > with the interleaved IR pixels looks like:
> > > > >
> > > > >   | G  | R  | G  | B  | ...
> > > > >   +----+----+----+----+---
> > > > >   | IR | G  | IR | G  | ...
> > > > >   +----+----+----+----+---
> > > > >   | G  | B  | G  | R  | ...
> > > > >   +----+----+----+----+---
> > > > >   | IR | G  | IR | G  | ...
> > > > >   +----+----+----+----+---
> > > > >   | .. | .. | .. | .. | ..
> > > > >
> > > > > [1] https://www.framos.com/media/pdf/96/ac/8f/AR0237CS-D-PDF-framos.pdf
> > > >
> > > > I think we're reaching a limit of the media bus codes model here, due
> > > > to a historical mistake. The four possible Bayer patterns, times the
> > > > different number of bits per pixel, creates a lot of media bus codes,
> > > > and drivers for CSI-2 receivers and IP cores further down the pipeline
> > > > have to support them all.
> > >
> > > That's correct but it is not bayer related.
> >
> > Err... there are two separate things here:
> >
> > 1) for the uAPI part, we're not even close to the limit of a 4-byte
> >    fourcc;
> >
> > 2) the kAPI is currently sharing the same fourcc as the uAPI,
> >    because it is a lot simpler than doing something different.
>
> Please note that we're talking about media bus codes here, not pixel
> formats. Both are part of the UAPI though, and pixel formats suffer from
> a similar issue, but I'd like to focus on the media bus codes first.

Yes, I'm aware of that, but the same principle used by fourcc pixel
formats can also be applied to media bus codes, and vice versa [1].

[1] IMO, a kAPI change like that should consider the big picture and
    allow using the same process for both, even if we start by
    implementing it for media bus codes (where it makes more sense).

In both cases, we're talking about a 32-bit code (either encoded as a
fourcc or via the MEDIA_BUS_FMT_* codespace). Both can be 1:1 mapped to
some structure similar to:

	enum v4l2_pixformat_type {
		VIDEO_PIXFORMAT_RGB,
		VIDEO_PIXFORMAT_YUV,
		VIDEO_PIXFORMAT_COMPRESSED,
		VIDEO_PIXFORMAT_BAYER_RGB,
		VIDEO_PIXFORMAT_BAYER_RGB_IR,
	};

	struct v4l2_pixformat_desc {
		enum v4l2_pixformat_type pixfmt_type;
		bool is_packed;
		int bits_per_component;
		union {
			enum v4l2_pixformat_rgb_order rgb_order;
			enum v4l2_pixformat_yuv_order yuv_order;
			enum v4l2_pixformat_bayer_rgb_order bayer_rgb_order;
			enum v4l2_pixformat_bayer_rgb_ir_order bayer_rgb_ir_order;
			enum v4l2_pixformat_compress_type compress_type;
		};
		...
	};

New drivers could then use such a struct instead of handling the
fourcc/mbus code directly. Also, this can be implemented gradually, in
order to avoid having to touch the existing drivers.

Thanks,
Mauro
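
For illustration only, here is a minimal, self-contained sketch (plain
C, compile-testable outside the kernel) of how a driver could consume
such a descriptor table instead of switching over every MEDIA_BUS_FMT_*
value. The descriptor contents, the extra .code member used as the
lookup key, the EXAMPLE_FMT_* values and the find_desc() helper are all
made up for this sketch; they are not part of any existing kernel API:

	/*
	 * Hypothetical sketch: names and values below are placeholders,
	 * not existing kernel definitions.
	 * Build (userspace): gcc -std=c11 -o desc-sketch desc-sketch.c
	 */
	#include <stdbool.h>
	#include <stddef.h>
	#include <stdint.h>
	#include <stdio.h>

	enum v4l2_pixformat_type {
		VIDEO_PIXFORMAT_RGB,
		VIDEO_PIXFORMAT_YUV,
		VIDEO_PIXFORMAT_COMPRESSED,
		VIDEO_PIXFORMAT_BAYER_RGB,
		VIDEO_PIXFORMAT_BAYER_RGB_IR,
	};

	/* Placeholder component orders; real enums would be more complete. */
	enum v4l2_pixformat_bayer_rgb_order    { BAYER_ORDER_RGGB, BAYER_ORDER_BGGR };
	enum v4l2_pixformat_bayer_rgb_ir_order { BAYER_IR_ORDER_GRBG_IGIG };

	struct v4l2_pixformat_desc {
		uint32_t code;			/* mbus code (or fourcc) used as key */
		enum v4l2_pixformat_type pixfmt_type;
		bool is_packed;
		int bits_per_component;
		union {
			enum v4l2_pixformat_bayer_rgb_order bayer_rgb_order;
			enum v4l2_pixformat_bayer_rgb_ir_order bayer_rgb_ir_order;
		};
	};

	/* Made-up code values, standing in for MEDIA_BUS_FMT_* constants. */
	#define EXAMPLE_FMT_SRGGB12	0x1001
	#define EXAMPLE_FMT_SRGGB_IR8	0x1002

	static const struct v4l2_pixformat_desc format_descs[] = {
		{
			.code = EXAMPLE_FMT_SRGGB12,
			.pixfmt_type = VIDEO_PIXFORMAT_BAYER_RGB,
			.is_packed = false,
			.bits_per_component = 12,
			.bayer_rgb_order = BAYER_ORDER_RGGB,
		},
		{
			.code = EXAMPLE_FMT_SRGGB_IR8,
			.pixfmt_type = VIDEO_PIXFORMAT_BAYER_RGB_IR,
			.is_packed = false,
			.bits_per_component = 8,
			.bayer_rgb_ir_order = BAYER_IR_ORDER_GRBG_IGIG,
		},
	};

	/* Resolve the 32-bit code once; afterwards only the fields matter. */
	static const struct v4l2_pixformat_desc *find_desc(uint32_t code)
	{
		size_t i;

		for (i = 0; i < sizeof(format_descs) / sizeof(format_descs[0]); i++)
			if (format_descs[i].code == code)
				return &format_descs[i];

		return NULL;
	}

	int main(void)
	{
		const struct v4l2_pixformat_desc *desc;

		desc = find_desc(EXAMPLE_FMT_SRGGB_IR8);
		if (!desc)
			return 1;

		/*
		 * Instead of a switch over every individual format code, a
		 * receiver only checks the family and per-component depth.
		 */
		printf("type=%d, bits_per_component=%d, packed=%d\n",
		       desc->pixfmt_type, desc->bits_per_component,
		       desc->is_packed);

		return 0;
	}

A new RGB-IR variant would then only add one table entry (and, if
needed, one new order enum value), rather than new cases in every
receiver driver down the pipeline.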