Re: [PATCH 1/6] media: uapi: Add MEDIA_BUS_FMT_SGRGB_IGIG_GBGR_IGIG media bus formats

Hi Marco,

On Fri, Apr 30, 2021 at 08:51:34AM +0200, Marco Felsch wrote:
> On 21-04-30 01:14, Laurent Pinchart wrote:
> > On Thu, Apr 29, 2021 at 09:49:03AM +0200, Marco Felsch wrote:
> > > On 21-04-29 04:51, Laurent Pinchart wrote:
> > > > On Tue, Apr 27, 2021 at 02:06:56PM +0200, Marco Felsch wrote:
> > > > > Add special 8/12-bit Bayer media bus formats for the OnSemi AR0237IR
> > > > > camera sensor [1]. OnSemi calls this format RGB-IR; the pixel array
> > > > > with the interleaved IR pixels looks like:
> > > > > 
> > > > >         |  G |  R |  G |  B | ...
> > > > >         +----+----+----+----+---
> > > > >         | IR |  G | IR |  G | ...
> > > > >         +----+----+----+----+---
> > > > >         |  G |  B |  G |  R | ...
> > > > >         +----+----+----+----+---
> > > > >         | IR |  G | IR |  G | ...
> > > > >         +----+----+----+----+---
> > > > >         | .. | .. | .. | .. | ..
> > > > > 
> > > > > [1] https://www.framos.com/media/pdf/96/ac/8f/AR0237CS-D-PDF-framos.pdf
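
(For readers skimming the thread: the proposed name spells out the rows of
the 4x4 block above, GRGB / IGIG / GBGR / IGIG, with I standing for IR. In
include/uapi/linux/media-bus-format.h the new codes would presumably end up
looking something like the sketch below; the bit-width suffixes and the
numeric values are only my illustration, not taken from the patch itself.

        /* Illustrative only -- suffixes and values are not from the patch. */
        #define MEDIA_BUS_FMT_SGRGB_IGIG_GBGR_IGIG8_1X8     0x3100
        #define MEDIA_BUS_FMT_SGRGB_IGIG_GBGR_IGIG12_1X12   0x3101
)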
> > > > 
> > > > I think we're reaching a limit of the media bus codes model here, due to
> > > > a historical mistake. The four possible Bayer patterns, multiplied by the
> > > > different numbers of bits per pixel, create a lot of media bus codes, and
> > > > drivers for CSI-2 receivers and IP cores further down the pipeline have
> > > > to support them all.
> > > 
> > > That's correct, but it is not Bayer-related. Currently it is what it is:
> > > if a new code is added, it must be propagated through all the involved
> > > subdevs. On the other hand, I wouldn't say that it is better to support
> > > new codes by default in all drivers, since this would add a lot of
> > > untested code paths.
> > 
> > It's not an issue limited to Bayer patterns, but they make the issue
> > worse given the number of possible combinations (think about adding DPCM
> > and ALAW compression on top of that...).
> 
> You're right, and again, this will apply to all mbus formats...
> 
> > > > This is already painful, and if we accept a
> > > > non-Bayer pattern such as this one,
> > > 
> > > That's not exactly true, since it is a Bayer pattern, but instead of using
> > > 4x4 it uses 8x8 and it has some special pixels.
> > > 
> > > > we'll open the door to an explosion
> > > > of the number of media bus codes (imagine all the different possible
> > > > arrangements of this pattern, for instance if you enable horizontal
> > > > and/or vertical flipping on the sensor). All drivers would need to be
> > > > updated to support these new bus codes, and this really kills the
> > > > current model.
> > > 
> > > Yep, I know what you mean, but as I said above, I think that adding it
> > > explicitly is the better approach, since it involves someone who adds _and_
> > > tests the new code on the particular platform.
> > > 
> > > > The historical mistake was to tie the Bayer pattern to the media bus
> > > > code. We should really have specified raw 8/10/12/14/16 media bus codes,
> > > > and conveyed the pattern separately. Most IP cores in the pipeline don't
> > > > need to care about the pattern at all, and those that do (that's mostly
> > > > ISPs) could be programmed explicitly by userspace, as long as we have an
> > > > API to retrieve the pattern from the sensor. I believe it's time to bite
> > > > the bullet and go in that direction. I'm sorry for this case of yak
> > > > shaving, but it really wouldn't be manageable anymore :-(
> > > 
> > > I get all your points and would agree, but this is not a Bayer-only
> > > problem. You will have this problem with all other new formats
> > > as well. I'm with you, most IP cores don't care, but I wouldn't
> > > guarantee that.
> > 
> > Sorry, but adding new media bus formats like this one will just not
> > scale. We have two options: either fix the issue, or consider that
> > V4L2 is a barely alive API with no future and merge this without
> > caring anymore.
> 
> Hm... you're right that it doesn't scale; as I said, I'm absolutely on
> your side. So let us consider a new approach. @Mauro, @Hans, @Sailus,
> what do you think?

To start brainstorming: how about new media bus codes for
raw{8,10,12,14,16}, and a read-only CFA pattern control to retrieve the
pattern from the sensor subdev? We could use the same control to set
the pattern on subdevs that require it, which would mostly be ISPs. As
ISPs are configured using parameter buffers these days, it may be better
to pass the pattern in the parameter buffer instead, though.
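
To make that a bit more concrete, here is a minimal sketch of how a sensor
driver could register such a read-only pattern control. V4L2_CID_CFA_PATTERN,
the enum and the menu strings are all made up for illustration; only the
v4l2_ctrl_* infrastructure used to register them exists today.

        #include <media/v4l2-ctrls.h>

        /* Hypothetical control ID, not part of the current UAPI. */
        #define V4L2_CID_CFA_PATTERN    (V4L2_CID_USER_BASE + 0x10f0)

        enum sensor_cfa_pattern {
                SENSOR_CFA_RGGB,
                SENSOR_CFA_GRBG,
                SENSOR_CFA_GBRG,
                SENSOR_CFA_BGGR,
                SENSOR_CFA_RGBIR_4X4,   /* GRGB / IGIG / GBGR / IGIG */
        };

        static const char * const sensor_cfa_pattern_menu[] = {
                "RGGB", "GRBG", "GBRG", "BGGR", "RGB-IR 4x4",
        };

        /* No .ops needed for a static, read-only control. */
        static const struct v4l2_ctrl_config sensor_cfa_pattern_ctrl = {
                .id     = V4L2_CID_CFA_PATTERN,
                .name   = "CFA Pattern",
                .type   = V4L2_CTRL_TYPE_MENU,
                .max    = SENSOR_CFA_RGBIR_4X4,
                .def    = SENSOR_CFA_RGBIR_4X4,
                .qmenu  = sensor_cfa_pattern_menu,
                .flags  = V4L2_CTRL_FLAG_READ_ONLY,
        };

        static int sensor_init_cfa_ctrl(struct v4l2_ctrl_handler *hdl)
        {
                /*
                 * Read-only on the sensor; an ISP subdev could register the
                 * same control without the READ_ONLY flag (or take the
                 * pattern through its parameter buffer) to be told what to
                 * expect.
                 */
                v4l2_ctrl_new_custom(hdl, &sensor_cfa_pattern_ctrl, NULL);

                return hdl->error;
        }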

This shouldn't be too hard to implement, but the devil is of course in
the details, and we should consider how to handle the pattern control
when flipping and/or cropping is configured on the sensor.
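
On the flipping part, at least for the plain Bayer orders the adjustment is
mechanical: a horizontal flip swaps the two columns of the 2x2 block
(RGGB <-> GRBG, GBRG <-> BGGR) and a vertical flip swaps its rows
(RGGB <-> GBRG, GRBG <-> BGGR); cropping with an odd left/top offset shifts
the phase the same way. A rough sketch, reusing the hypothetical enum from
above (the RGB-IR 4x4 block would need its own table):

        static enum sensor_cfa_pattern
        sensor_cfa_apply_flips(enum sensor_cfa_pattern pat, bool hflip,
                               bool vflip)
        {
                /* Indexed as [pattern][vflip][hflip] for the 2x2 block. */
                static const enum sensor_cfa_pattern map[][2][2] = {
                        [SENSOR_CFA_RGGB] = {
                                { SENSOR_CFA_RGGB, SENSOR_CFA_GRBG },
                                { SENSOR_CFA_GBRG, SENSOR_CFA_BGGR },
                        },
                        [SENSOR_CFA_GRBG] = {
                                { SENSOR_CFA_GRBG, SENSOR_CFA_RGGB },
                                { SENSOR_CFA_BGGR, SENSOR_CFA_GBRG },
                        },
                        [SENSOR_CFA_GBRG] = {
                                { SENSOR_CFA_GBRG, SENSOR_CFA_BGGR },
                                { SENSOR_CFA_RGGB, SENSOR_CFA_GRBG },
                        },
                        [SENSOR_CFA_BGGR] = {
                                { SENSOR_CFA_BGGR, SENSOR_CFA_GBRG },
                                { SENSOR_CFA_GRBG, SENSOR_CFA_RGGB },
                        },
                };

                if (pat > SENSOR_CFA_BGGR)
                        return pat;     /* RGB-IR not handled in this sketch */

                return map[pat][vflip][hflip];
        }

The sensor driver would then fold the flip controls (and the parity of the
crop rectangle's left/top offsets) into the value it reports through the
pattern control.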

> BTW: IMHO the videobuf2 interface isn't that good either, since we're
> handing out blame anyway ;)

Have you looked at videobuf1? ;-) Jokes aside, there's certainly room
for improvement, but it hasn't struck me as a particularly bad part of
the framework. Is there anything in particular you think is painful?

-- 
Regards,

Laurent Pinchart


