Re: [Media Summit] Imaging Sensor functionality

On Wed, Sep 07, 2022 at 01:42:16PM +0100, Dave Stevenson wrote:
> On Tue, 6 Sept 2022 at 18:53, Laurent Pinchart wrote:
> > On Tue, Sep 06, 2022 at 05:14:30PM +0100, Dave Stevenson wrote:
> > > Hi All.
> > >
> > > I realise that I'm in a slightly different position from many mainline
> > > Linux-media developers in that I see multiple use cases for the same
> > > sensor, rather than a driver predominantly being for one
> > > product/platform. I therefore want to look at generic solutions
> > > and fully featured drivers. Users get to decide the use cases, not the
> > > hardware designers.
> >
> > Could you clarify here what you mean by users and hardware designers?
> > Users can be understood as
> >
> > - Users of the camera sensor, i.e. OEMs designing a product
> > - Users of the hardware platform, i.e. software developers writing
> >   applications
> > - Users of the software, i.e. end-users
> 
> Users of the software.
> 
> Particularly on the Pi you have people using libcamera apps or Python
> bindings that want to be able to choose modes of operation without
> having to make kernel driver modifications.
> I generally don't mind if that is through userspace or DT, but the
> functionality should be exposed.
> 
> As an example, when the strobe signals were exposed for IMX477 we had
> people hooking up various high intensity strobe devices and other
> weird contraptions for synchronised events [1]. Can we replicate that
> sort of open-ended functionality in a standardised way within sensor
> kernel drivers so that the drivers are not constraining the use cases?

We have the same goal, so let's see if we can find a way to make it
happen :-)

> > Hardware designers could then equally mean
> >
> > - Sensor vendors
> > - SoC vendors
> > - Board vendors
> > - Product vendors
> 
> All of the above.
> 
> For those Product Vendors designing specific products based on an SoC
> and imaging sensor, if there is a defined mechanism that end users can
> get to, then they can also use it to configure their specific use
> case. Both cases therefore win. Hard coding their product's use case
> in a mainline driver limits other use cases.
> 
>   Dave
> 
> [1] https://forums.raspberrypi.com/viewtopic.php?t=281913
> 
> > > The issues I've raised are things that I've encountered and would
> > > benefit from a discussion to get views as to the direction that is
> > > perceived to be workable. I appreciate that some can not be solved
> > > immediately, but want to avoid too much bikeshedding in the first
> > > round of patches.
> > > What's realistic, and what pitfalls/limitations immediately jump out at people?
> > >
> > > Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing
> >
> > Thank you, I will review that ASAP.
> >
> > > See you on Monday.

-- 
Regards,

Laurent Pinchart
