Re: [Query] V4L2 Integer (?) menu control

Hi Sylwester,

On Fri, Nov 25, 2011 at 12:53:16AM +0100, Sylwester Nawrocki wrote:
> On 11/24/2011 09:57 PM, Sakari Ailus wrote:
> > Sylwester Nawrocki wrote:
> >> On 11/24/2011 09:50 AM, Sakari Ailus wrote:
> >>>
> >>> There isn't one currently, but I have patches for it. The issue is that I need
> >>> them myself but the driver I need them for isn't ready to be released yet.
> >>> And as usual, I assume users other than vivo are required to show they're
> >>> really useful, so I haven't sent them.
> >>
> >> That's great news. Then I might not need to do all the work on my own ;)
> > 
> > I hope mine will do. ;-)
> > 
> > I'm working on a 2.6.32 kernel (ouch!), so I haven't been able to test them
> > properly yet. Please provide feedback on them if you find any issues.
> > 
> >>>
> >>> Good that you asked so we won't end up writing essentially the same code
> >>> again. I'll try to send the patches today.
> >>
> >> All right, there is no rush. I was just looking into how to support the
> >> camera scene mode with sensors like the m5mols. The scene mode is essentially
> >> a combination of several different parameters, for some of which there are
> >> standard controls in V4L2 but for many there are not.
> > 
> > I fully agree with this approach. Scene modes should not be implemented at the
> > level of the V4L2 API. Instead, the parameters that the scene modes consist of
> > must be exposed separately through the V4L2 API, if that is the level of API
> > they belong to. Depending on your camera stack, the control algorithms could
> > reside in user space, which I believe is not the case with the M5-MOLS, however.
> 
> No, with these hybrid camera devices the algorithms are built into their own ISP.
> And there are quite a few advanced algorithms, e.g. auto focus/face detection,
> that are difficult to control at the subdevice API level.

Can you tell what makes it difficult?

> The issue is that the subdev API seems too low level for the device, but it's
> the only API available in user space ;)

...

> > This makes your user space depend on both the sensor and the ISP, but there's
> > really no way around that if both do non-trivial hardware-specific things.
> 
> I guess a dedicated library for the sensor itself is needed on top of the
> subdevice API to be able to use the advanced features. And even then the
> subdevice/V4L2 API is a limitation.

How is it a limitation?

The whole intent is to provide as standard a way as possible to access the
hardware features through an interface provided by the driver. So what is
missing, in your opinion? :-)
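
To make the intent concrete, here is a rough sketch (not taken from any real
driver; the structure, function names and control values below are just
illustration) of a sub-device driver registering the individual parameters as
standard controls instead of one opaque scene mode control:

#include <media/v4l2-ctrls.h>
#include <media/v4l2-subdev.h>

struct foo_sensor {
	struct v4l2_subdev sd;
	struct v4l2_ctrl_handler handler;
};

static int foo_s_ctrl(struct v4l2_ctrl *ctrl)
{
	/* Write ctrl->val to the relevant hardware register here. */
	return 0;
}

static const struct v4l2_ctrl_ops foo_ctrl_ops = {
	.s_ctrl = foo_s_ctrl,
};

static int foo_init_controls(struct foo_sensor *sensor)
{
	struct v4l2_ctrl_handler *hdl = &sensor->handler;

	v4l2_ctrl_handler_init(hdl, 3);

	/* Parameters a "night" or "sports" scene would otherwise bundle. */
	v4l2_ctrl_new_std(hdl, &foo_ctrl_ops, V4L2_CID_GAIN, 0, 255, 1, 16);
	v4l2_ctrl_new_std(hdl, &foo_ctrl_ops, V4L2_CID_SATURATION, 0, 255, 1, 128);
	v4l2_ctrl_new_std_menu(hdl, &foo_ctrl_ops, V4L2_CID_EXPOSURE_AUTO,
			       V4L2_EXPOSURE_APERTURE_PRIORITY, 0,
			       V4L2_EXPOSURE_AUTO);

	if (hdl->error) {
		int err = hdl->error;

		v4l2_ctrl_handler_free(hdl);
		return err;
	}

	sensor->sd.ctrl_handler = hdl;
	return 0;
}

User space can then combine these into whatever "scene" it likes, and the
driver remains a plain description of the hardware.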

> > I think we need to further standardise image processing configuration such as 
> > RGB-to-RGB matrices and gamma tables. This would make the ISP interfaces less 
> > hardware specific.
> 
> I guess first we need at least one more OMAP3 ISP-like device driver in mainline
> to identify common features and design APIs for them. On the other hand, gamma
> tables are also present in some embedded ISPs, e.g. in the s5k6aafx IIRC.

Or get more public specs for different ISPs. Or just read the existing specs
more. ;-) The OMAP 4 ISS spec is public. Even though it's from TI as well, it's
very different from the OMAP 3 ISP.
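
Coming back to the integer menu controls this thread started with, below is
roughly the shape I have in mind. The helper name, its signature and the
control ID are assumptions for the sake of illustration and may well differ
from what the actual patches contain:

#include <linux/kernel.h>
#include <media/v4l2-ctrls.h>

/* The menu "items" are 64-bit integers rather than strings. */
static const s64 foo_iso_qmenu[] = {
	100, 200, 400, 800, 1600,
};

static int foo_add_iso_ctrl(struct v4l2_ctrl_handler *hdl,
			    const struct v4l2_ctrl_ops *ops)
{
	/*
	 * Hypothetical helper: like v4l2_ctrl_new_std_menu(), but the menu
	 * entries come from an array of integers instead of strings.
	 */
	v4l2_ctrl_new_int_menu(hdl, ops, V4L2_CID_ISO_SENSITIVITY,
			       ARRAY_SIZE(foo_iso_qmenu) - 1, 0,
			       foo_iso_qmenu);

	return hdl->error;
}

With something like this, VIDIOC_QUERYMENU would report the numeric values of
the menu items instead of strings.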

-- 
Sakari Ailus
e-mail: sakari.ailus@xxxxxx	jabber/XMPP/Gmail: sailus@xxxxxxxxxxxxxx
--
To unsubscribe from this list: send the line "unsubscribe linux-media" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html

