Re: [RFC] Video events, version 2.2

On 11/13/2009 11:05 AM, Sakari Ailus wrote:
> Eino-Ville Talvala wrote:
>> I think we have a use case for events that would require correlating them with frames, although I agree that a buffer index would be far simpler to match against than a timestamp. The specific feature is letting the application know exactly what sensor settings were used for a given frame, which is essential for our slowly-developing computational camera API, which will be changing sensor parameters on nearly every frame boundary.
>>
>> I think one event is probably sufficient to encode the relevant register values of our sensor. Would you expect there to be any issue with having an event happen per frame?
>
> I do expect several events per frame from the AEWB, AF and histogram statistics, and no problems. :-)
>
> But if I understand correctly, the registers are some kind of metadata associated with the frame? That perhaps includes exposure time, gain etc. The events interface would be good for this if the metadata fits in a single v4l2_event structure. A new ioctl could be an alternative; perhaps it could be a private ioctl first.
>
> This is more or less comparable to the H3A statistics, IMO. So user space gets an event and can query the H3A data.
>
> Associating events with a single frame is slightly troublesome, since successful frame reception is only certain after it has already happened. There could be a metadata event followed by a receive buffer overflow that spoils the frame. In that case the field_count could just be incremented without dequeueing any buffers, though.

Right, all of the sensor settings that applied to that particular frame. We're changing the sensor settings on nearly every frame, so it's fairly key to keep track of them in some way, and events seem to be a far nicer solution than what we currently do (which involves abusing the frame input field, as that was the fastest thing I saw to hack in).
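To make the idea concrete, here's a rough user-space sketch of packing per-frame sensor settings into a fixed-size event payload. The 64-byte payload size, the struct layout, and all field names below are my own assumptions for illustration, not anything from the RFC:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical per-frame sensor metadata, sized to fit the
 * fixed-size payload of a single event (64 bytes assumed here). */
struct frame_metadata {
	uint32_t frame_count;   /* which frame these settings applied to */
	uint32_t exposure_us;   /* exposure time in microseconds */
	uint16_t analog_gain;   /* sensor gain, device-specific units */
	uint16_t digital_gain;
	uint32_t reserved[12];  /* room for more register values */
};

/* Pack the metadata into an event payload buffer. */
static void pack_metadata(uint8_t payload[64], const struct frame_metadata *m)
{
	/* Compile-time guarantee that everything fits in one event. */
	_Static_assert(sizeof(struct frame_metadata) <= 64,
		       "metadata must fit a single event payload");
	memset(payload, 0, 64);
	memcpy(payload, m, sizeof(*m));
}
```

The _Static_assert is the key point: if the metadata ever outgrows one event, it fails at compile time rather than silently truncating, which is when a separate ioctl (as suggested above) would become the better fit.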

Of course, the event queue and frame queue would have to be kept in sync, or we could just let the app discard events that apply to frames it never saw. As long as the event queue is a bit bigger than the frame queue, I don't think there'd be a problem in practice.
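A minimal sketch of that discard-and-match scheme, matching metadata events to dequeued frames by frame count and dropping events for frames the application never saw (struct and function names are hypothetical):

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical metadata event as the app would see it. */
struct metadata_event {
	uint32_t frame_count;
	uint32_t exposure_us;
};

/* Find the event matching a dequeued frame; returns NULL when the
 * event was lost or never generated (e.g. frame spoiled by overflow). */
static const struct metadata_event *
match_event(const struct metadata_event *q, int n, uint32_t frame_count)
{
	for (int i = 0; i < n; i++)
		if (q[i].frame_count == frame_count)
			return &q[i];
	return NULL;
}

/* Compact the queue, discarding events for frames the application
 * has already dropped; returns the number of events kept. */
static int drop_stale(struct metadata_event *q, int n, uint32_t oldest_live_frame)
{
	int kept = 0;
	for (int i = 0; i < n; i++)
		if (q[i].frame_count >= oldest_live_frame)
			q[kept++] = q[i];
	return kept;
}
```

With the event queue a bit larger than the frame queue, drop_stale() run after each dequeued frame keeps the two queues loosely in sync without any hard coupling.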

Eino-Ville Talvala
Camera 2.0 Project
Stanford University
--
To unsubscribe from this list: send the line "unsubscribe linux-media" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
