Re: v4l2-compliance: input and output versus Media UAPI V4L doc

Hi,

Le vendredi 24 janvier 2025 à 12:50 +1100, Stephen Wade a écrit :
> This is my first question - any feedback or direction is welcome.
> 
> I am trying to 'square' the documentation available for Media UAPI V4L
> with the expected behaviour.
> 
> In particular I'm trying to match the following line from Media UAPI
> Video for Linux §1.1.3:
> 
> > Today each V4L2 device node supports just one function.

The definition of "function" in that document is extremely vague, and there is
room for improvement.

Although that class of device already existed, the text does not mention
Memory-to-Memory devices. These are single devices on which you must allocate
two sets of buffers: one for OUTPUT (which technically is the driver's input
stream) and one for CAPTURE. Both must be set up on the same instance, since
each call to open() gives you a fresh instance; unlike other devices, there is
no global state.
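
Roughly, queue setup on such a node looks like this (untested sketch; the
device path and buffer counts are only examples):

/* Untested sketch: one open() gives one m2m instance, and both
 * queues (OUTPUT and CAPTURE) are allocated on that same fd.
 * Error handling omitted for brevity. */
#include <fcntl.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);

    struct v4l2_requestbuffers out_req = {
        .count  = 4,
        .type   = V4L2_BUF_TYPE_VIDEO_OUTPUT,  /* driver input stream */
        .memory = V4L2_MEMORY_MMAP,
    };
    ioctl(fd, VIDIOC_REQBUFS, &out_req);

    struct v4l2_requestbuffers cap_req = {
        .count  = 4,
        .type   = V4L2_BUF_TYPE_VIDEO_CAPTURE, /* driver output stream */
        .memory = V4L2_MEMORY_MMAP,
    };
    ioctl(fd, VIDIOC_REQBUFS, &cap_req);

    return 0;
}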

> 
> with the statement in § 3.2 Streaming I/O:
> 
> > A driver can support many sets of buffers. Each set is identified by a unique buffer type value. The sets are independent and each set can hold a different type of data. To access different sets at the same time different file descriptors must be used. [footnote]One could use one file descriptor and set the buffer type field accordingly when calling ioctl VIDIOC_QBUF, VIDIOC_DQBUF etc., but it makes the select() function ambiguous. We also like the clean approach of one file descriptor per logical stream...[/footnote]
> 
> This latter quote reads as though it should be possible to open() a
> node twice (i.e. two file handles) to access the different buffer
> types, e.g. V4L2_BUF_TYPE_VIDEO_CAPTURE and then
> V4L2_BUF_TYPE_VIDEO_OUTPUT, but this is not compliant - see e.g. the
> tests in v4l2-compliance.cpp within v4l-utils:
> 
> if (dcaps & input_caps)
>         fail_on_test(dcaps & output_caps);
> 
> 1. Is § 3.2 actually meant to say "To access different sets at the
> same time different _device nodes_ must be used."? Or could it state
> something like "To access different sets available to a node,
> different file descriptors must be used". Mind you, I can't think of a
> situation where concurrent access to different buffer-types makes
> sense (but I am a newbie).

That is true notably for UVC, with video capture and metadata capture. Note
though that capabilities are not 1:1 with the available queue types (see
V4L2_CAP_VIDEO_M2M, which is what makes memory-to-memory devices possible
without breaking that conformance test).
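
As a rough illustration (sketch only; fd is assumed to be an already-open
video node), the distinction shows up in device_caps from VIDIOC_QUERYCAP:

#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Sketch: classify an already-open node by its device_caps. */
static void classify_node(int fd)
{
    struct v4l2_capability cap = {0};

    ioctl(fd, VIDIOC_QUERYCAP, &cap);

    if (cap.device_caps & (V4L2_CAP_VIDEO_M2M | V4L2_CAP_VIDEO_M2M_MPLANE)) {
        /* one m2m function, yet two queue types (OUTPUT + CAPTURE) behind it */
    } else if (cap.device_caps & V4L2_CAP_VIDEO_CAPTURE) {
        /* video capture function */
    } else if (cap.device_caps & V4L2_CAP_META_CAPTURE) {
        /* metadata capture function (e.g. a UVC metadata node) */
    } else if (cap.device_caps & V4L2_CAP_VIDEO_OUTPUT) {
        /* video output function */
    }
}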

I think what is impossible with the API is to have multiple queues and
arbitrarily pick one while leaving the other unused. With multiple devices, the
streaming state (streamon/off) becomes per queue, removing this limitation. As
a bonus, it also allows different processes to drive different queues.

To make an audio analogy, it's a bit like having N channels in one connector
vs. having N connectors.
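
In code, the per-queue split looks roughly like this (sketch; the two nodes
stand for a UVC-style video + metadata pair, and the paths are only examples):

#include <fcntl.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
    /* Sketch: two functions exposed as two nodes, so each fd carries
     * its own streaming state and a different process could own each
     * queue. Buffer allocation and queueing omitted for brevity. */
    int video_fd = open("/dev/video0", O_RDWR);
    int meta_fd  = open("/dev/video1", O_RDWR);

    enum v4l2_buf_type vid_type  = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    enum v4l2_buf_type meta_type = V4L2_BUF_TYPE_META_CAPTURE;

    /* Each queue is started and stopped independently of the other;
     * on a single m2m node, by contrast, both queues are expected to
     * be driven together by the same instance. */
    ioctl(video_fd, VIDIOC_STREAMON, &vid_type);
    ioctl(meta_fd, VIDIOC_STREAMON, &meta_type);

    return 0;
}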

> 
> 2. Could (or should) § 1.1.3 be more explicit that "function" is a
> combination of input vs output, and medium (e.g. video, radio, audio)?

I agree there is room for improvement. From my reading, one function matches
one capability. Nothing prevents us from introducing a function that requires
two or more queues, as long as they all need to be operated at the same time,
with the same streaming state.

Nicolas

> 
> Kind regards,
> -Stephen
> 





