Add a bit-depth change as one more source change that can occur in the
middle of the stream. In the worst case the stream starts out as 8-bit
but later in the bitstream switches to 10-bit. Such a change must be
propagated to the client so that it can take the appropriate action:
stop streaming on the CAPTURE queue, re-negotiate the pixel format,
allocate new buffers and start streaming again.

Signed-off-by: Stanimir Varbanov <stanimir.varbanov@xxxxxxxxxx>
---
 Documentation/userspace-api/media/v4l/dev-decoder.rst | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
index 606b54947e10..45b31262f360 100644
--- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
+++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
@@ -906,7 +906,9 @@ reflected by corresponding queries):
 
 * visible resolution (selection rectangles),
 
-* the minimum number of buffers needed for decoding.
+* the minimum number of buffers needed for decoding,
+
+* the bit-depth of the bitstream.
 
 Whenever that happens, the decoder must proceed as follows:
 
--
2.17.1
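
For illustration, the client-side sequence described in the commit
message could look roughly like the sketch below. This is a sketch
only, not part of the patch: it assumes a multi-planar stateful
decoder node in fd, MMAP buffers, and that a V4L2_EVENT_SOURCE_CHANGE
event has already been dequeued; handle_source_change() and num_bufs
are made-up names, and error handling, buffer mapping and buffer
re-queuing are omitted for brevity.

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

static void handle_source_change(int fd, unsigned int num_bufs)
{
        int type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
        struct v4l2_format fmt;
        struct v4l2_requestbuffers reqbufs;

        /* 1. Stop streaming on the CAPTURE queue. */
        ioctl(fd, VIDIOC_STREAMOFF, &type);

        /*
         * 2. Re-negotiate the pixel format: read back the format the
         *    decoder selected for the new bit-depth (e.g. a 10-bit
         *    pixel format after an 8-bit to 10-bit switch).
         */
        memset(&fmt, 0, sizeof(fmt));
        fmt.type = type;
        ioctl(fd, VIDIOC_G_FMT, &fmt);

        /* 3. Free the old CAPTURE buffers... */
        memset(&reqbufs, 0, sizeof(reqbufs));
        reqbufs.type = type;
        reqbufs.memory = V4L2_MEMORY_MMAP;
        reqbufs.count = 0;
        ioctl(fd, VIDIOC_REQBUFS, &reqbufs);

        /* ...and allocate new ones sized for the new format. */
        reqbufs.count = num_bufs;
        ioctl(fd, VIDIOC_REQBUFS, &reqbufs);

        /* 4. Start streaming on the CAPTURE queue again. */
        ioctl(fd, VIDIOC_STREAMON, &type);
}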