On Tue, May 26, 2020 at 12:26 PM Hans Verkuil <hverkuil@xxxxxxxxx> wrote:
>
> On 30/04/2020 13:38, Stanimir Varbanov wrote:
> > Here we add two more reasons for dynamic-resolution-change state
> > (I think the name is misleading espesially "resolution" word, maybe
>
> espesially -> especially
>
> > dynamic-bitstream-change is better to describe).
> >
> > The first one which could change in the middle of the stream is the
> > bit-depth. For worst example the stream is 8bit at the begging but
> > later in the bitstream it changes to 10bit. That change should be
> > propagated to the client so that it can take appropriate action. In
> > this case most probably it has to stop the streaming on the capture
> > queue and re-negotiate the pixel format and start the streaming
> > again.
> >
> > The second new reason is colorspace change. I'm not sure what action
> > client should take but at least it should be notified for such change.
> > One possible action is to notify the display entity that the colorspace
> > and its parameters (y'cbcr encoding and so on) has been changed.
> >
> > Signed-off-by: Stanimir Varbanov <stanimir.varbanov@xxxxxxxxxx>
> > ---
> >  Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
> >  1 file changed, 5 insertions(+), 1 deletion(-)
> >
> > diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > index 606b54947e10..bf10eda6125c 100644
> > --- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > +++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > @@ -906,7 +906,11 @@ reflected by corresponding queries):
> >
> >  * visible resolution (selection rectangles),
> >
> > -* the minimum number of buffers needed for decoding.
> > +* the minimum number of buffers needed for decoding,
> > +
> > +* bit-depth of the bitstream has been changed,
> > +
> > +* colorspace (and its properties) has been changed.
>
> For this I want to have a new source change flag:
>
> V4L2_EVENT_SRC_CH_COLORIMETRY
>
> Changing colorimetry without changing resolution/bit depth does not
> require buffers to be re-allocated, it just changes how the pixel
> data is interpreted w.r.t. color. And that is important to know.

FWIW, the visible resolution (i.e. compose rectangle) change that is
already defined doesn't require buffers to be re-allocated either.
Backwards compatibility requires V4L2_EVENT_SRC_CH_RESOLUTION to be
set, but perhaps we could have further flags introduced, which would
mean visible resolution and stream format (pixelformat, resolution)
exclusively?

Best regards,
Tomasz
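
---

For reference, a minimal sketch of the client-side handling being
discussed above: on a source change event, a resolution/bit-depth
change forces the CAPTURE queue to be stopped and re-negotiated, while
a colorimetry-only change would only require re-reading the colorspace
fields and passing them on to the display path. Note that
V4L2_EVENT_SRC_CH_COLORIMETRY is only the flag proposed in this thread;
it is not in the mainline uapi header, so it is defined here locally
with an assumed value. Error handling and buffer management are
omitted; this is not a complete decoder client.

#include <string.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

#ifndef V4L2_EVENT_SRC_CH_COLORIMETRY
#define V4L2_EVENT_SRC_CH_COLORIMETRY  (1 << 1)  /* hypothetical, proposed in this thread */
#endif

static int handle_source_change(int fd)
{
    struct v4l2_event ev;
    struct v4l2_format fmt;

    memset(&ev, 0, sizeof(ev));
    if (ioctl(fd, VIDIOC_DQEVENT, &ev) < 0)
        return -1;
    if (ev.type != V4L2_EVENT_SOURCE_CHANGE)
        return 0;

    if (ev.u.src_change.changes & V4L2_EVENT_SRC_CH_RESOLUTION) {
        /*
         * Coded resolution and/or bit depth changed: stop the CAPTURE
         * queue, re-query the format the driver now reports, and
         * restart streaming with buffers matching the new format.
         */
        int type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;

        ioctl(fd, VIDIOC_STREAMOFF, &type);
        /* ... release the old CAPTURE buffers (REQBUFS with count 0) ... */

        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
        ioctl(fd, VIDIOC_G_FMT, &fmt);
        /* %.4s prints the fourcc bytes of the pixelformat */
        printf("new format: %ux%u, pixelformat %.4s\n",
               fmt.fmt.pix_mp.width, fmt.fmt.pix_mp.height,
               (const char *)&fmt.fmt.pix_mp.pixelformat);

        /* ... allocate and queue new CAPTURE buffers, then ... */
        ioctl(fd, VIDIOC_STREAMON, &type);
    } else if (ev.u.src_change.changes & V4L2_EVENT_SRC_CH_COLORIMETRY) {
        /*
         * Only the colorimetry changed: no buffer reallocation needed,
         * just re-read the colorspace fields and forward them to
         * whatever displays the decoded frames.
         */
        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
        ioctl(fd, VIDIOC_G_FMT, &fmt);
        printf("new colorspace %u, ycbcr_enc %u, quantization %u\n",
               (unsigned)fmt.fmt.pix_mp.colorspace,
               (unsigned)fmt.fmt.pix_mp.ycbcr_enc,
               (unsigned)fmt.fmt.pix_mp.quantization);
    }

    return 0;
}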