On Thu, Jul 18, 2019 at 11:22 AM Hans Verkuil <hverkuil@xxxxxxxxx> wrote:
>
> On 7/18/19 10:39 AM, Maxime Jourdan wrote:
> > On Mon, Jul 15, 2019 at 2:37 PM Hans Verkuil <hverkuil@xxxxxxxxx> wrote:
> >>
> >> On 6/11/19 10:13 AM, Hans Verkuil wrote:
> >>> On 6/9/19 4:38 PM, Maxime Jourdan wrote:
> >>>> Hello,
> >>>>
> >>>> This RFC proposes a new format flag - V4L2_FMT_FLAG_DYN_RESOLUTION - used
> >>>> to tag coded formats for which the device supports dynamic resolution
> >>>> switching, via V4L2_EVENT_SOURCE_CHANGE.
> >>>> This includes the initial "source change" where the device is able to
> >>>> tell userspace about the coded resolution and the DPB size (which
> >>>> sometimes translates to V4L2_CID_MIN_BUFFERS_FOR_CAPTURE).
> >>>
> >>> Shouldn't the initial source change still be there? The amlogic decoder
> >>> is capable of determining the resolution of the stream, right? It just
> >>> can't handle mid-stream changes.
> >>
> >> I've been thinking about this a bit more: there are three different HW
> >> capabilities:
> >>
> >> 1) The hardware cannot parse the resolution at all and userspace has to
> >> tell it via S_FMT.
> >>
> >> 2) The hardware can parse the initial resolution, but is not able to
> >> handle mid-stream resolution changes.
> >>
> >> 3) The hardware can parse the initial resolution and all following
> >> mid-stream resolution changes.
> >>
> >> We can consider 2 the default situation.
> >>
> >> In case of 1 the SOURCE_CHANGE event is absent and userspace cannot
> >> subscribe to it. Question: do we want to flag this with the format as
> >> well? I.e. with a V4L2_FMT_FLAG_MANUAL_RESOLUTION? I think just not
> >> implementing the SOURCE_CHANGE event (and documenting this) is
> >> sufficient.
> >>
> >
> > I think that not implementing SOURCE_CHANGE is sufficient as well. The
> > issue (in my case) is that the amlogic decoder _does_ support the
> > event (case 3) for anything recent (H264, HEVC, VP9), but not for e.g.
> > MPEG 1/2 (case 1).
> >
> > A possible solution would be to create 2 separate devices, one
> > implementing the event, the other not. Do you think this is reasonable?
> > This would discard the need for all the proposed flags, unless there
> > are other decoder drivers that fall in case 2.
>
> I don't think it is a good idea to create two device nodes, that's really
> confusing. Instead I think we just need a V4L2_FMT_FLAG_MANUAL_RESOLUTION
> flag.
>

I guess I just feel bad about adding a flag (MANUAL_RESOLUTION) for what is
basically a problem with one compression standard for one driver, with the
root cause being bad firmware design. Then again, I don't see a way around
it, and cases 1 & 2 are indeed two distinct possibilities that each need
their own flag.

I'll prepare 2 new patch series if that is okay with you:
 - DYN_RESOLUTION format flag updated series (in this current RFC, there
   are issues with the explanation of the flag in the doc)
 - Adding MANUAL_RESOLUTION format flag

> BTW, what happens if the application sets the format to e.g. 640x480 but
> the MPEG file is a different resolution? Does the decoder fail to produce
> anything? Or does it internally parse the resolution from the bitstream
> and start decoding it? What if the bitstream resolution is larger than the
> resolution set with S_FMT? Does it check for the buffer size?
>
> I just want to make sure it won't write past the end of the buffer.
>

I tested this case a long while ago. The DMAs are programmed with the
allocated VB2 buffers, so you get cropped pictures (and no DMA overflow).

> Regards,
>
>         Hans
>
> >
> >> In case of 3 the format sets the V4L2_FMT_FLAG_DYN_RESOLUTION flag.
> >>
> >> What do you think?
> >>
> >> Regards,
> >>
> >>         Hans