Re: [ANN] Report of Media Summit: V4L2 Future Work

On Saturday, 02 November 2019 at 15:06 +0100, Hans Verkuil wrote:
> Action Items
> ------------
> 
> Nicolas Dufresne:
> 
> - provide more info about timecode use in userspace for Hans Verkuil
>   to verify if struct v4l2_timecode provides sufficient information.

As initial context, we have:

   struct v4l2_timecode {
   	__u32	type; /* This is the FPS */
   	__u32	flags;
   	__u8	frames;
   	__u8	seconds;
   	__u8	minutes;
   	__u8	hours;
   	__u8	userbits[4];
   };
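
For reference (quoting <linux/videodev2.h> from memory, so worth
double-checking), the type field is an enumeration of fixed rates
rather than a rational framerate, and the flags field carries a
drop-frame bit:

   #define V4L2_TC_TYPE_24FPS       1
   #define V4L2_TC_TYPE_25FPS       2
   #define V4L2_TC_TYPE_30FPS       3
   #define V4L2_TC_TYPE_50FPS       4
   #define V4L2_TC_TYPE_60FPS       5

   #define V4L2_TC_FLAG_DROPFRAME   0x0001
   #define V4L2_TC_FLAG_COLORFRAME  0x0002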

GStreamer, on its side, exposes:

   struct _GstVideoTimeCode {
     GstVideoTimeCodeConfig config;

     guint hours;
     guint minutes;
     guint seconds;
     guint frames;
     guint field_count;
   };

   struct _GstVideoTimeCodeConfig {
     guint fps_n;
     guint fps_d;
     GstVideoTimeCodeFlags flags;
     GDateTime *latest_daily_jam;
   };

See https://gstreamer.freedesktop.org/documentation/video/gstvideotimecode.html?gi-language=c#GstVideoTimeCode

So the main difference is that GStreamer allows an arbitrary framerate
(as the fps_n/fps_d rational), whereas V4L2 uses an enumerated type for
this purpose. I'm not sure how rates like 29.97 are really handled in
V4L2; is it exposed as 30 fps and then handled with the drop-frame
flag? That isn't very clear, but it is basically more or less the same
information as in GStreamer, and glue to convert between the two should
be possible.
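
To make that concrete, below is a rough sketch of what such glue could
look like. This is not an existing API; the function name is made up,
and it assumes (unverified) that 29.97 is exposed as
V4L2_TC_TYPE_30FPS with V4L2_TC_FLAG_DROPFRAME set:

   #include <linux/videodev2.h>
   #include <gst/video/video.h>

   /* Hypothetical glue: map struct v4l2_timecode to GstVideoTimeCode.
    * Assumes drop-frame implies an NTSC-style 1001 denominator. */
   static gboolean
   v4l2_tc_to_gst (const struct v4l2_timecode *v4l2_tc,
       GstVideoTimeCode *gst_tc)
   {
     guint fps_n, fps_d = 1;
     GstVideoTimeCodeFlags flags = GST_VIDEO_TIME_CODE_FLAGS_NONE;

     switch (v4l2_tc->type) {
       case V4L2_TC_TYPE_24FPS: fps_n = 24; break;
       case V4L2_TC_TYPE_25FPS: fps_n = 25; break;
       case V4L2_TC_TYPE_30FPS: fps_n = 30; break;
       case V4L2_TC_TYPE_50FPS: fps_n = 50; break;
       case V4L2_TC_TYPE_60FPS: fps_n = 60; break;
       default:
         return FALSE;
     }

     if (v4l2_tc->flags & V4L2_TC_FLAG_DROPFRAME) {
       /* e.g. 30/1 becomes 30000/1001 */
       fps_n *= 1000;
       fps_d = 1001;
       flags |= GST_VIDEO_TIME_CODE_FLAGS_DROP_FRAME;
     }

     gst_video_time_code_init (gst_tc, fps_n, fps_d,
         NULL /* latest_daily_jam */, flags,
         v4l2_tc->hours, v4l2_tc->minutes, v4l2_tc->seconds,
         v4l2_tc->frames, 0 /* field_count */);

     return gst_video_time_code_is_valid (gst_tc);
   }

The hh:mm:ss:ff part maps one to one; the userbits and the field count
would need a separate look.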

In GStreamer, timecodes have been implemented when muxing into the
ISOMP4 format. For H264 and HEVC, they can be encoded into the
bitstream and extracted by the parser. We also have the ability to
combine these timecodes with closed caption data. In terms of CAPTURE
and OUTPUT, we have support for Decklink PCI cards, which are SDI
capture/output cards with currently proprietary drivers and userspace
on Linux [1].

Real usage started around GStreamer 1.12 (so it's not that old) and
was needed for live television broadcast authoring software. The
timecodes are (to the best of my knowledge) mostly used to synchronize
the streams together, as the streaming timestamp (e.g. RTP) might not
correlate with the moment the recording happened.

They are also used when storing the live recording to disk, so that a
non-linear editor can later align the tracks based on these timecodes.
There are likely loads of other use cases I'm not aware of; it is
basically information about when this live recording happened, which
could be a few seconds in the past, or years in the past for archived
material.

regards,
Nicolas


[1] https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/blob/master/sys/decklink/linux/DeckLinkAPI.h#L794




