I am building a GStreamer camera source. It sits directly on top of the raw hardware and takes advantage of various system oddities. It exposes three src pads: an encoded video stream, an occasional high-resolution snapshot (produced on request), and a low-resolution preview stream. I want all of them tagged with real-time timestamps (that is, the actual date and time from the system), accurate to the fractional second at which the frame was generated, and I cannot figure out the approved way to do this.

The implementations I have found (v4l2, for example) simply read the system clock and attach a timestamp as the buffer is generated at the user level. That fails badly on a loaded embedded device: the variance in thread servicing shows up in the timestamps, resulting in jerky video. I already have accurate timestamps attached to the buffers coming up from the drivers, so how can I take advantage of those?

The simplistic answer of converting the driver timestamp directly into a timestamp on each buffer works fine for a single src, but with multiple srcs the pipeline never gets out of preroll. My element is built on top of base_src, which drives the video stream src; the create function also services a sample queue for each of the lower-rate sources, pushing buffers on those pads whenever any are waiting.

Any help would be appreciated...

-- Steve
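
P.S. For concreteness, here is a stripped-down sketch of what my single-src timestamping currently does (names are mine and simplified; it assumes the driver stamps frames with CLOCK_MONOTONIC nanoseconds, the same clock GstSystemClock uses by default):

#include <gst/gst.h>

/* Map an absolute driver timestamp (CLOCK_MONOTONIC, in ns) into a buffer
 * PTS by subtracting the element's base time, i.e. convert an absolute
 * clock reading into running time. */
static GstClockTime
driver_ts_to_pts (GstElement * element, guint64 driver_ns)
{
  GstClockTime base_time = gst_element_get_base_time (element);

  if (driver_ns < base_time)
    return 0;                   /* frame predates pipeline start */
  return driver_ns - base_time;
}

/* ... and in create(), for every buffer on every src pad, roughly: */
/*   GST_BUFFER_PTS (buf) = driver_ts_to_pts (GST_ELEMENT (self), hw_ts_ns); */

This is the approach that works fine for the single-src case and stalls preroll once the other pads start pushing.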