Re: [PATCH v3 32/38] media: ti-vpe: cal: use CSI-2 frame number

On 07/06/2021 19:51, Laurent Pinchart wrote:
Hi Tomi,

On Mon, Jun 07, 2021 at 05:55:05PM +0300, Tomi Valkeinen wrote:
On 07/06/2021 16:42, Laurent Pinchart wrote:
On Mon, Jun 07, 2021 at 03:39:45PM +0300, Tomi Valkeinen wrote:
On 04/06/2021 17:04, Laurent Pinchart wrote:
On Mon, May 24, 2021 at 02:09:03PM +0300, Tomi Valkeinen wrote:
The driver fills buf->vb.sequence with an increasing number which is
incremented by the driver. This feels a bit pointless, as the userspace
could as well track that kind of number itself. Instead, lets use the

s/lets/let's/

frame number provided in the CSI-2 data from the sensor.

Signed-off-by: Tomi Valkeinen <tomi.valkeinen@xxxxxxxxxxxxxxxx>
---
    drivers/media/platform/ti-vpe/cal.c | 7 +++++--
    drivers/media/platform/ti-vpe/cal.h | 1 -
    2 files changed, 5 insertions(+), 3 deletions(-)

diff --git a/drivers/media/platform/ti-vpe/cal.c b/drivers/media/platform/ti-vpe/cal.c
index 888706187fd1..62c45add4efe 100644
--- a/drivers/media/platform/ti-vpe/cal.c
+++ b/drivers/media/platform/ti-vpe/cal.c
@@ -493,7 +493,6 @@ void cal_ctx_unprepare(struct cal_ctx *ctx)
 
 void cal_ctx_start(struct cal_ctx *ctx)
 {
-	ctx->sequence = 0;
 	ctx->dma.state = CAL_DMA_RUNNING;
 
 	/* Configure the CSI-2, pixel processing and write DMA contexts. */
@@ -586,6 +585,10 @@ static inline void cal_irq_wdma_start(struct cal_ctx *ctx)
 static inline void cal_irq_wdma_end(struct cal_ctx *ctx)
 {
 	struct cal_buffer *buf = NULL;
+	u32 frame_num;
+
+	frame_num = cal_read(ctx->cal, CAL_CSI2_STATUS(ctx->phy->instance,
+						       ctx->csi2_ctx)) & 0xffff;
 
 	spin_lock(&ctx->dma.lock);
 
@@ -607,7 +610,7 @@ static inline void cal_irq_wdma_end(struct cal_ctx *ctx)
 	if (buf) {
 		buf->vb.vb2_buf.timestamp = ktime_get_ns();
 		buf->vb.field = ctx->v_fmt.fmt.pix.field;
-		buf->vb.sequence = ctx->sequence++;
+		buf->vb.sequence = frame_num;

We'll need something a bit more complicated. The CSI-2 frame number is
not mandatory, and when used, it is a 16-bit number starting at 1 and
counting to an unspecified value larger than one, resetting to 1 at the
end of the cycle. The V4L2 sequence number, on the other hand, is a
monotonic counter starting at 0 and wrapping only at 2^32-1. We should
thus keep a software sequence counter and

- increase it by 1 if the frame number is zero
- increase it by frame_num - last_frame_num (with wrap-around of
     frame_num handled) otherwise
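
To make the suggestion concrete, a minimal sketch of that bookkeeping in cal.c could look as follows (ctx->sequence is the counter this patch removes, ctx->last_frame_num would be a new field, and the wrap handling assumes the sensor counts all the way to 0xffff before restarting at 1, which the CSI-2 specification does not guarantee):

/* Sketch only, not part of the patch: derive the V4L2 sequence number
 * from the 16-bit CSI-2 frame number read out of CAL_CSI2_STATUS. */
static void cal_ctx_update_sequence(struct cal_ctx *ctx, u16 frame_num)
{
	if (frame_num == 0) {
		/* The sensor does not number frames: count them ourselves. */
		ctx->sequence++;
	} else if (frame_num > ctx->last_frame_num) {
		/* Advance by the number of frames that have passed. */
		ctx->sequence += frame_num - ctx->last_frame_num;
	} else {
		/* The counter wrapped from 0xffff back to 1. */
		ctx->sequence += frame_num + (0xffffU - ctx->last_frame_num);
	}

	ctx->last_frame_num = frame_num;
}

cal_irq_wdma_end() would then call this helper and assign ctx->sequence, rather than the raw frame number, to buf->vb.sequence.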

Ok... I wonder if we need a new field for this, though. The problem I
was solving when I changed this to use the CSI-2 frame-number was how to
associate a pixel frame and a metadata frame.

Their CSI-2 frame numbers match (as they come from the same original
CSI-2 frame), so userspace can use them to figure out which frames
belong together. While the method you suggest above should give us
identical sequence numbers for pixel and metadata, I think it moves a
bit away from my intended purpose, and possibly risks ending up with
different sequences for pixel and metadata.

Why do you think they could get out of sync (assuming the sensor
supports frame numbers, of course; if it always returns 0, that's not
usable for synchronization purposes)?

If there's a requirement that the sequence starts from 0, it doesn't
work, as the pixel and metadata video capture may be started at
different times. When pixel capture starts, the frame number could be 10
and pixel sequence would be 0, but when metadata capture starts, the
frame number could be 12, and pixel seq would thus be 2 and meta seq 0.

But even if we allow the seq to start from the current frame number,

Good point. I think we can allow starting at a non-zero value to handle
this.

this doesn't work if the frame number has wrapped between starting the
pixel capture and starting the meta capture.

The timestamp should be enough to handle this: the time difference
between two wraparounds should be large enough to sync the two streams
without any risk.
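
For illustration only (not from the thread): if both devices exposed the raw CSI-2 frame number in the sequence field, as this patch does, an application could pair the two streams roughly like this, using the timestamps to rule out buffers that are a whole wrap cycle apart. frame_period_ns and the dequeued v4l2_buffer structures are assumed to be provided by the application:

#include <stdbool.h>
#include <stdint.h>
#include <linux/videodev2.h>

/* Sketch: decide whether a dequeued pixel buffer and metadata buffer
 * originate from the same CSI-2 frame.  Assumes both queues use the same
 * timestamp clock and carry the CSI-2 frame number in the sequence field. */
static bool buffers_match(const struct v4l2_buffer *pix,
			  const struct v4l2_buffer *meta,
			  uint64_t frame_period_ns)
{
	uint64_t pix_ts = pix->timestamp.tv_sec * 1000000000ULL +
			  pix->timestamp.tv_usec * 1000ULL;
	uint64_t meta_ts = meta->timestamp.tv_sec * 1000000000ULL +
			   meta->timestamp.tv_usec * 1000ULL;
	uint64_t delta = pix_ts > meta_ts ? pix_ts - meta_ts : meta_ts - pix_ts;

	/* Equal frame numbers captured within one frame period of each
	 * other cannot be a whole wrap cycle apart. */
	return pix->sequence == meta->sequence && delta < frame_period_ns;
}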

Well, this still won't work, as CAL doesn't know at which value the sensor's frame counter wraps. CAL can detect that the counter has wrapped, but it can't tell whether some frames were missed across the wrap, so the two streams can still end up out of sync.

I'll try to figure out if I can handle the frame counter in a shared manner, so that multiple streams originating from the same CSI-2 frame always get the same sequence number for the same frame number.
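
For example (only a sketch, with hypothetical members that do not exist in struct cal_camerarx today): the bookkeeping could live in the shared cal_camerarx (ctx->phy) rather than in each context, so that every context fed by the same PHY maps a given frame number to the same sequence value:

/* Hypothetical: seq_lock, sequence and last_frame_num are assumed new
 * members of struct cal_camerarx. */
static u32 cal_camerarx_get_sequence(struct cal_camerarx *phy, u16 frame_num)
{
	u32 sequence;

	spin_lock(&phy->seq_lock);

	if (frame_num != phy->last_frame_num) {
		/* The first context to see this frame number advances the
		 * shared counter; later contexts reuse the same value. */
		phy->sequence++;
		phy->last_frame_num = frame_num;
	}

	sequence = phy->sequence;

	spin_unlock(&phy->seq_lock);

	return sequence;
}

Each capture context would then call this from its write-DMA completion interrupt instead of keeping a private counter, so pixel and metadata buffers from the same frame would carry equal sequence numbers even if the streams were started at different times.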

 Tomi


