Re: [PATCH v6 09/15] media: bcm2835-unicam: Add support for CCP2/CSI2 camera interface

Hi Laurent, Naush,

On 21/03/2024 20:57, Laurent Pinchart wrote:
Hi Naush,

On Wed, Mar 20, 2024 at 12:30:36PM +0000, Naushir Patuck wrote:
On Fri, 1 Mar 2024 at 21:32, Laurent Pinchart wrote:

From: Dave Stevenson <dave.stevenson@xxxxxxxxxxxxxxx>

Add a driver for the Unicam camera receiver block on BCM283x processors.
It is represented as two video device nodes, unicam-image and
unicam-embedded, which are connected to an internal subdev (named
unicam-subdev) in order to manage stream routing.

Signed-off-by: Dave Stevenson <dave.stevenson@xxxxxxxxxxxxxxx>
Co-developed-by: Naushir Patuck <naush@xxxxxxxxxxxxxxx>
Signed-off-by: Naushir Patuck <naush@xxxxxxxxxxxxxxx>
Co-developed-by: Jean-Michel Hautbois <jeanmichel.hautbois@xxxxxxxxxxxxxxxx>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@xxxxxxxxxxxxxxxx>
Co-developed-by: Laurent Pinchart <laurent.pinchart@xxxxxxxxxxxxxxxx>
Signed-off-by: Laurent Pinchart <laurent.pinchart@xxxxxxxxxxxxxxxx>
---

[snip]

---
  MAINTAINERS                                   |    1 +
  drivers/media/platform/Kconfig                |    1 +
  drivers/media/platform/Makefile               |    1 +
  drivers/media/platform/broadcom/Kconfig       |   23 +
  drivers/media/platform/broadcom/Makefile      |    3 +
  .../platform/broadcom/bcm2835-unicam-regs.h   |  255 ++
  .../media/platform/broadcom/bcm2835-unicam.c  | 2607 +++++++++++++++++
  7 files changed, 2891 insertions(+)
  create mode 100644 drivers/media/platform/broadcom/Kconfig
  create mode 100644 drivers/media/platform/broadcom/Makefile
  create mode 100644 drivers/media/platform/broadcom/bcm2835-unicam-regs.h
  create mode 100644 drivers/media/platform/broadcom/bcm2835-unicam.c

[snip]

diff --git a/drivers/media/platform/broadcom/bcm2835-unicam.c b/drivers/media/platform/broadcom/bcm2835-unicam.c
new file mode 100644
index 000000000000..716c89b8a217
--- /dev/null
+++ b/drivers/media/platform/broadcom/bcm2835-unicam.c
@@ -0,0 +1,2607 @@

[snip]

+static int unicam_start_streaming(struct vb2_queue *vq, unsigned int count)
+{
+       struct unicam_node *node = vb2_get_drv_priv(vq);
+       struct unicam_device *unicam = node->dev;
+       struct v4l2_subdev_state *state;
+       struct unicam_buffer *buf;
+       unsigned long flags;
+       int ret;
+       u32 pad, stream;
+       u32 remote_pad = is_image_node(node) ? UNICAM_SD_PAD_SOURCE_IMAGE
+                                            : UNICAM_SD_PAD_SOURCE_METADATA;
+
+       /* Look for the route for the given pad and stream. */
+       state = v4l2_subdev_lock_and_get_active_state(&unicam->subdev.sd);
+       ret = v4l2_subdev_routing_find_opposite_end(&state->routing,
+                                                   remote_pad, 0,
+                                                   &pad, &stream);
+       v4l2_subdev_unlock_state(state);
+
+       if (ret)
+               goto err_return_buffers;
+
+       dev_dbg(unicam->dev, "Starting stream on %s: %u/%u -> %u/%u (%s)\n",
+               unicam->subdev.sd.name, pad, stream, remote_pad, 0,
+               is_metadata_node(node) ? "metadata" : "image");
+
+       /* The metadata node can't be started alone. */
+       if (is_metadata_node(node)) {
+               if (!unicam->node[UNICAM_IMAGE_NODE].streaming) {
+                       dev_err(unicam->dev,
+                               "Can't start metadata without image\n");
+                       ret = -EINVAL;
+                       goto err_return_buffers;
+               }

There's a slight change of behaviour in this function compared to the
downstream/BSP driver (which doesn't have streams support).

In the BSP driver, if the embedded data node has been enabled, we wait
for both image and embedded data nodes to have start_streaming()
called before starting the sensor (see
https://github.com/raspberrypi/linux/blob/c04af98514c26014a4f29ec87b3ece95626059bd/drivers/media/platform/bcm2835/bcm2835-unicam.c#L2559).
This is also the same for the Pi 5 CFE driver.

With the logic in this function, we only wait for start_streaming() on
the image node and then start the sensor streaming immediately. When
start_streaming() for the embedded data node is subsequently called,
we end up with the first N buffers missing and/or invalid, as the HW
channel is enabled while the sensor is already streaming. I noticed
this when using libcamera, where we start the image node and then the
embedded data node. If I flip things around (start embedded first,
then image), everything works as expected.

Could we add back the test to ensure all nodes are streaming before
starting the sensor?

Yes, I don't think the current implementation is good. I'm not sure why
the logic got changed, but I'll address it in the next version.


I think this change was made in v4. The aim was to allow streaming to start on the image node even if the metadata node is not started. The opposite was not meant to be possible: if the metadata node is started, the image node must also be started before streaming begins. The downstream logic is that both nodes must always be started, but at the time of v4 not everything was ready for that, if I remember correctly.
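
For illustration, the kind of gate the downstream driver applies could look roughly like the sketch below. The unicam_all_nodes_streaming() helper and the node->open flag are placeholder names for this sketch rather than the actual BSP or patch code; the point is only that v4l2_subdev_enable_streams() is deferred until the last enabled node has called start_streaming():

/* Sketch only: placeholder names, not the actual driver code. */
static bool unicam_all_nodes_streaming(struct unicam_device *unicam)
{
	unsigned int i;

	/* Only nodes that userspace has opened take part. */
	for (i = 0; i < ARRAY_SIZE(unicam->node); i++) {
		struct unicam_node *node = &unicam->node[i];

		if (node->open && !node->streaming)
			return false;
	}

	return true;
}

and then, in unicam_start_streaming(), once the node's first buffer has been programmed:

	node->streaming = true;

	/* Defer kicking the sensor until every enabled node is ready. */
	if (!unicam_all_nodes_streaming(unicam))
		return 0;

	ret = v4l2_subdev_enable_streams(&unicam->subdev.sd,
					 UNICAM_SD_PAD_SOURCE_IMAGE, BIT(0));

With a gate like that, the order in which userspace starts the image and embedded data nodes no longer matters, which is what libcamera expects.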

JM

+
+               spin_lock_irqsave(&node->dma_queue_lock, flags);
+               buf = list_first_entry(&node->dma_queue,
+                                      struct unicam_buffer, list);
+               dev_dbg(unicam->dev, "buffer %p\n", buf);
+               node->cur_frm = buf;
+               node->next_frm = buf;
+               list_del(&buf->list);
+               spin_unlock_irqrestore(&node->dma_queue_lock, flags);
+
+               unicam_start_metadata(unicam, buf);
+               node->streaming = true;
+               return 0;
+       }
+
+       ret = pm_runtime_resume_and_get(unicam->dev);
+       if (ret < 0) {
+               dev_err(unicam->dev, "PM runtime resume failed: %d\n", ret);
+               goto err_return_buffers;
+       }
+
+       ret = video_device_pipeline_start(&node->video_dev, &unicam->pipe);
+       if (ret < 0) {
+               dev_dbg(unicam->dev, "Failed to start media pipeline: %d\n", ret);
+               goto err_pm_put;
+       }
+
+       spin_lock_irqsave(&node->dma_queue_lock, flags);
+       buf = list_first_entry(&node->dma_queue,
+                              struct unicam_buffer, list);
+       dev_dbg(unicam->dev, "buffer %p\n", buf);
+       node->cur_frm = buf;
+       node->next_frm = buf;
+       list_del(&buf->list);
+       spin_unlock_irqrestore(&node->dma_queue_lock, flags);
+
+       unicam_start_rx(unicam, buf);
+
+       ret = v4l2_subdev_enable_streams(&unicam->subdev.sd, remote_pad, BIT(0));
+       if (ret < 0) {
+               dev_err(unicam->dev, "stream on failed in subdev\n");
+               goto error_pipeline;
+       }
+
+       node->streaming = true;
+
+       return 0;
+
+error_pipeline:
+       video_device_pipeline_stop(&node->video_dev);
+err_pm_put:
+       pm_runtime_put_sync(unicam->dev);
+err_return_buffers:
+       unicam_return_buffers(node, VB2_BUF_STATE_QUEUED);
+       return ret;
+}

[snip]
