On 8/5/22 13:35, Hans Verkuil wrote:
> Hi all,
> 
> Here is some more information about the Media Summit:
> 
> Date: Monday September 12
> Time: 9:00-17:00
> Location: Convention Centre Dublin
> Room: The Liffey B - Part 1 (subject to change)
> Sponsored by: Cisco Systems Norway and Collabora
> 
> We will have a projector or display to show presentations, power strips,
> a whiteboard and beverages. For lunch we are on our own.
> 
> It's co-located with the OSS Europe conference:
> 
> https://events.linuxfoundation.org/open-source-summit-europe/
> 
> Attendees:
> 
> Sakari Ailus <sakari.ailus@xxxxxxxxxxxxxxx>
> Kieran Bingham <kieran.bingham@xxxxxxxxxxxxxxxx>
> Nicolas Dufresne <nicolas@xxxxxxxxxxxx>
> Benjamin Gaignard <benjamin.gaignard@xxxxxxxxxxxxx>
> Hidenori Kobayashi <hidenorik@xxxxxxxxxxxx>
> Paul Kocialkowski <paul.kocialkowski@xxxxxxxxxxx>
> Jacopo Mondi <jacopo@xxxxxxxxxx>
> Laurent Pinchart <laurent.pinchart@xxxxxxxxxxxxxxxx>
> Ricardo Ribalda <ribalda@xxxxxxxxxxxx>
> Maxime Ripard <maxime@xxxxxxxxxx>
> Daniel Scally <djrscally@xxxxxxxxx>
> Jernej Škrabec <jernej.skrabec@xxxxxxxxx>
> Dave Stevenson <dave.stevenson@xxxxxxxxxxxxxxx> (from 11 am onwards)
> Hans Verkuil <hverkuil@xxxxxxxxx>
> Philipp Zabel <p.zabel@xxxxxxxxxxxxxx>
> 
> Note: there are 5 seats left, so if you are interested in this, mail me.
> 
> The health and safety regulations will be those of the OSSE LF:
> 
> https://events.linuxfoundation.org/open-source-summit-europe/attend/health-and-safety/
> 
> We strongly recommend that you do a self-test before going to the
> Convention Centre for this meeting.
> 
> Code of conduct:
> 
> https://events.linuxfoundation.org/open-source-summit-europe/attend/code-of-conduct/
> 
> Based on the submitted topics I have made a first draft of the agenda.
> I have tried to keep the sensor-related topics to after 11:00, since
> Dave comes in later in the day.
> 
> I am also making the (reasonable) assumption that most (if not all)
> attendees will be attending the ELCE/OSSE conference Tue-Fri as well.
> While it is nice if we can come to a conclusion in the time allotted
> for each topic, it's also OK if we can set up a small group that can
> discuss it further in the following days.
> 
> If you raised a discussion topic, but will be in Dublin for only the
> Monday, then let me know.
> 
> I added a guesstimate of the time needed for each topic. If you think
> that guesstimate is wildly off, then let me know. But remember: it's
> fine if we decide to discuss it further in the following days in a
> smaller group.
> 
> If you present a topic, then please prepare a presentation. And if you
> have material you can share beforehand, then that would be great.
> 
> Draft Agenda V1:
> 
> 9:00  Getting settled
> 9:20  Introduction
> 9:30  Hans: Presentation on CTA-861 & edid-decode
> 9:45  Nicolas: Stateless encoder progress
> 10:15 Ricardo: Introduce ChromeOS camera project
> 
> 11:00 Break
> 
> 11:15 Kieran: Fault tolerance
> 
>       I raised this in the past when we first started hitting the issue
>       on Renesas platforms with multiple cameras in a single media
>       graph, but now I think it has become more critical with desktop /
>       laptop devices that are hitting the issue (i.e. the IPU3).
> 
>       Summary of the issue:
> 
>       - Multiple cameras that can function successfully and
>         independently are prevented from functioning or fully probing
>         by V4L2 if one component of another camera fails to load or
>         probe.
> 
>         If Camera A has a VCM and Camera B does not, Camera B will not
>         be available at all if Camera A's VCM is not fully probed, even
>         though Camera B can be fully functional and complete.
> 
>         Even if Camera A does not have the VCM probed, it may still
>         function successfully (with a fixed focal position) - but our
>         current implementation means that it will not even be available
>         to capture images.
> 
>       We talked about this quite a long time ago, and I believe the
>       general consensus was that we can have events on the media graph.
>       But unfortunately at the time there was no development scheduled
>       on that, and it wasn't something I was able to continue.
> 
>       I'd like to bring it up to refresh the topic and see if we can
>       make some progress, as it's now affecting more general devices.
> 
> 11:45 Jacopo: Representing additional sensor processing stages
> 
>       How to represent additional processing stages that happen on the
>       sensor side, mostly the extra subsampling/cropping that happens
>       between the analogue cropping on the full pixel array and the
>       final image sent on the wire:
> 
>       https://lore.kernel.org/linux-media/CAPY8ntA06L1Xsph79sv9t7MiDSNeSO2vADevuXZdXQdhWpSmow@xxxxxxxxxxxxxx/
> 
>       Dave made a good introduction to the issue in his email, which
>       went largely unanswered.
> 
>       The issue is particularly relevant for RAW sensors, where
>       applying subsampling has an impact on the sensor's sensitivity
>       and requires adjusting the gains and exposure accordingly.
> 
>       The V4L2 selection API falls short on this, and the only other
>       solution I am aware of is registering additional subdevices, as
>       the CCS driver does.
> 
> 12:30 Lunch
> 
> 13:30 Dave: On-sensor temperature reporting
> 
>       Thread started by Benjamin at
>       https://lore.kernel.org/linux-media/20220415111845.27130-3-benjamin.mugnier@xxxxxxxxxxx/
>       but with no resolution on whether to use the hwmon API or a V4L2
>       control. If hwmon, then we need the Media Controller framework to
>       tie the sensor and thermal device together.
> 
>       It has recently been queried for the IMX477 with the Pi
>       (https://github.com/raspberrypi/libcamera/issues/19), but it will
>       apply to many sensors.
> 
> 13:45 Dave: Synchronising sensors for stereoscopic operation
> 
>       How should that be configured?
>       Allowing configuration from userspace would let sensors be
>       operated independently, which can be useful for test purposes -
>       or should it be enforced from DT/ACPI? Do we set a default
>       configuration for each sensor from DT/ACPI and then allow
>       userspace to override it should it wish?
> 
> 14:00 Dave: Lens drivers
> 
>       Each driver will have a "useful" range which is effectively
>       dictated by the overall module. Should that be defined via DT, as
>       it is a feature of the platform, or should we leave the driver
>       totally generic and expect userspace to do something sensible?
> 
>       In the case of simple systems without libcamera, do we set a
>       default for V4L2_CID_FOCUS_ABSOLUTE to a sensible hyperfocal
>       distance, and can that again be defined in DT as it is defining
>       the hardware?
> 
> 14:15 Dave: Controlling sensor GPIO outputs
> 
>       Controlling sensor GPIO outputs for things such as flash
>       triggers, vsync, frame start/end, exposure start/end, etc.
> 
>       There is a huge range of features available, so do we have any
>       hope of standardising some of them, or do we end up hiding these
>       away in the drivers with custom DT bindings to configure them?
>       If we accept that there will be variation, can we vaguely
>       standardise what those bindings look like? Or should these be
>       V4L2 controls, as things like pulse widths may need to be
>       adjusted by userspace?
> 
> 14:30 Jacopo: Reconcile handling of regulators, GPIOs and clocks on
>       OF and ACPI platforms
> 
>       We recently got a few series trying to reconcile the handling of
>       regulators, GPIOs and clocks on OF and ACPI platforms, all of
>       them doing the usual "similar but slightly different" thing:
> 
>       https://lore.kernel.org/linux-media/20220425061022.1569480-1-paul.elder@xxxxxxxxxxxxxxxx/
>       https://lore.kernel.org/linux-media/20220329090133.338073-1-jacopo@xxxxxxxxxx/
>       https://lore.kernel.org/linux-media/20220509143226.531117-1-foss+kernel@xxxxxxxxx/
>       https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/commit/?id=0c2c7a1e0d69221b9d489bfd8cf53262d6f82446
> 
>       ACPI and OF handle clocks slightly differently, and it is not
>       clear to me whether ACPI-based platforms need explicit handling
>       of clocks/regulators or whether ACPI does "the right thing" by
>       itself (I'm afraid the answer is actually "it depends"). I'm
>       ACPI illiterate, so I cannot propose anything meaningful, but if
>       anyone is interested in discussing this further, this might be a
>       good time to do so.
> 
> 15:00 Break
> 
> 15:30 Laurent: V4L2 streams series
> 
>       I'd like to discuss the V4L2 streams series, in particular how
>       to upstream the parts that won't be upstream yet by
>       mid-September. Discussing the next steps would also be useful,
>       as there's a lot we could build on top of it.
> 
> 16:00 Laurent: How can we finalize the conversion of v4l-utils to
>       meson?
> 
> 16:15-17:00 Anything else?

One more topic: "Deprecate (and later remove) the last few videobuf
version 1 drivers". Possibly also include drivers that use neither
videobuf nor vb2; I'll have to check how many of those we have.

Regards,

	Hans

> 
> Regards,
> 
> 	Hans