Re: [ANN] Media Summit at ELCE Dublin, September 12: Draft Agenda V2

Hi all,

On Tuesday, 23 August 2022 at 12:53 +0200, Hans Verkuil wrote:
> Hi all,
> 
> Here is some more information about the Media Summit:
> 
> Date: Monday September 12
> Time: 8:45-18:00
> Location: Convention Centre Dublin
> Room: The Liffey B - Part 1 (subject to change)
> Sponsored by: Cisco Systems Norway, Collabora and the Kodi Foundation

[...]

> Draft Agenda V2:
> 
>  8:40 Getting settled
>  9:00 Introduction
>  9:10 Hans: Presentation on CTA-861 & edid-decode
>  9:25 Nicolas: Stateless encoder progress

It took me a while to wake up on that one; I didn't have many travel options
and I'm landing on the 12th, but only at 11:30am. Do you mind moving this one
to the afternoon?

> 10:00 Ricardo: Introduce ChromeOS camera project
> 
> 11:00 Break
> 
> 11:15 Kieran: Fault tolerance
> 
>   I raised this in the past when we first started hitting the issue on
>   Renesas platforms with multiple cameras in a single media graph, but now
>   I think it's become more critical with desktop / laptop devices that are
>   hitting the issue (i.e. the IPU3).
> 
>   Summary of issue:
> 
>   - Multiple cameras that can function independently are prevented from
>     functioning, or from fully probing, by V4L2 if one component of
>     another camera fails to load or probe.
> 
>     If Camera A has a VCM, and Camera B does not, Camera B will not be
>     available at all if Camera A's VCM is not fully probed, even though
>     Camera B can be fully functional and complete.
> 
>     Even if Camera A's VCM does not probe, the camera may still
>     function successfully (with a fixed focal position) - but our
>     current implementation means that it will not even be available
>     to capture images.
> 
>   We talked about this quite a long time ago, and I believe the general
>   consensus was that we could have events on the media graph. But
>   unfortunately, no development was scheduled on that at the time, and
>   it wasn't something I was able to pursue.
> 
>   I'd like to bring it up to refresh the topic, and see if we can make
>   some progress as it's now affecting more general devices.
> 
> 11:45 Jacopo: Representing additional sensor processing stages.
> 
>   How to represent additional processing stages that happen
>   on the sensor side, mostly additional subsampling/cropping that happen
>   between the analogue cropping on the full pixel array and the final
>   image sent on the wire.
> 
>   https://lore.kernel.org/linux-media/CAPY8ntA06L1Xsph79sv9t7MiDSNeSO2vADevuXZdXQdhWpSmow@xxxxxxxxxxxxxx/
> 
>   Dave made a good introduction to the issue in his email, which went
>   largely unanswered.
> 
>   The issue is particularly relevant for RAW sensors, where applying
>   subsampling has an impact on the sensor's sensitivity and requires
>   adjusting the gains and exposure accordingly.
> 
>   The V4L2 selection API falls short on this and the only other
>   solution I am aware of is registering additional subdevices as the
>   CCS driver does.
> 
> 12:30 Lunch
> 
> 13:30 Dave: On-sensor temperature reporting.
> 
>   Thread started by Benjamin at
>   https://lore.kernel.org/linux-media/20220415111845.27130-3-benjamin.mugnier@xxxxxxxxxxx/
>   but no resolution on whether to use the hwmon API or a V4L2 control.
>   If hwmon, then we need the Media Controller framework to tie the
>   sensor and thermal device together.
> 
>   It's recently been queried for IMX477 with the Pi
>   (https://github.com/raspberrypi/libcamera/issues/19), but it will
>   apply to many sensors.
> 
> 13:50 Dave: Synchronising sensors for stereoscopic operation.
> 
>   How should that be configured? Allowing configuration from userspace
>   would let sensors be operated independently, which can be useful for
>   test purposes; or should it be enforced from DT/ACPI? Do we set a
>   default configuration for each sensor from DT/ACPI and then allow
>   userspace to override it should it wish?
> 
> 14:10 Dave: Lens drivers.
> 
>   Each driver will have a "useful" range which is effectively dictated
>   by the overall module. Should that be defined via DT, as it is a
>   feature of the platform, or should the driver be left totally generic,
>   with userspace expected to do something sensible?
> 
>   In the case of simple systems without libcamera, do we set the default
>   for V4L2_CID_FOCUS_ABSOLUTE to a sensible hyperfocal distance, and can
>   that again be defined in DT, as it describes the hardware?
> 
> 14:30 Dave: Controlling sensor GPIO outputs.
> 
>   Controlling sensor GPIO outputs for things such as flash triggers,
>   vsync, frame start/end, exposure start/end, etc.
> 
>   There is a huge range of features available so do we have any hope of
>   standardising some of it, or do we end up hiding these away in the
>   drivers with custom DT bindings to configure them? If we accept that
>   there will be variation, can we vaguely standardise what those
>   bindings look like? Or should these be V4L2 controls as things like
>   pulse widths may want to be adjusted by userspace?
> 
> 15:00 Break
> 
> 15:30 Jacopo: Reconcile handling of regulators, GPIOs and clocks on OF and ACPI platforms.
> 
>   We recently got a few series trying to reconcile handling of regulators,
>   GPIOs and clocks on OF and ACPI platforms, all of them doing the usual
>   "similar but slightly different" thing:
> 
>   https://lore.kernel.org/linux-media/20220425061022.1569480-1-paul.elder@xxxxxxxxxxxxxxxx/
>   https://lore.kernel.org/linux-media/20220329090133.338073-1-jacopo@xxxxxxxxxx/
>   https://lore.kernel.org/linux-media/20220509143226.531117-1-foss+kernel@xxxxxxxxx/
>   https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/commit/?id=0c2c7a1e0d69221b9d489bfd8cf53262d6f82446
> 
>   ACPI and OF handle clocks slightly differently, and it is not clear
>   to me whether ACPI-based platforms need explicit handling of
>   clocks/regulators or whether ACPI does "the right thing" by itself
>   (I'm afraid the answer is actually "it depends"). I'm ACPI-illiterate,
>   so I cannot propose anything meaningful, but if anyone is interested
>   in discussing this further, this might be a good time to do so.
> 
> 
> 16:00 Laurent: V4L2 streams series.
> 
>   I'd like to discuss the V4L2 streams series, in particular how to
>   upstream the parts that won't yet be upstream by mid-September.
>   Discussing the next steps would also be useful, as there's lots we could
>   build on top.
> 
> 16:30 Laurent: How can we finalize conversion of v4l-utils to meson?
> 
> 16:45-18:00 Anything else?
> 
> Regards,
> 
> 	Hans




