Re: Userspace API for controlling the focus of the Surface Go [2] main/back-camera

On 25/10/2021 12:06, Hans de Goede wrote:
> Hi All,
> 
> With my (and Dan's) kernel patch-series to enable the back camera on
> the Surface Go shaping up (and hopefully going upstream soon),
> the next step is to enable control of the focus lens for the back
> camera.
> 
> The focus is controlled through a separate i2c-client which is
> described by a 2nd I2cSerialBusV2 resource entry in the ACPI
> device for the ov8865 sensor. By default the kernel only instantiates
> an i2c-client for the first I2cSerialBusV2 resource entry for an
> ACPI device, getting an i2c-client for the 2nd one is easy and
> out of scope for this discussion.
> 
> The question I have is: assuming we have the 2nd i2c-client
> instantiated and an i2c-driver bound to it, how do we
> represent the focus control to userspace?
> 
> I see 2 possible directions we can go here:
> 
> 1. Somehow inject an extra v4l2ctrl for this into the v4l2ctrl
> list of the sensor. AFAIK we don't have infra for this atm, but
> we could add some generic mechanism to do this to the v4l2-ctrls
> core. IMHO from a userspace pov this is the cleanest, but at the
> cost of some extra work / possible ugliness on the kernel side.
> 
> 2. Register a separate v4l2_subdev for the focus-ctrl and in
> some way provide information to userspace about which sensor
> it belongs to.
> 
> I believe that both are valid approaches. So before diving into
> this I wonder what others are thinking about this.
> 
> Specific questions:
> 
> 1. Hans Verkuil, what do you think about adding
> support for one driver to inject ctrls into the ctrl
> list of another v4l2(sub)dev? Maybe something like this
> already exists? If not, do you think this is feasible
> and desirable to add?
> 
> 2. If we go with a separate v4l2_subdev, how do we communicate
> to userspace which sensor the focus control belongs to?

What is the bridge driver that controls the sensor? I would need to
know a bit more about the architecture.

Is it MC-centric? Or is everything controlled through a video device?

In the latter case you want the video device to inherit the controls of
the sensor and the focus sub-devices; that is supported by the control
framework.
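To sketch what that inheritance looks like on the kernel side: a bridge
driver can merge the sub-devices' control handlers into its own with
v4l2_ctrl_add_handler(), so userspace sees one control list on the video
device node. The fragment below is only an illustration of that pattern;
the function and struct names prefixed "bridge_" are hypothetical, not
taken from any real driver.

```c
/*
 * Hypothetical bridge-driver fragment: merge the control handlers of
 * the sensor and focus sub-devices into the video device's own handler,
 * so userspace sees a single control list on /dev/videoN.
 */
#include <media/v4l2-ctrls.h>
#include <media/v4l2-subdev.h>

static int bridge_inherit_controls(struct v4l2_ctrl_handler *hdl,
				   struct v4l2_subdev *sensor_sd,
				   struct v4l2_subdev *focus_sd)
{
	int ret;

	/* Add references to all controls of the sensor subdev. */
	ret = v4l2_ctrl_add_handler(hdl, sensor_sd->ctrl_handler,
				    NULL, true);
	if (ret)
		return ret;

	/* ...and to those of the focus (VCM) subdev. */
	return v4l2_ctrl_add_handler(hdl, focus_sd->ctrl_handler,
				    NULL, true);
}
```

The final `true` argument (`from_other_dev`) tells the control framework
that these controls live on another device, which matters for event
handling.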

In the MC-centric case you probably want to have libcamera support that
can tie the focus subdev and the sensor subdev together.

v4l2_async_nf_parse_fwnode_sensor() supports linking LEDs or lens-focus
devices together with the sensor, so that's how a focus device can be
associated with a sensor at the ACPI/DT level. So support for 2) is
already available. A separate subdev is in my view certainly the correct
approach.
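For a DT-based platform the association is expressed with the standard
"lens-focus" phandle property from the video-interfaces binding; on the
Surface Go the equivalent association would come from the sensor's ACPI
_DSD properties instead. The fragment below is illustrative only; the
node labels, unit addresses, and the choice of a dw9714 VCM are
assumptions, not the actual Surface Go hardware description.

```dts
&i2c2 {
	ov8865: camera@36 {
		compatible = "ovti,ov8865";
		reg = <0x36>;
		/* Tie the focus (VCM) device to this sensor. */
		lens-focus = <&vcm>;
	};

	vcm: camera-lens@c {
		compatible = "dongwoon,dw9714";
		reg = <0x0c>;
	};
};
```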

Regards,

	Hans


