Headset support

On Friday 06 April 2018 10:20:42 David Woodhouse wrote:
> 
> 
> On Wed, 2018-04-04 at 18:53 +0200, Pali Rohár wrote:
> > On Wednesday 04 April 2018 16:47:57 David Woodhouse wrote:
> > > I've now counted three types of headset control that we should support,
> > > ideally through a consistent interface.
> > > 
> > > 
> > >  • The first is Bluetooth HFP/HSP for which support is already present
> > >    and just needs to be connected up.
> > > 
> > >  • Second is the USB HID devices, including most "Skype for Business"
> > >    certified headsets. I have a Pidgin plugin which drives these
> > >    directly, but it would be better for PulseAudio to open the HID
> > >    device for itself and for the controls to be associated with the
> > >    specific hardware.
> > > 
> > >  • Third is the Android/etc. 3.5mm jack where button presses are
> > >    implemented as short-circuit or specific resistances from the mic
> > >    pin to ground:
> > >    https://source.android.com/devices/accessories/headset/plug-headset-spec
> > >    The Linux kernel has support for these (at least for a few codec
> > >    chips), and they appear as events on an input device along with the
> > >    jack insertion/removal events. Which I note we also don't support in
> > >    PA yet? Although there were patches in 2011 at
> > >    https://www.mail-archive.com/pulseaudio-discuss@mail.0pointer.de/msg09830.html
> > > 
> > > 
> > > Are there any more?
> 
> Thanks.
> 
> > Nokia ECI headsets - a 3.5mm jack with a bi-directional ECI bus on the
> > MIC bias. In most cases those headsets contain buttons, but the ECI bus
> > also supports some memory read/write operations. IIRC it is not possible
> > to use them with ordinary sound cards (activating the MIC bias is too
> > slow for ECI), and Nokia implemented the ECI protocol in their phones at
> > the ASIC level, as a translation of ECI to I2C. The ECI protocol itself
> > is undocumented, but I have seen some oscilloscope captures from which
> > the protocol could be decoded.
> 
> That one I think is a kernel problem. If implemented in Linux, it would
> presumably end up appearing to userspace just the same way as #3 above,
> with a separate input device emitting events for the button presses.
> Userspace doesn't necessarily need to care, right?

I would stick with the fact that it is currently unsupported. I can
imagine a fully software implementation in userspace, e.g. with a very
good sound card, where the sound software would sample the microphone
input and do the whole decoding itself.

But in any case, once support is there, it results in an input device,
and that has exactly the same problems as you describe for USB headsets
below.

> > Other Nokia headsets (non-ECI) - again a 3.5mm jack, but supporting
> > only one button press. Maybe similar to your "third" type.
> 
> Yeah, probably. And either way, I think it's still the kernel's problem
> to work out the hardware details and just emit events.

An example is the Nokia N900. There the kernel creates one input device
and sends a BTN_something event when the button is pressed. There is just
one button, and what pressing it means is up to userspace: it can be used
e.g. for accepting a voice call, or act as play/pause...
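
For anybody who wants to experiment, here is a minimal sketch of reading
such events via evdev. The device node /dev/input/event5 is hypothetical;
the real one can be found in /proc/bus/input/devices or via libudev:

#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

int main(void)
{
    struct input_event ev;
    int fd = open("/dev/input/event5", O_RDONLY);  /* hypothetical node */

    if (fd < 0) {
        perror("open");
        return 1;
    }

    while (read(fd, &ev, sizeof ev) == sizeof ev) {
        /* Button presses arrive as EV_KEY, value 1 = press, 0 = release. */
        if (ev.type == EV_KEY && ev.value == 1)
            printf("key/button code %u pressed\n", ev.code);
        /* Jack insertion/removal arrives as EV_SW, e.g. SW_HEADPHONE_INSERT. */
        if (ev.type == EV_SW)
            printf("switch %u -> %d\n", ev.code, ev.value);
    }

    close(fd);
    return 0;
}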

> > Bluetooth A2DP with AVRCP - but this should be already supported.
> 
> With the possible exception of volume control, I suspect AVRCP support
> is mostly outside the scope of PulseAudio. For volume controls we might
> want to ensure that they affect the *correct* device volume, in the
> A2DP+AVRCP case where there clearly is a "correct" device. But not in
> the case of a pure AVRCP remote control.

Yes, I agree.

> We have this problem with volume control on USB headsets already. They
> provide a HID device which emits KEY_VOLUMEUP / KEY_VOLUMEDOWN events.
> So right now as I'm listening to a podcast on my headset, if I press a
> volume button not only does it immediately change the headset volume,
> but the system *also* sees that "keypress" and adjusts the volume of
> the laptop's built-in speakers too. Which is wrong. That volume
> keypress should have been handled in the context of the device it came
> from. We *do* get that right for HFP, I believe.
> 
> Other than volume control, the main common headset control features
> are:
> 
>  • on/off hook
>  • mute (mic)
>  • ring
>  • additional feature button(s)
> 
> At least for the SfB-certified USB headsets, the hook and mute controls
> need management. When the mute/hook buttons are pressed, userspace has
> to update the mute/hook status on the device accordingly, otherwise
> things don't work right (subsequent presses are ignored, etc.).
> 
> I suspect the correct approach is to revive the patches which opened the
> input device for the jack, then do something similar to open the HID
> dev associated with a USB headset. Then to define hook/mute/ring
> properties and get signals working on the PA side, and hook that up to
> at least the GStreamer pulses{rc,ink} elements.
> 
> Then applications like Pidgin and Ekiga can do a saner version of the
> headset management, through GStreamer.
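
To make the "open the HID dev associated with a USB headset" part
concrete, here is a rough sketch using libudev. The function name
find_headset_hidraw is made up, and a real PulseAudio module would use
its own device hooks, but the idea is to find the hidraw node that
shares a USB device with the sound card:

#include <stdlib.h>
#include <string.h>
#include <libudev.h>

/* Return the /dev/hidrawN node sharing a USB device with the sound card
 * at card_syspath, or NULL. The caller frees the returned string. */
static char *find_headset_hidraw(struct udev *udev, const char *card_syspath)
{
    char *result = NULL;
    struct udev_device *card, *usb;
    struct udev_enumerate *en;
    struct udev_list_entry *entry;

    card = udev_device_new_from_syspath(udev, card_syspath);
    if (!card)
        return NULL;

    /* Walk up to the USB device the sound card hangs off. */
    usb = udev_device_get_parent_with_subsystem_devtype(card, "usb", "usb_device");
    if (!usb) {
        udev_device_unref(card);
        return NULL;
    }

    en = udev_enumerate_new(udev);
    udev_enumerate_add_match_subsystem(en, "hidraw");
    udev_enumerate_scan_devices(en);

    udev_list_entry_foreach(entry, udev_enumerate_get_list_entry(en)) {
        struct udev_device *dev = udev_device_new_from_syspath(udev,
                udev_list_entry_get_name(entry));
        struct udev_device *dev_usb = udev_device_get_parent_with_subsystem_devtype(
                dev, "usb", "usb_device");

        /* Same USB device => this hidraw node belongs to the headset. */
        if (dev_usb && !strcmp(udev_device_get_syspath(dev_usb),
                               udev_device_get_syspath(usb))) {
            const char *node = udev_device_get_devnode(dev);
            if (node)
                result = strdup(node);
            udev_device_unref(dev);
            break;
        }
        udev_device_unref(dev);
    }

    udev_enumerate_unref(en);
    udev_device_unref(card);
    return result;
}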

This opens up another problem: who should handle events from input
devices, and how and when? The integrated PS/2 keyboard on a laptop is
also an input device, and so is an external USB (or PS/2) keyboard on a
desktop. They can have MIC_MUTE or MUTE buttons too. Should those buttons
mute all sound cards? Or only the "integrated" ones (i.e. not Bluetooth
headsets)? And who should handle these mute buttons? PulseAudio? The
desktop hotkey daemon (e.g. KDE has its own)? Or some new daemon for
these actions? Currently e.g. KDE already handles mute buttons/keys on
its own.

I agree with you that the mic mute button on a USB/Bluetooth headset
should mute just the microphone on that headset.
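
On the PulseAudio side this is just a matter of targeting the right
source index instead of the global default; the same applies to the
volume keys you mention. A minimal sketch with libpulse (the matching of
the input device to the source index, and the callback name
on_mic_mute_pressed, are hypothetical and the missing pieces):

#include <pulse/pulseaudio.h>

/* Called when the headset's MIC_MUTE button is pressed; source_idx is
 * the PA index of that headset's microphone source. */
static void on_mic_mute_pressed(pa_context *ctx, uint32_t source_idx, int mute)
{
    /* Mutes exactly one source (the headset's mic), not all of them. */
    pa_operation *op = pa_context_set_source_mute_by_index(ctx, source_idx,
                                                           mute, NULL, NULL);
    if (op)
        pa_operation_unref(op);
}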

-- 
Pali Rohár
pali.rohar at gmail.com

