On 5/12/2016 1:15 AM, Mark Brown wrote:
> On Tue, May 10, 2016 at 11:18:03AM +0800, John Hsu wrote:
>> On 5/10/2016 12:35 AM, Mark Brown wrote:
>>> Well, the machine driver has to cope anyway. What's not clear to me is
>>> if the device *has* to use the internal clock when doing accessory
>>> detection or if it's just lower power.
>>
>> If the codec only does accessory insertion detection, the driver can set
>> it up without any clock, which saves power. But if the codec is to do
>> advanced jack detection, such as mic detection or impedance measurement,
>> the driver needs the internal clock to set up the automatic detection.
>> Thus, while no headset is connected yet, we run without the internal
>> clock to save power; when we want to do advanced detection, we switch to
>> the internal clock.

> I'm afraid this still leaves me none the wiser, sorry. If this
> switching to the internal clock is essential to the device operation
> then I'd expect it to be made transparent to callers so it should
> happen transparently rather than appearing via set_sysclk(). If it's
> not and it's just a performance optimisation then erroring out is
> definitely excessive but if the optimisation can be implemented
> transparently then it might be nice to do that.
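
For reference, the caller side is a perfectly ordinary machine driver
call, roughly like this (again only a sketch; XXX_CLK_INTERNAL is a
placeholder clock id). If set_sysclk() errors out whenever no headset is
plugged in, this call fails and the whole stream setup is aborted:

#include <sound/soc.h>

#define XXX_CLK_INTERNAL	0	/* placeholder clock id */

static int xxx_machine_hw_params(struct snd_pcm_substream *substream,
				 struct snd_pcm_hw_params *params)
{
	struct snd_soc_pcm_runtime *rtd = substream->private_data;
	struct snd_soc_dai *codec_dai = rtd->codec_dai;

	/* Ask the codec to run from its internal clock; frequency is left
	 * 0 here, a real machine driver would pass its clock rate. */
	return snd_soc_dai_set_sysclk(codec_dai, XXX_CLK_INTERNAL, 0,
				      SND_SOC_CLOCK_IN);
}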

In the driver patch, the internal clock switching is done transparently
by the driver when the codec runs advanced jack detection. Our purpose
is to prevent the machine driver from turning on the internal clock by
itself while no headset is connected; that doesn't break anything
functionally, it only wastes power. Maybe we can instead keep the clock
quietly disabled when the machine driver turns it on with no headset
connected. Is that the right way?
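
Concretely, the "disabled quietly" behaviour would look something like
this in the codec's set_sysclk() callback. Again just a sketch with
placeholder register, bit and clock-id names; the jack_present flag would
be maintained by the jack interrupt handler:

#include <linux/bitops.h>
#include <linux/regmap.h>
#include <sound/soc.h>

#define XXX_REG_CLK_CTRL	0x03	/* placeholder register address */
#define XXX_INTERNAL_CLK_EN	BIT(0)	/* placeholder enable bit */
#define XXX_CLK_INTERNAL	0	/* placeholder clock id */

struct xxx_priv {
	struct regmap *regmap;
	bool jack_present;	/* updated by the jack interrupt handler */
	int sysclk_src;		/* last source requested by the machine */
};

static int xxx_set_sysclk(struct snd_soc_codec *codec, int clk_id,
			  int source, unsigned int freq, int dir)
{
	struct xxx_priv *xxx = snd_soc_codec_get_drvdata(codec);
	bool enable;

	/* Never error out: just remember what the machine asked for, so
	 * the jack handler can apply it later. */
	xxx->sysclk_src = clk_id;

	/* Let the internal clock run only while a headset is present;
	 * otherwise keep it gated to save power, without reporting any
	 * error back to the machine driver. */
	enable = (clk_id == XXX_CLK_INTERNAL) && xxx->jack_present;
	regmap_update_bits(xxx->regmap, XXX_REG_CLK_CTRL,
			   XXX_INTERNAL_CLK_EN,
			   enable ? XXX_INTERNAL_CLK_EN : 0);

	return 0;
}

The jack handler would then really enable the internal clock once a
headset is detected and advanced detection starts, and gate it again on
ejection.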