RE: Question about soc_pcm_apply_msb()

Hi Lars-Peter,

Thank you for your feedback.

>> I wonder, do we need both (X) and (Y)?
>> I think we can merge (A) and (B) (= find the Codec/CPU max sig_bits)
>> and call soc_pcm_set_msb() once, but am I misunderstanding?
> We need both. Or alternatively you could write 
> soc_pcm_set_msb(substream, min(bits, cpu_bits)).
>
> What this does is it computes the maximum msb bits from both the CPU 
> side and the CODEC side and then sets the msb bits reported to userspace 
> to the minimum of the two.
>
> The largest number of MSBs we'll see on the CODEC side is the max() and 
> the largest number of MSBs we'll see on the CPU side is the max(). And 
> the number of MSBs that the application will be able to see is the 
> smaller of the two.

Oh, yes, thank you for explaining the details.
I think snd_pcm_hw_rule_msbits() was the point I was missing.
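Just to confirm my understanding, here is a minimal standalone sketch of that rule
(not the actual soc-pcm.c code): take the max sig_bits over the CODEC DAIs and over
the CPU DAIs separately, then report the min of the two maxima as the msbits
constraint. The arrays and the report_msbits() helper below are made-up stand-ins
for the DAI stream data and the snd_pcm_hw_constraint_msbits() call.

/*
 * Standalone illustration only; the names below are hypothetical
 * stand-ins, not the real ASoC structures.
 */
#include <stdio.h>

static unsigned int max_sig_bits(const unsigned int *sig_bits, int n)
{
	unsigned int bits = 0;
	int i;

	/*
	 * Largest number of valid MSBs any DAI on this side can produce.
	 * A DAI reporting 0 is treated as "unknown", so no constraint
	 * is applied for that side (assumption, following the existing
	 * sig_bits == 0 convention).
	 */
	for (i = 0; i < n; i++) {
		if (sig_bits[i] == 0)
			return 0;
		if (sig_bits[i] > bits)
			bits = sig_bits[i];
	}
	return bits;
}

/* Stand-in for soc_pcm_set_msb() / snd_pcm_hw_constraint_msbits() */
static void report_msbits(unsigned int msbits)
{
	printf("msbits reported to userspace: %u\n", msbits);
}

int main(void)
{
	/*
	 * Hypothetical example: CODEC DAIs with 24 valid MSBs,
	 * CPU side limited to 16.
	 */
	unsigned int codec_sig_bits[] = { 24, 24 };
	unsigned int cpu_sig_bits[]   = { 16 };

	unsigned int bits     = max_sig_bits(codec_sig_bits, 2);
	unsigned int cpu_bits = max_sig_bits(cpu_sig_bits, 1);

	/* The application can only see the smaller of the two maxima. */
	if (bits && cpu_bits)
		report_msbits(bits < cpu_bits ? bits : cpu_bits);

	return 0;
}

With these sample values the sketch reports 16 MSBs, i.e. the CPU side limits what
userspace is told, which is the behaviour described above.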

Best regards
---
Kuninori Morimoto



