On Wed, Feb 27, 2019 at 6:47 AM Russell King - ARM Linux admin
<linux@xxxxxxxxxxxxxxx> wrote:
>
> Given that, we do need some way to validate the bclk_ratio when it is
> set, and not during hw_params which would (a) lead to ALSA devices
> failing when userspace is negotiating the parameters and (b) gives no
> way in the kernel for set_bclk_ratio() to discover whether a particular
> ratio is supported by the codec.
>
> So, I think there's three possible ways around this:
> 1. adding a set_bclk_ratio() method to hdmi_codec_ops
> 2. calling hw_params() when our set_bclk_ratio() method is called
>    (what if the rest of the format isn't set or incompatible, which
>    may cause the hw_params() method to fail?)
> 3. adding a list of acceptable bclk_ratio values to hdmi_codec_pdata

Again, excuse my obvious ignorance, but would these solutions work well
in the more general case?

Imagine a CPU DAI that supports 16x2/16x2, 20x2/20x2 and 24x2/24x2
(sample bits/frame bits -- i.e. the wire format), sending audio to our
tda998x HDMI transmitter. Depending on the type of samples userspace
chooses to play, the ASoC core selects one of those formats, resulting
in a bclk_ratio of 16x2, 20x2 or 24x2.

It's up to the card driver to call set_bclk_ratio(), right? (A rough
sketch of what that could look like is at the end of this mail.) So
this card driver now needs intimate knowledge of which bclk_ratio goes
with which CPU DAI format. It also needs to know which bclk_ratios the
HDMI transmitter supports, plus a mechanism to filter out the 20x2
bclk_ratio format. That may not be trivial, and it also prevents the
card driver from being generic, i.e. we can no longer use simple-card?

But it gets worse. Imagine a hypothetical CPU DAI that supports
20x2/20x2 and 20x2/24x2. When that DAI is sending to a codec that does
not care about bclk_ratio, it should pick 20x2/20x2, because that is
most efficient, right? Except when driving a tda998x, it should select
20x2/24x2. How would a card driver even begin to deal with this, given
that there appears to be no mechanism to describe these differences?
params_physical_width() describes the memory format, not the wire
format, correct?

All of this kind of suggests to me that the bclk_ratio could be part of
the format description, or something like that:

static struct snd_soc_dai_driver acme_cpu_dai = {
        .playback = {
                .formats = SNDRV_PCM_FMTBIT_S20_3LE_20 |
                           SNDRV_PCM_FMTBIT_S20_3LE_24 |
                           SNDRV_PCM_FMTBIT_S16_LE |  // bclk_ratio 16 implied
                           SNDRV_PCM_FMTBIT_S24_LE |  // bclk_ratio 24 implied
                           SNDRV_PCM_FMTBIT_S24_LE_32,
        },
};

static struct snd_soc_dai_driver hdmi_dai = {
        .playback = {
                .formats = SNDRV_PCM_FMTBIT_S20_3LE_24 |
                           SNDRV_PCM_FMTBIT_S16_LE |  // bclk_ratio 16 implied
                           SNDRV_PCM_FMTBIT_S24_LE |  // bclk_ratio 24 implied
                           SNDRV_PCM_FMTBIT_S24_LE_32,
        },
};

In such a scheme the capabilities get negotiated correctly, and the
tda998x's hw_params() could simply call params_bclk_ratio(params) to
get the ratio, right? And the fsl_ssi could resort to a constraint rule
that filters out all non-32x2 bclk_ratio formats, but only in master
mode (also sketched at the end of this mail).

As I said, I'm way out of my depth here. I have no idea how realistic
or hypothetical this is, or whether it goes against the grain of the
existing ASoC architecture. I really hope there's a much better
solution than this that will solve the general case.
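For concreteness, here is roughly what I think the card-driver side has
to do today: a minimal, untested sketch. The "acme" names and the
width-to-ratio table are made up (they simply hard-code knowledge of the
hypothetical CPU DAI above); snd_soc_dai_set_bclk_ratio() and
params_width() are the existing calls, and the rtd->codec_dai accessor
is spelled differently on newer kernels.

#include <sound/pcm_params.h>
#include <sound/soc.h>

static int acme_card_hw_params(struct snd_pcm_substream *substream,
                               struct snd_pcm_hw_params *params)
{
        struct snd_soc_pcm_runtime *rtd = substream->private_data;
        struct snd_soc_dai *codec_dai = rtd->codec_dai;
        unsigned int ratio;

        /* the "intimate knowledge": map the negotiated sample width
         * onto the wire format the (hypothetical) cpu dai will use */
        switch (params_width(params)) {
        case 16:
                ratio = 16 * 2;
                break;
        case 20:
                ratio = 20 * 2; /* not acceptable to the tda998x */
                break;
        case 24:
                ratio = 24 * 2;
                break;
        default:
                return -EINVAL;
        }

        return snd_soc_dai_set_bclk_ratio(codec_dai, ratio);
}

static const struct snd_soc_ops acme_card_ops = {
        .hw_params = acme_card_hw_params,
};

Which is exactly the problem: that switch statement is the "intimate
knowledge" a supposedly generic card driver such as simple-card cannot
be expected to have.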
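And here is a rough sketch, pure speculation, of how a CPU DAI driver
like fsl_ssi could restrict itself to 32x2 bclk_ratio formats in master
mode from its startup() callback, if formats implied the ratio.
snd_pcm_hw_constraint_mask64(), snd_soc_dai_get_drvdata() and
SNDRV_PCM_FMTBIT_S32_LE exist today; SNDRV_PCM_FMTBIT_S24_LE_32 and the
acme_ssi structure are the made-up parts.

struct acme_ssi {
        bool i2s_master;        /* hypothetical driver state */
        /* ... */
};

static int acme_ssi_startup(struct snd_pcm_substream *substream,
                            struct snd_soc_dai *dai)
{
        struct acme_ssi *ssi = snd_soc_dai_get_drvdata(dai);

        if (!ssi->i2s_master)
                return 0;

        /* master mode: only allow formats implying a 32-bit wire slot */
        return snd_pcm_hw_constraint_mask64(substream->runtime,
                                            SNDRV_PCM_HW_PARAM_FORMAT,
                                            SNDRV_PCM_FMTBIT_S24_LE_32 |
                                            SNDRV_PCM_FMTBIT_S32_LE);
}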