On Fri, Apr 23, 2010 at 11:06 PM, Mark Brown <broonie@xxxxxxxxxxxxxxxxxxxxxxxxxxx> wrote:
> On Wed, Apr 21, 2010 at 05:36:47PM +0800, Barry Song wrote:
>> Signed-off-by: Barry Song <21cnbao@xxxxxxxxx>
>
>> +static int ad193x_set_dai_pll(struct snd_soc_dai *codec_dai,
>> +		int pll_id, int source, unsigned int freq_in, unsigned int freq_out)
>> +{
>> +	struct snd_soc_codec *codec = codec_dai->codec;
>> +	int reg;
>> +
>> +	reg = snd_soc_read(codec, AD193X_PLL_CLK_CTRL0);
>> +
>> +	switch (freq_in) {
>> +	case 12288000:
>> +		reg = (reg & AD193X_PLL_INPUT_MASK) | AD193X_PLL_INPUT_256;
>> +		break;
>
> This all looks like you should just implement set_sysclk() not set_pll()
> - from the user point of view the fact that a PLL ends up getting tuned
> is immaterial here, they can't control the output frequency at all.
> From their point of view they just specify the clock going into the
> CODEC and the driver works out everything else for them.

Sorry, I had not noticed this point before. I'll use set_sysclk() to set the MCLK instead of set_pll().
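
Just to check I understood the suggestion, a minimal sketch of the set_sysclk() variant I have in mind is below. It reuses the AD193X_PLL_CLK_CTRL0 / AD193X_PLL_INPUT_MASK / AD193X_PLL_INPUT_256 defines from the quoted patch and the standard ASoC set_sysclk prototype; the function name, the -EINVAL default path and the comment about other MCLK rates are only placeholders, not final code:

static int ad193x_set_dai_sysclk(struct snd_soc_dai *codec_dai,
		int clk_id, unsigned int freq, int dir)
{
	struct snd_soc_codec *codec = codec_dai->codec;
	int reg;

	reg = snd_soc_read(codec, AD193X_PLL_CLK_CTRL0);

	switch (freq) {
	case 12288000:
		/* 256 x fs MCLK, the same PLL input setting as the
		 * quoted set_pll() case above */
		reg = (reg & AD193X_PLL_INPUT_MASK) | AD193X_PLL_INPUT_256;
		break;
	/* other supported MCLK rates would select the remaining
	 * AD193X_PLL_INPUT_* values here in the same way */
	default:
		return -EINVAL;
	}

	return snd_soc_write(codec, AD193X_PLL_CLK_CTRL0, reg);
}

The machine driver would then pass the MCLK rate via snd_soc_dai_set_sysclk(), and the codec's snd_soc_dai_ops would point .set_sysclk at this function instead of providing .set_pll.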