On Thu, Nov 05, 2015 at 05:59:29PM +0000, Opensource [Adam Thomson] wrote:
> On November 05, 2015 14:59, Mark Brown wrote:

> > > +- dlg,ldo-lvl : Required internal LDO voltage (mV) level
> > > +                [<1050>, <1100>, <1200>, <1400>]

> > Why would this ever be anything other than the minimum voltage, and
> > might we not want to vary it at runtime?

> Normally you are correct and you'll want minimum voltage for the digital engine
> but there may be the possibility for certain platform scenarios that to meet
> timings a higher voltage is required. Would prefer to leave this in in case it
> is required in the future. As a side, the default setting is the lowest voltage,
> if this binding is not provided.

That doesn't make a huge degree of sense - your non-specific "certain
platform scenarios" are almost certainly going to be DVFS things where
you need to boost the voltage of the LDO when the device needs to
operate at particularly high frequencies, in which case the driver
really ought to just manage the voltage appropriately as required
(ideally only boosting it at the times when the extra load exists).
Unless you can be more specific about what the intended use is, this
just doesn't seem like something we should be specifying in DT.

> > These look like DSP coefficients which I would therefore expect to be
> > configurable at runtime via a binary control rather than specified in
> > the DT - why are they in the DT?

> The expectation was these would be set once and left for a platform, which is
> why I added them to the DT. However there's no reason they couldn't be moved
> to binary control. Having looked at other codecs, I assume SND_SOC_BYTES* would
> be the preferred method for setting this kind of data?

Yes.
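For reference, a minimal sketch of what the SND_SOC_BYTES* route could
look like in the codec driver; the control name and the
EXAMPLE_BIQ_COEFF_* constants are placeholders rather than the actual
register map:

#include <sound/soc.h>

/* Placeholder base register and register count for the coefficient
 * block; substitute the real values from the codec's register map.
 */
#define EXAMPLE_BIQ_COEFF_BASE		0x80
#define EXAMPLE_BIQ_COEFF_NUM_REGS	10

static const struct snd_kcontrol_new example_coeff_controls[] = {
	/* Expose the coefficient registers as a single binary control
	 * that userspace can write at runtime (e.g. via alsactl or a
	 * UCM sequence) instead of fixing the values in DT.
	 */
	SND_SOC_BYTES("BiQuad Coefficients", EXAMPLE_BIQ_COEFF_BASE,
		      EXAMPLE_BIQ_COEFF_NUM_REGS),
};

If the coefficients need to be loaded before the control is written for
the first time, the driver can still program sensible defaults at probe
time and let the bytes control override them later.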