Hi Jonathan,
On 24. 03. 24 14:55, Jonathan Cameron wrote:
> On Wed, 20 Mar 2024 11:04:04 +0100
> Andrej Picej <andrej.picej@xxxxxxxxx> wrote:
>> Hi all,
>> we had some problems with failing ADC calibration on the i.MX93 boards.
>> Changing the default calibration settings fixed this. The board where
>> these patches are useful is not yet upstream but will be soon (hopefully).
> Tell us more. My initial instinct is that this shouldn't be board specific.
> What's the trade off we are making here? Time vs precision of calibration or
> something else? If these are set to a level by default that doesn't work
> for our board, maybe we should just change them for all devices?
So we have two different boards with the same SoC. On one, the
calibration works with the default values; on the other, the
calibration fails, which makes the ADC unusable. What the ADC lines
measure does differ between the boards, but the implementation itself
is nothing out of the ordinary.
We tried different things, but the only thing that helped was using
different calibration properties. We also tried deferring the probe and
calibration until later in boot, and even until after boot, but that
did not help.
In the Reference Manual [1] (chapter 72.5.1) it is written:
4. Configure desired calibration settings (default values kept for highest accuracy maximum time).
So your assumption is correct: longer calibration time (more averaging
samples) -> higher precision. The default values aim for the highest
accuracy. Since we use an NRSMPL (Number of Averaging Samples) of 32
instead of the default 512, we reduce the accuracy so that the
calibration values pass the internally defined limits.
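
To make the trade-off concrete, the driver-side sequence looks roughly
like the sketch below: program fewer averaging samples, then start the
self-calibration and check the outcome. Note that the register offsets
and field positions here (IMX93_ADC_CALCFG, the NRSMPL mask, the
MCR/MSR bits) are assumptions for illustration only; the authoritative
layout is in the RM [1], chapter 72. Only the general flow (configure
averaging, set CALSTART, poll CALBUSY, check CALFAIL) is the point.

#include <linux/bitfield.h>
#include <linux/errno.h>
#include <linux/io.h>
#include <linux/iopoll.h>

/* All offsets/masks below are illustrative guesses, not the RM layout */
#define IMX93_ADC_MCR			0x00	/* main configuration reg */
#define IMX93_ADC_MSR			0x04	/* main status reg */
#define IMX93_ADC_CALCFG		0xa0	/* calibration settings reg */

#define IMX93_ADC_MCR_CALSTART		BIT(14)	/* kick off self-calibration */
#define IMX93_ADC_MSR_CALBUSY		BIT(29)	/* calibration in progress */
#define IMX93_ADC_MSR_CALFAIL		BIT(30)	/* results outside limits */
#define IMX93_ADC_CALCFG_NRSMPL		GENMASK(9, 8)	/* averaging samples */

/* Encodings assumed: 0 -> 32 samples (fast), 3 -> 512 samples (default) */
static int imx93_adc_calibrate(void __iomem *base, u32 nrsmpl)
{
	u32 val;

	/* Fewer averaging samples -> shorter, less strict calibration */
	val = readl(base + IMX93_ADC_CALCFG);
	val &= ~IMX93_ADC_CALCFG_NRSMPL;
	val |= FIELD_PREP(IMX93_ADC_CALCFG_NRSMPL, nrsmpl);
	writel(val, base + IMX93_ADC_CALCFG);

	/* Start calibration and wait for CALBUSY to clear */
	writel(readl(base + IMX93_ADC_MCR) | IMX93_ADC_MCR_CALSTART,
	       base + IMX93_ADC_MCR);
	if (readl_poll_timeout(base + IMX93_ADC_MSR, val,
			       !(val & IMX93_ADC_MSR_CALBUSY), 100, 2000000))
		return -ETIMEDOUT;

	/* CALFAIL set means the results missed the internal limits */
	return (val & IMX93_ADC_MSR_CALFAIL) ? -EIO : 0;
}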
I'm not sure that changing the default values is the right solution
here. We saw the default values work on one of the boards, and since
NXP kept these values adjustable, I think there is a reason behind it.
Note: when I say one of the boards, I mean one board form, i.e. the
same board version, but with different HW.
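
Just to illustrate what I mean by keeping them adjustable: with an
optional per-board property, the driver can still fall back to the
high-accuracy defaults whenever a board does not override anything.
The property name "nxp,calib-avg-samples" below is hypothetical (not
an accepted binding); the point is the fallback pattern.

#include <linux/device.h>
#include <linux/property.h>

/* Hypothetical helper; "nxp,calib-avg-samples" is a made-up property */
static u32 imx93_adc_get_avg_samples(struct device *dev)
{
	u32 nrsmpl = 512;	/* silicon default, highest accuracy */

	/* Leaves nrsmpl untouched when the board sets no property */
	device_property_read_u32(dev, "nxp,calib-avg-samples", &nrsmpl);

	return nrsmpl;
}

Boards where the strict calibration fails (like ours) would then set
the property; everything else would keep today's behaviour.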
Best regards,
Andrej
[1] i.MX 93 Applications Processor Reference Manual, Rev. 4, 12/2023