On Mon, Dec 21, 2020 at 5:15 PM Guenter Roeck <linux@xxxxxxxxxxxx> wrote:

> > -		ret = iio_convert_raw_to_processed(channel, raw, &uv, 1000);
> > -		if (ret < 0) {
> > -			/* Assume 12 bit ADC with vref at pullup_uv */
> > -			uv = (pdata->pullup_uv * (s64)raw) >> 12;
> > +		/*
> > +		 * FIXME: This fallback to using a raw read and then right
> > +		 * out assume the ADC is 12 bits and hard-coding scale
> > +		 * to 1000 seems a bit dangerous. Should it simply be
> > +		 * deleted?
> > +		 */
>
> The hwmon ABI specifically supports unscaled values, which can then be
> scaled in userspace using the sensors configuration file.
> Given that we return the pseudo-scaled value to userspace today,
> it seems to me that it would do more harm to change that instead of just
> leaving it in place.

I see. I tried to drill down into the history of the driver, and in the
original commit all values are scaled with the function get_temp_mC(),
which indicates that the driver has always intended to return
millicentigrades, not unscaled values (as far as I can tell).

First, commit 9e8269de100dd0be1199778dc175ff22417aebd2
"hwmon: (ntc_thermistor) Add DT with IIO support to NTC thermistor driver"
adds the result >>= 12 on top of a raw IIO read, so there is a silent
assumption of a 12-bit ADC.

Then commit 0315253b19bbc63eedad2f6125c21e280c76e29b
"hwmon: (ntc_thermistor) fix iio raw to microvolts conversion"
calls iio_convert_raw_to_processed() to get around the 12-bit assumption,
instead adding a 1000x scale assumption on the value passed in from the
raw read. This just looks wrong: why would it be 1000 and not 1, like the
IIO core uses when we call iio_read_channel_processed()? It looks like it
is actually compensating for a bug in the ADC returning the wrong scale:
the author may have used a buggy ADC driver returning a scaling to volts
instead of millivolts, and this was a trial-and-error solution to that
bug in the ADC driver.
In that case it would be nice to know which ADC driver, so we can fix it!
I suspect maybe an out-of-tree ADC?

So there is first a hardcoded solution to handle raw 12-bit values, and
then a trial-and-error fix for the 1000x scaling in the code that was
supposed to make it generic.

The actual situation I have is that I used the thermistor for a thermal
zone, and AFAICT these require that the temperature be in millicelsius;
at least it will be very hard to handle a thermal zone in the device
tree that defines its trip points in millicelsius otherwise. So since
the driver sets HWMON_C_REGISTER_TZ, it might break the thermal zone
ABI rather than the hwmon/sensors ABI, right?

At least commit c08860ffe5c0e986e208e8217dae8191c0b40b24
"hwmon: (ntc_thermistor) Add ntc thermistor to thermal subsystem as a sensor"
adds a thermal zone calling the function get_temp_mc(), clearly requiring
a millicentigrade measure.

I also think it is indeed resulting in millicelsius on the platform that
commit 0315253b19bbc63eedad2f6125c21e280c76e29b is intended for; it is
just that it fixes a bug in the wrong place.

> Either case, calling iio_convert_raw_to_processed() does not seem to add
> value here. This is already done by iio_read_channel_processed().
> The best we can do is to use the original fallback.

I don't know if the patch is especially messy, but this part inside the
if (ret < 0) clause:

> > +		ret = iio_read_channel_raw(channel, &raw);
> > +		if (ret < 0) {
> > +			pr_err("read channel() error: %d\n", ret);
> > +			return ret;
> > +		}
> > +		ret = iio_convert_raw_to_processed(channel, raw, &uv, 1000);
> > +		if (ret < 0) {
> > +			/* Assume 12 bit ADC with vref at pullup_uv */
> > +			uv = (pdata->pullup_uv * (s64)raw) >> 12;
> > +		}

should be identical to the original fallback. If you want me to revise
the patch in some way, I can fix it. To me it looks like the fallback
will just work on a certain platform with an erroneous 1000x scaling;
it would be good to know which one so we can fix the actual problem.
Yours,
Linus Walleij