On Wed, 20 Nov 2013 09:29:44 -0800, Guenter Roeck wrote:
> Mike's graph is quite interesting - it shows that the temperature reading error
> is linear, at least for his CPU. Unfortunately, I don't think we can use
> that knowledge to "fix" the reading automatically, as the error is very likely
> different for other CPUs. We might consider adding an ideality factor module
> parameter, though. What do you think about that ?

Everyone can compute the formula and use libsensors to apply it. If the
user has to provide the value manually for each CPU sample anyway, it
might as well be done that way; there is no need to add a module
parameter. A single module parameter would also become a problem on
multi-socket systems: you'd need an array and a reliable way to map each
entry to the logical CPUs of a given socket (assuming the ideality
factor is per package... which may not always be true.)

> Another question is what temperature to use as tjmin. If we add an ideality
> factor module parameter, it could be quite low, such as 20 degrees C.
> We could even calculate tjmin based on the ideality factor if specified.
> tjmin = tjmax - (tjmax * ideality_factor / 100); /* ideality_factor in % */
>
> Otherwise I would prefer something higher, at least 30 degrees C.

Personally I'd just do the minimum to avoid returning an error. In other
words, I'd be fine returning values down to 6 degrees (for the Atom D510
at least). We know the value is wrong, but it can be corrected in
user-space, while if we clamp higher, it can no longer be corrected.

-- 
Jean Delvare
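
As a rough sketch of the libsensors approach mentioned above: since the
reading error is linear, a per-chip correction can be expressed as a
compute statement in sensors.conf (first expression converts the raw
value to the corrected one, the second does the reverse). The chip
name, scale factor and offset below are placeholders only; each user
would derive them from measurements of their own CPU sample:

    chip "coretemp-isa-0000"
        # hypothetical linear correction for temp2:
        # corrected = raw * 1.04 - 4, and the inverse mapping
        compute temp2  @ * 1.04 - 4,  (@ + 4) / 1.04

With such an entry in place, "sensors" and any libsensors client report
the corrected value, without any driver change or module parameter.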