Hi dee Ho again,

On 11/18/21 08:11, Matti Vaittinen wrote:
> Hi Linus,
>
> On 11/18/21 04:10, Linus Walleij wrote:
>> On Tue, Nov 16, 2021 at 1:26 PM Matti Vaittinen
>> <matti.vaittinen@xxxxxxxxxxxxxxxxx> wrote:
>>
>>> Support obtaining the "capacity degradation by temperature" - tables
>>> from device-tree to batinfo.
>>>
>>> Signed-off-by: Matti Vaittinen <matti.vaittinen@xxxxxxxxxxxxxxxxx>
>>
>> Same questions as on the binding patch.
>>
>> If we already support different degradation by temperature tables,
>> why do we need a second mechanism for the same thing?
>
> Thanks for bringing this up. As I said, I didn't notice that we could
> indeed use the CAP-OCV tables for different temperatures to bring in
> this information :) I see certain benefit from the possibility of not
> requiring to measure the OCV at different temperatures - but it may not
> be meaningful. As I replied to your patch 1/9 review - I need to (try
> to) do some more research...

I tried to do some pondering here today. Unfortunately, Friday afternoon is probably the worst time for it - my brain ceases operating in the afternoon, and doubly so on a Friday. Friday afternoons are good for babbling via email though ;)

I don't see how providing OCV tables at different temperatures gives us the degradation of battery capacity. Whoah, a big thought for a Friday. We get the OCV => SOC correspondence at different temperatures. I don't, however, see how this gives us the OCV => energy relation. As far as I know, both the OCV and the 'amount of uAh the battery is able to store' are impacted by a temperature change. This means that seeing OCV => SOC at different temperatures does not tell us what the impact of temperature on the OCV is, versus the impact on the SOC. For cases like the ROHM chargers, we are interested in how much the 'ability to store uAh' has changed due to the temperature.
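To illustrate what I mean (a toy sketch only - every name here is invented for illustration, nothing is from an actual driver): scale the designed capacity by a per-temperature degradation entry, then compare the coulomb counter against that scaled capacity.

```c
/*
 * Hypothetical sketch: the designed full capacity is first reduced by
 * a per-temperature degradation table entry (loss expressed in
 * parts-per-million), and the coulomb counter is then compared against
 * the reduced capacity.
 */

/* Capacity we can actually store at the current temperature. */
int effective_capacity_uah(int designed_cap_uah, int temp_degrade_ppm)
{
	return designed_cap_uah -
	       (int)((long long)designed_cap_uah * temp_degrade_ppm / 1000000);
}

/*
 * Remaining charge: what we can currently store minus what the coulomb
 * counter says we have consumed since the battery was last full.
 */
int remaining_charge_uah(int effective_cap_uah, int consumed_uah)
{
	int left = effective_cap_uah - consumed_uah;

	return left > 0 ? left : 0;
}
```

So with a 3000000 uAh designed capacity and a 10% (100000 ppm) loss at some temperature, the coulomb counter would be compared against 2700000 uAh, not the designed value.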
When we know the amount of uAh we can store, we can use the coulomb counter value to estimate what we still have left in the battery. In addition to this, we use the OCV information for the "nearly depleted battery" case - to improve the estimation with a zero-correction algorithm. I must admit Friday afternoon is not the time I can quite recap this part. I think it was something like:

1. Measure VBat with the system load (VBAT)
2. Find the OCV corresponding to the current SOC estimate (SOC based on the coulomb counter value) - OCV_NOW
3. Compute the VDROP caused by the load (OCV_NOW - VBAT)
4. Assume VDROP stays constant (or use the ROHM VDR parameters if provided)
5. Using VDROP, compute the OCV_MIN which matches the minimum battery voltage where the system is still operational
6. Use the difference between OCV_MIN and the "OCV at SOC0 from calibration data" to adjust the battery capacity.

(Explanation done at Friday afternoon accuracy here).

>> I'd just calculate a few tables per temperature and be done with
>> it.
>>
>> At least documentation needs to be updated to reflect that the two
>> methods are exclusive and you can only use one of them.

I don't see these as exclusive (on a Friday afternoon, at least). I think they can complement each other. The temp_degradation table gives us the temperature impact on the <energy storing ability>, eg, how much the battery capacity has changed from the designed one due to the temperature. The OCV-SOC tables at various temperatures tell us what the OCV looks like when we have X% of battery left at different temperatures. Estimating how much the X% is in absolute uAh can be done by taking into account the designed_cap, the aging degradation and the temperature degradation (and the position of the moon, the amount of muons created by cosmic rays hitting the atmosphere in the knee energy region, and so on...)

Or am I just getting something terribly wrong (again)?
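The zero-correction steps above might look roughly like this in C - again at Friday afternoon accuracy, with every name hypothetical (this is not the actual ROHM driver code):

```c
/*
 * Sketch of the zero-correction steps. All identifiers are invented
 * for illustration; voltages are in uV, capacities in uAh, SOC
 * fractions in 0.1% (per-mille) units.
 */

/* Step 3: voltage drop caused by the system load (OCV_NOW - VBAT). */
int compute_vdrop_uv(int ocv_now_uv, int vbat_uv)
{
	return ocv_now_uv - vbat_uv;
}

/*
 * Steps 4-5: assuming the drop stays constant, the open-circuit
 * voltage at which the loaded battery voltage hits the minimum the
 * system still tolerates is VSYS_MIN + VDROP.
 */
int compute_ocv_min_uv(int vsys_min_uv, int vdrop_uv)
{
	return vsys_min_uv + vdrop_uv;
}

/*
 * Step 6: shrink the usable capacity by the fraction of charge that,
 * per the calibration OCV curve, sits below OCV_MIN. Here the caller
 * has already converted the OCV_MIN vs "OCV at SOC0" difference into
 * that fraction (soc_below_min_pmille).
 */
int adjust_capacity_uah(int cap_uah, int soc_below_min_pmille)
{
	return cap_uah -
	       (int)((long long)cap_uah * soc_below_min_pmille / 1000);
}
```

For example, a 150 mV drop under load on top of a 3.4 V system minimum gives an OCV_MIN of 3.55 V, and if 5% of the charge sits below that OCV, a 3000000 uAh capacity shrinks to 2850000 uAh usable.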
:)

(I still, for example, like internal functions named as __foo() )

Yours,
	--Matti

--
The Linux Kernel guy at ROHM Semiconductors

Matti Vaittinen, Linux device drivers
ROHM Semiconductors, Finland SWDC
Kiviharjunlenkki 1E
90220 OULU
FINLAND

~~ this year is the year of a signature writers block ~~