Re: [RFC PATCH v3 3/9] power: supply: Support DT originated temperature-capacity tables

From: Vaittinen, Matti
Date: Thu Dec 02 2021 - 01:29:10 EST


On 12/2/21 03:57, Linus Walleij wrote:
> On Tue, Nov 30, 2021 at 7:33 AM Vaittinen, Matti
> <Matti.Vaittinen@xxxxxxxxxxxxxxxxx> wrote:
>
>> Well, I don't know how constant such degradation is over time. I just
>> guess it is not constant but might be proportional to age-compensated
>> capacity rather than the designed capacity. It'd be nice to use correct
>> approximation of reality in device-tree...
>
> IIUC the degradation of a battery is related to number of full charge cycles,
> i.e. the times that the battery has been emptied and recharged fully.
> This is of course never happening in practice, so e.g. two recharge cycles
> from 50% to 100% is one full charge cycle. So you integrate this
> over time (needs to be saved in a file system or flash if the battery does
> not say it itself).

Yes.
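Just to make the bookkeeping concrete, the integration you describe could be sketched roughly like below. This is only a sketch of the idea - the accumulated state would need to be persisted in a file system or flash as you say, and all names here are made up, not an existing kernel API:

```c
#include <stdint.h>

/*
 * Accumulate fractional charge cycles: each time the state-of-charge
 * rises, add the delta to a running sum. 1000 permille (100 percentage
 * points) of accumulated charge equals one full equivalent cycle.
 */
struct cycle_counter {
	uint32_t charged_permille;	/* total charged, 1/1000 of capacity */
};

static void cycle_counter_charge(struct cycle_counter *c,
				 uint32_t delta_permille)
{
	c->charged_permille += delta_permille;
}

/* Number of full equivalent cycles seen so far */
static uint32_t cycle_counter_full_cycles(const struct cycle_counter *c)
{
	return c->charged_permille / 1000;
}
```

So two recharges from 50% to 100% (500 permille each) sum up to one full equivalent cycle, as in your example.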

> This measures how much the lithium ions have moved around in the
> electrolyte and thus how much chemical interaction the battery has
> seen.
>
> Then the relationship between complete charge cycles and capacity
> degradation is certainly also going to be something nonlinear so it
> needs manufacturer data for the battery.

Right. As far as I understand, at least part of the 'aging degradation'
comes from battery materials 'vaporizing' when the battery is charged.
And as far as I understand, the temperature at which charging occurs has
a big impact on this. Eg, the higher the temperature during charging,
the worse the degradation. Which means that the cycle count should
actually be weighted by the charging temperature.
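Just to illustrate what I mean by weighting - something like the sketch below. The weighting factors here are purely invented for illustration; real values would need manufacturer data:

```c
#include <stdint.h>

/*
 * Weight a charge increment by the charging temperature before adding
 * it to the equivalent-cycle sum. The temperature thresholds and
 * factors below are invented for illustration only.
 */
static uint32_t temp_weighted_permille(uint32_t delta_permille, int temp_c)
{
	if (temp_c >= 45)
		return delta_permille * 2;	/* hot charging ages the cell faster */
	if (temp_c >= 30)
		return delta_permille * 3 / 2;

	return delta_permille;			/* room-temperature baseline */
}
```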

But this kind of missed my point :) I was thinking of how to give the
absolute (uAh) value of the capacity drop caused by the temperature. My
original RFC patch expressed this as a linear change of absolute uAhs
over a temperature range.

As you pointed out, we should not bake the linearity into the DT model.
So the next step would be to just give measured value pairs of absolute
uAh/temperature (measured by the battery vendor or computed on some
theoretical basis) - and leave the fitting of the data points to the SW.
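For the fitting part, a first cut in SW could simply interpolate linearly between the given data points. A minimal sketch of that (struct and function names made up, table values invented):

```c
#include <stdint.h>

struct temp_capacity {
	int temp_dc;	/* temperature, deci-degrees Celsius */
	int drop_uah;	/* measured capacity drop at this temperature, uAh */
};

/*
 * Linearly interpolate the capacity drop for @temp_dc from a table of
 * measured (temperature, drop) pairs sorted by rising temperature.
 * Temperatures outside the table are clamped to the nearest end point.
 */
static int interpolate_drop_uah(const struct temp_capacity *tbl, int len,
				int temp_dc)
{
	int i;

	if (temp_dc <= tbl[0].temp_dc)
		return tbl[0].drop_uah;
	if (temp_dc >= tbl[len - 1].temp_dc)
		return tbl[len - 1].drop_uah;

	for (i = 1; i < len; i++)
		if (temp_dc <= tbl[i].temp_dc)
			break;

	return tbl[i - 1].drop_uah +
	       (tbl[i].drop_uah - tbl[i - 1].drop_uah) *
	       (temp_dc - tbl[i - 1].temp_dc) /
	       (tbl[i].temp_dc - tbl[i - 1].temp_dc);
}
```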

What I was now considering is that the capacity drop (in uAhs) caused by
a temperature change may not be the same for a new and an old battery.
It sounds more logical to me that the capacity drop caused by the
temperature is proportional to the maximum capacity the battery has at
that point of its life. Eg, if a new battery can hold 80 units of energy
and drops 20 units when the temperature changes from T0 => T1, then a
badly aged battery which can now hold only 40 units would lose only 10
units at that same temperature drop T0 => T1. I was wondering if such an
assumption is closer to the truth than saying that both of the batteries
would lose the same 20 units - which would mean the new battery loses
25% of its energy at the temperature drop T0 => T1 while the old one
loses 50% of its capacity. I somehow think both batteries, old and new,
would lose the same % of capacity at the temperature change.

So, if this assumption is correct, then we should give the temperature
impact as a proportion of the full capacity with aging taken into
account. If we happen to know the aging impact on the capacity, then
software should apply the aging compensation prior to computing the
temperature impact. If the aging information or impact is not known,
then the designed capacity can be used as a fall-back, even though it
means we will probably be somewhat off for old batteries.
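If the 'same percentage' assumption holds, the computation would then boil down to something like this (a sketch only; the caller passes either the aging-compensated capacity or the designed capacity as a fall-back, and the function name is made up):

```c
#include <stdint.h>

/*
 * Compute the absolute capacity drop (uAh) caused by temperature,
 * assuming the drop is a fixed proportion of the aging-compensated
 * full capacity. @drop_ppt is the proportional drop, in parts per
 * thousand, for the temperature change in question.
 */
static uint32_t temp_drop_uah(uint32_t full_cap_uah, uint32_t drop_ppt)
{
	return full_cap_uah * drop_ppt / 1000;
}
```

With the numbers from the example above: a 25% (250 ppt) drop is 20 units for the new 80-unit battery but only 10 units for the aged 40-unit one.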

My problem here is that I just assume the impact of temperature is
proportional to the aging-compensated full capacity. Knowing how this
really behaves would be cool, so we could get the temperature impact
modelled correctly in DT.

>> By the way, I was reading the ab8500 fuel-gauge driver as you suggested.
>> So, if I am correct, you used the battery internal resistance + current
>> to compute the voltage-drop caused by battery internal resistance. This
>> for sure improves the accuracy when we assume VBAT can be used as OCV.
>
> Yes this is how it is done. With a few measurements averaged to
> iron out the noise.
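For reference, if I read the driver right, that compensation boils down to roughly the below (my sketch, not the ab8500 code; I assume discharge current is positive here):

```c
#include <stdint.h>

/*
 * Approximate the open-circuit voltage from the measured battery
 * voltage by compensating for the drop over the battery internal
 * resistance. Discharge current is taken as positive (assumption).
 */
static int ocv_uv(int vbat_uv, int ibat_ua, int r_internal_mohm)
{
	/* V = I * R; uA * mohm / 1000 gives uV */
	return vbat_uv + (int)((int64_t)ibat_ua * r_internal_mohm / 1000);
}
```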
>
>> Epilogue:
>> In general, I see bunch of power-supply drivers scheduling work for
>> running some battery-measurements. Some do this periodically (I think
>> the ab8500 did this although I lost the track when I tried to see in
>> which case the periodic work was not scheduled - and maybe for
>> fuel-gauging) or after an IRQ is triggered (for example to see if change
>> indication should be sent).
>
> Yes there is some tight community of electronic engineers who read the
> right articles and design these things. We don't know them :(

Right. By the way, I heard that TI has a patent protecting some type of
battery internal resistance usage here. OTOH, ROHM has a patent on some
of the VDROP value table stuff. Occasionally it feels like the ice is
getting thinner at each step here. :/

Best Regards
Matti Vaittinen

--
The Linux Kernel guy at ROHM Semiconductors

Matti Vaittinen, Linux device drivers
ROHM Semiconductors, Finland SWDC
Kiviharjunlenkki 1E
90220 OULU
FINLAND

~~ this year is the year of a signature writers block ~~