Batt_fs_voltsrc - How is Sag-Compensated Voltage Calculated?

Can someone help me understand the BATT_FS_VOLTSRC parameter?

How is the “Sag Compensated Voltage” calculated?


I found this:

/// get voltage with sag removed (based on battery current draw and resistance)
/// this will always be greater than or equal to the raw voltage
float AP_BattMonitor::voltage_resting_estimate(uint8_t instance) const
{
    if (instance < _num_instances && drivers[instance] != nullptr) {
        return drivers[instance]->voltage_resting_estimate();
    } else {
        return 0.0f;
    }
}
…But this code is way over my head. I can see it calls voltage_resting_estimate() on the driver, but my GitHub-fu is not very good and that is the end of my trail.

Git blame on GitHub says @rmackay9 and @Leonardthall added it. Can you help me understand how BATT_FS_VOLTSRC works with regard to the ‘sag compensated voltage’?


We estimate the impedance of the battery by looking at how the voltage sags as current increases. So every time the current goes up or down we improve our estimate of battery resistance.

We then calculate the sag-compensated battery voltage as Measured_Voltage + Measured_Current x Resistance.


Thanks! So is this a tool to mitigate false-positive battery failsafes, especially toward the end of a battery’s charge?

How does the battery handle that? Is it bad for the battery to draw deeper into its own sag, even if it’s only for a few moments of maneuvering at a time?

Is the impedance logged or can you query the impedance via mavlink? That would be a good way to monitor the aging of a battery from flight to flight and would make replacing a battery easier - before a failsafe occurs.

Regards Rolf


I don’t see it available via mavlink; the BATTERY_STATUS message has these fields:
BATTERY_STATUS {id, battery_function, type, temperature, voltages[], current_battery, current_consumed, energy_consumed, battery_remaining, time_remaining, charge_state}

But it looks like the resistance estimate is logged in dataflash as BAT.Res

Thanks for your help and information, Rick.
BAT.Res correlates very well with charge cycles and the stress the batteries have undergone.

Yes, this is the resistance calculation, but I do not believe we transmit it back over MAVLink.

This is to accurately predict the resting voltage when high currents are being drawn from the battery. Without it, high currents drop the measured battery voltage and trigger a low-battery warning when the pack may actually be only partially depleted.


Is the Sag-Compensated voltage determination applicable to Li-Ion chemistry?

Yes, it is applicable to all chemistries. All batteries have an internal resistance.

Thanks @Leonardthall - yes, of course.

As different cell chemistries behave differently, I was curious if there were operational considerations to keep in mind.

Maybe you are expecting the resting voltage to be perfectly accurate. It isn’t. For example LiPo batteries seem to have what I call a “Chemical Sag” where under high load they can’t seem to keep up with the current demand and the voltage drops by more than the internal impedance can explain. When the current draw is suddenly stopped the voltage does not immediately return to the true resting voltage but climbs slowly (something like a second) back up to that level.

I am not a battery physicist/chemist, so I don’t know, much less understand, how any individual chemistry creates these behaviours or how we could account for them to improve our understanding of battery state.

However, internal impedance is very simple and always present. We can account for this and provide the user a better understanding of battery charge state and also improve our battery failsafe reliability. We do what we can do. We are always happy to hear ways we could further improve these things if you have that knowledge.

Thank you @Leonardthall - your comments are very helpful.

While I think I have pretty good current measurement using MAUCH current sensors, I always compare the recorded current totals with the current consumed in recharging. There are losses in the discharge and recharge process, so 1000 mAh consumed in operations requires more than 1000 mAh to recharge. But as I work to tailor battery failsafes that neither deplete batteries beyond safe levels nor leave energy “on the table” (so to speak), the better I understand the topic, the more successful my efforts may be.

In working with Li-Ion cells, I’ve become more focused on watt-hour consumption rather than amperage. I’m wondering if watt-hours might be a better element for the battery “reserve” failsafe parameter - rather than amperage.

I’m fortunate - as are all ArduPilot operators - the DEVs have packed the software with so many recorded data elements.

Thank you,

Yes, I too have gone through this process with our Callisto 50 aircraft. The way I picture this problem: there are only so many atoms in the battery, and therefore only a limited number of electrons that can flow before the battery is depleted. This is directly related to the current through the system, not the voltage.

Power consumed is current x voltage, so the internal voltage drop due to internal impedance or other chemical reaction-rate limitations reduces the measured voltage and therefore the measured consumed power.

For example, if you short the output of your battery terminals, you will measure zero watt-hours consumed because you have zero voltage (shorted outputs). However, the same total charge will flow before the battery is discharged (assuming some imaginary battery that doesn’t explode during the experiment). This is because the battery dissipates all of that power in the internal impedance of the cells and wires.

So my approach is to use total current to measure the absolute depletion level of the battery, under the assumption that it is always fully charged at startup. Then I rely on the resting-voltage failsafe to catch those cases where the battery was not fully charged and gets close to fully discharged.

So I generally set the current consumption limit to about 30% remaining and the battery failsafe voltage level to about 20% remaining. However, I have noticed that the resting voltage at a given charge state drops as the battery ages, so I find I have to lower the resting-voltage threshold by at least half a volt for older batteries. (This is also why I don’t think we will ever be able to predict battery charge state from voltage at startup.)

Really annoying!

Then you can start talking about the stuff I added to limit current draw from the battery, to prevent damaging it with prolonged high current draw. A current limit is perfect for this, as the power dissipated in the battery is current squared x internal impedance. However, if we want to prevent damage to the motors, then we also need a power limit, because the stress on the motors is directly proportional to power, not current: as the battery voltage drops, the ESCs draw more current to maintain the same power output to the motors, and the motors don’t see the battery voltage at all.

Thank you.

Do I understand correctly from previous comments/threads that ArduPilot calculates battery circuit impedance rather than measuring it directly?

If this is true - that’s why I was thinking that watt-hour consumption might be a helpful measure of battery capacity or depletion.

Some months ago, when I spoke with someone at ZipLine about this topic, she mentioned measuring battery usage with what she called a “coulomb counter.” That agrees with your thinking that there are only so many electrons to move from anode to cathode; our task is to count them.

As far as I know, the only way to measure resistance is to measure both voltage and current and then do the calculation, so I am not sure what you are asking there. It is true that the internal impedance we calculate includes the wiring up to the voltage measurement point. Power is measured the same way.

I am not sure how this would change what I stated above about current vs power consumption.

Current is Coulombs per second, so current is the direct measurement of the number of Coulombs we pass each second, and the integrated current is the total number of Coulombs we have passed. So I think you summed up what I said perfectly.