More accurate battery monitoring in Watt-hours

As the voltage of a battery goes down, the current increases, since the flyer needs the same number of Watts to maintain flight regardless of the battery voltage. Because of this, calculating the remaining battery percentage based only on mAh is not as accurate as one based on Watt/Hours. With the current arrangement, the last 50% is not as "big" as the first 50%. Performing the calculation based on Watt/Hours is especially useful for those using batteries made from 21700 cells, since the cutoff voltage is about 60% of the fully-charged voltage. Has anyone else proposed that battery usage be calculated based on Watt/Hours instead of mAh?
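To put illustrative numbers on it (a hypothetical 200 W hover load on a 6S pack): since the power is roughly constant,

$$I = \frac{P}{V}$$

so the draw is about 200/25.2 ≈ 7.9 A at full charge but 200/19.8 ≈ 10.1 A near cutoff. Each mAh delivered late in the flight therefore carries roughly 20% less energy than one delivered early.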

You definitely meant Watt-Hour rather than Watt/Hour.

Currently the convention in this community is to use SoC (ideally measured from mAh used), and battery packs are described by the number of cells in series plus charge capacity.

After thinking it over a bit, this is probably a good idea. It would mean the top-end x Wh gives almost the same flight time as the bottom-end x Wh, with the discrepancy being the extra losses at the bottom end. The discrepancy is certainly worse when comparing by mAh.
Now I don't think there is much to gain from Wh for small drones, but it might be useful for larger ones.

1 Like

Well, mAh is printed on the battery packs as well as reported by the chargers, so you can check your battery monitor settings for consistency. Wh needs some maths.
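Though the maths is small. As a rough conversion (assuming a nominal 3.7 V/cell average under load):

$$E\,[\mathrm{Wh}] \approx C\,[\mathrm{Ah}] \times n_{cells} \times 3.7\,\mathrm{V}$$

e.g. a 5 Ah 6S pack holds roughly 5 × 6 × 3.7 ≈ 111 Wh.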

1 Like

You are correct. I do mean Watt-Hours. I’m embarrassed because I’m an EE.

I fly some fairly large flyers with long (45 min+) flight times, both planes and quads. Because of the Watt-Hour problem, setting RTL at 55% of battery capacity is a recipe for disaster. The problem also gives a false sense of security: "I landed with 59% battery left, so I can go twice as far." No you can't!

The new batteries based on 18650s or 21700s make the problem worse, since you can fly down to 2.85 V/cell (or so), and there is less of a "shelf" in the discharge curve compared with LiPos.

I considered building my own battery monitor that would calculate Watt-Hours and feed the FC a (false) current reading in order to fool it into giving a true "percentage left" number. That would work, but only for systems with separate current monitors, not the 4-in-1s or the "wing" type controllers.

This COULD also be done by MP or QGC, but that would be a hack and wouldn’t help the controller decide when it was time to fly home.

1 Like

Maybe it’s possible to write a script?

But can a script change what is sent over MAVLINK?

I didn't understand what you want to change.

I want to change the way the FC calculates battery usage. Measuring Watt-Hours used instead of milliamp-hours would make the calculation more accurate. The FC could use this information to better determine when it was time to RTL, and the operator could better judge how much energy was left.

A script onboard the drone (assuming a capable controller) could calculate the wattage from the current and voltage sensors. The script could then send the data back to the user as messages, and/or trigger RTL or whatever else you want.
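For example, a minimal (untested) sketch of a Wh counter as an ArduPilot Lua script. PACK_WH and the 30% warning threshold are made-up illustration values you would set for your own pack:

```lua
-- Integrate V*I from battery instance 0 into Watt-hours and warn the GCS.
local PACK_WH = 111.0        -- usable pack energy, e.g. a 5 Ah 6S pack (hand-entered)
local used_wh = 0.0
local last_ms = millis()

function update()
  local now_ms = millis()
  local dt_s = (now_ms - last_ms):tofloat() * 0.001
  last_ms = now_ms

  local volts = battery:voltage(0)
  local amps = battery:current_amps(0)   -- nil if no current monitor
  if volts and amps then
    used_wh = used_wh + volts * amps * dt_s / 3600.0   -- W*s -> Wh
    local pct_left = 100.0 * (1.0 - used_wh / PACK_WH)
    if pct_left < 30.0 then
      gcs:send_text(4, string.format("Energy low: %.0f%% Wh left", pct_left)) -- 4 = WARNING
    end
  end
  return update, 1000  -- run again in 1 s
end

return update()
```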

I understand all that. But my goal is to have the FC send the accurate data using MAVLINK. Is there a way to add data calculated by a script to the data sent by the FC so that it appears along with rest of the data and shows on my MP screen?

Hi @charles_linquist,

We have a BattEstimate.lua script that might be a good starting point.
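On the "shows on my MP screen" part: a script can also emit a NAMED_VALUE_FLOAT over MAVLink with gcs:send_named_float(), and Mission Planner will display it alongside the normal telemetry (e.g. in the Quick pane). A minimal sketch; "WH_USED" is an arbitrary label (the MAVLink name field is limited to 10 characters):

```lua
local used_wh = 0.0   -- stand-in; compute this from V*I integration in practice

function update()
  used_wh = used_wh + 0.01  -- placeholder for the real calculation
  gcs:send_named_float("WH_USED", used_wh)
  return update, 1000
end

return update()
```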

Re building your own battery: Huibean (VimDrones) and I are working on a smart battery for my aerial photography copter, with the hardest part being the open-source BMS using AP_Periph (hardware definition is here). I don't think the VimDrones BMS board itself (i.e. the hardware) is open source, but the software is.

I'll properly announce the BMS in the coming months once it's more fully functional.

Assuming a well-configured battery monitor, sag-compensated voltage might be a very good (best?) metric to monitor, and it is already available.
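For reference, the sag-compensated value is essentially an estimated resting voltage (assuming I have the implementation right):

$$V_{rest} \approx V_{meas} + I \cdot \hat{R}_{int}$$

where $\hat{R}_{int}$ is the autopilot's running estimate of the pack's internal resistance.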

Battery temperature also affects the voltage, which in turn affects the power delivered per amp delivered. The most accurate battery monitors (like in an electric car) use "coulomb counting" to assess the charge that has been stored and the charge that has been delivered, then a SoC/voltage curve with temperature correction to calculate the available energy remaining in the battery. That's a lot to do in the autopilot; usually you would have a dedicated BMS for it. It would be worthwhile in a larger vehicle with a long run time, especially if you have swappable battery packs. You could just have the BMS report the remaining capacity as external telemetry (most can do CAN or simple serial).
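In symbols, the coulomb-counting core is just

$$Q_{used}(t) = \int_0^{t} I \, d\tau, \qquad SoC = \frac{Q_{full} - Q_{used}}{Q_{full}}$$

with the remaining energy then read from the temperature-corrected SoC/voltage curve rather than computed from charge alone.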

1 Like

Yes, especially useful to compare different battery packs if you swap in a new one. Yeah, EVs use kWh for a reason. It is like the measure of fuel in a CV.

I played a bit with the idea, writing a simulation of my helicopter's battery.

I tested two estimates of Wh used, "at current sensor" (for now I assumed negligible losses behind the sensor) and "at voltage source" (sag compensated). Both are subject to an inaccurate capacity estimate. The issue is made bigger by the fact that, unlike capacity in Ah, Wh isn't provided by the charger, and even if it were it would need to account for charging losses.

The other source of inaccuracy is predicting power consumption/losses. Wh at sensor is very accurate if power consumption and the exact capacity after losses are known; however, if internal resistance changes, the prediction becomes inaccurate (in my model the error is a roughly constant 24 s over a 1385 s flight, a 1.7% overestimate, at a ~25% increase in internal resistance, and ~6.4% at a ~100% resistance increase). The big issue is that it does not converge to 0 on depletion.

The other option is counting Wh at the voltage source. This method is resistant to changes in the battery's internal resistance but makes predicting power draw harder, as losses vary a bit with voltage (due to internal resistance changes).

[plot: Ideal case (fresh calibration)]

[plot: Increased resistance from 0.15 Ω to 0.25 Ω for most of the hover]

[plot: Dynamic flight (power decreased 32 W → 29 W during 60-900 s)]

Wh (comp): energy consumed, calculated at the ideal source terminals
Wh (sens): energy consumed, calculated at the power monitor
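In symbols, with $R$ the pack's internal resistance and $V_{meas}$, $I$ the power-monitor readings:

$$E_{sens} = \int V_{meas}\, I \, dt, \qquad E_{comp} = \int (V_{meas} + I R)\, I \, dt = E_{sens} + \int I^2 R \, dt$$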

current loss is the current at measurement time, squared, times battery resistance
expected loss is the expected current at V_min, squared, times battery resistance

Additional simulations indicate that using the harmonic mean of the current loss and the loss at V_min should provide a pretty good, slightly conservative estimate of the hover time remaining.
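Roughly, with $P_{now} = I(t)^2 R$ (the current loss) and $P_{min} = (P_{load}/V_{min})^2 R$ (the expected loss at $V_{min}$), the estimate is something like

$$\bar{P}_{loss} = \frac{2\, P_{now} P_{min}}{P_{now} + P_{min}}, \qquad t_{rem} \approx \frac{E_{rem}}{P_{load} + \bar{P}_{loss}}$$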

Next I will test with data from a real flight.

2 Likes

Just nitpicking here: rename "current loss" to "power loss". I was confused at first.

"Current" refers to the time here: the loss is based on resistance and the current at the time of calculation.

Thanks for the work. I built a uC-based battery tester that seems to be really accurate. It calculates capacity and measures ESR three times during discharge. It does not connect to the balance terminals: it measures voltage without load, applies a "1 C" load, and measures voltage again. From those numbers it calculates the ESR, and it uses actual cell voltage to determine the cutoff (stop-the-test) voltage. But the capacity measurement is done at the terminals.
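The ESR falls out of that pair of readings directly:

$$R_{ESR} = \frac{V_{unloaded} - V_{loaded}}{I_{load}}$$

with $I_{load}$ the 1 C test current.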

I²R is power (Watts).

I thought of using an external uC that would feed calculated "fake" current readings to the FC in order to get more accurate percentage readings. My external processor could calculate ESR, but it wouldn't know if I changed battery sizes, something I can and do change often.