Thoughts on distance/altitude based intelligent battery failsafe

Years ago, when I was just starting training for my pilot’s license, I went out to fuel up and preflight the aircraft while my CFI was inside doing some paperwork. I knew the prior renter of the plane had just gotten back from a long weekend trip and that it would probably need fuel. So I checked the tanks first so I could get the fuel order in.

I sumped the left tank and nothing came out. Figured he must have run it really low before switching to the other tank. So I went to sump the right tank, figuring there must be plenty left in there. Nothing came out. Looked inside both wings with a flashlight: bone dry. Pulled the strainer drain and some fuel came out, but not much and not under the usual head pressure. The only fuel left in this plane was what was in the lines and strainer.

If that idiot had so much as another 2 mph headwind or had to fly a wider pattern for traffic, they would have been fishing him out of the harbor.

There are a lot of good ideas and also some misinformation in this thread. Creating an accurate SOC estimator is non-trivial. I ran R&D at Zero Motorcycles for 5 years, and it took many brilliant engineers years to get a “pretty good” SOC estimator. Here’s my 2 cents for an achievable and workable system.

There are essentially 2 problems you are tackling here.

  1. A State Of Charge (SOC) estimator. This requires knowledge of the battery’s actual capacity, health, discharge characteristics, and temperature.

  2. An RTL energy estimator. This requires knowledge of the distance and altitude to home, and the energy required to get there. You may need to consider wind, altitude, ambient temp, etc… (see the sketch after this list).
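
For problem 2, here’s a minimal sketch of what an RTL energy estimate might look like. All the names and numbers (cruise power, ground speed, efficiency, margin) are hypothetical placeholders, not anything from the actual code:

```python
# Rough RTL energy estimate (hypothetical names and values).
# Assumes a fixed cruise power and ground speed; wind, air density,
# and battery temperature would each need their own models.
def rtl_energy_wh(distance_m, climb_m, mass_kg,
                  cruise_power_w=250.0, ground_speed_ms=10.0,
                  climb_efficiency=0.5, margin=1.2):
    cruise_wh = cruise_power_w * (distance_m / ground_speed_ms) / 3600.0
    # Potential energy for any climb required, derated by drivetrain efficiency.
    climb_wh = max(climb_m, 0.0) * mass_kg * 9.81 / climb_efficiency / 3600.0
    return (cruise_wh + climb_wh) * margin

print(rtl_energy_wh(distance_m=1500, climb_m=30, mass_kg=2.0))  # ~12.9 Wh
```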

For all this to work you MUST have accurate battery data. Be aware that most hobby LiPos will deteriorate at something like 10% per 100 cycles. Some batteries and duty cycles will degrade even faster.

Cold weather reduces the voltage and total energy available from the battery. Again, I would ballpark around a 10% loss for a near-freezing day.

This means that unless you’re going to enter accurate capacity values for every flight, for each battery, and for the current conditions, you need to “pad” your FS by 20% to handle a battery that is slightly older and running on a cold day.
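
To put numbers on that pad (same ballpark figures as above):

```python
nominal_wh = 100.0      # capacity the user entered
age_factor = 0.90       # ~10% loss after ~100 cycles
cold_factor = 0.90      # ~10% loss on a near-freezing day
print(nominal_wh * age_factor * cold_factor)   # 81.0 -- roughly a 20% haircut
```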

If you build a FS like this, people are going to rely on it, whether that is your intention or not. And they are going to be mad when their vehicle fails to return as advertised. I’m not saying this is reasonable, only the reality.

My 2 cents: just have the vehicle FS at 50% SOC. If it got that far away on 50%, there is a good, though not guaranteed, chance it will make it home. Anything less is just opening the door for failure unless you have done an extremely good job on the first two points.

All that stuff about hobby LiPos is already the case today. There is already a failsafe based on the mAh capacity entered by the user, and all those factors already apply. So I don’t think this is introducing a radically new concept.

What’s been suggested is simply to use watt-hours instead of mAh.

The proposal here was to actually measure it. Put a fully charged battery in your aircraft and fly it to the safe voltage cutoff. Let the system tell you how many watt-hours it used. That is your battery’s actual energy capacity.
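
In code terms, that measurement is just integrating volts × amps over the whole flight. A minimal sketch (the sample source and rate are assumptions):

```python
# Integrate voltage * current over the flight until the safe voltage cutoff.
def measure_watt_hours(samples, dt_s=0.1):
    """samples: iterable of (voltage_v, current_a) pairs taken every dt_s."""
    wh = 0.0
    for v, i in samples:
        wh += v * i * dt_s / 3600.0   # energy in this sample interval
    return wh

# Sanity check: a steady 22 V at 20 A for one hour -> 440 Wh
print(measure_watt_hours([(22.0, 20.0)] * 36000, dt_s=0.1))
```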

It is actually quite easy to detect an aging battery if you do regular cell checks during charging. I would suspect anybody flying electrics is aware of that, and if the cell IR is up to 20 mΩ, or the pack is coming in at significantly lower voltage or more out of balance than it used to, it’s time to replace it for serious long-distance flight.

Yes, indeed - somebody just like the pilot that crashed a C-180 in my field will find a way around it. There’s somebody someplace working on finding a way around proper procedure and common sense every day. But that doesn’t mean we shouldn’t work on a better set of tools for the rest of us to keep better track of our aircraft’s performance capabilities.

And to combine it with energy calcs to get the aircraft home on time with fuel left.

To clarify for those that may not follow: mAh, or milliamp-hours, is a measure of charge, not energy. It has nothing to do with the energy capacity of the battery. It is a rating used by manufacturers to specify how many amps a battery can put out for a specific time, at a specific discharge rate, down to a specific cell voltage.

Watt-hours are an actual measurement of energy: power over time. In DC power, watts = amps × volts, so watt-hours = amps × volts × hours.

Watt-hours for battery capacity are used almost exclusively in battery power systems: off-grid, solar, battery backup, EVs, etc. It improves accuracy in energy calculations because it takes into account both voltage and current (watts) over time.
http://www.goalzero.com/solarlife/2013/05/28/clearing-up-the-question-of-battery-capacity-in-electronics/
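
A quick illustration of the difference (numbers are illustrative): two packs with the same mAh rating but different cell counts hold very different amounts of energy, which mAh alone cannot tell you.

```python
def pack_wh(cells, mah, nominal_v_per_cell=3.7):
    return cells * nominal_v_per_cell * (mah / 1000.0)

print(pack_wh(3, 5000))   # 3S 5000 mAh -> 55.5 Wh
print(pack_wh(6, 5000))   # 6S 5000 mAh -> 111.0 Wh
```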

I want to expound on this too. It improves the accuracy of the system. Measuring mAh requires a precisely calibrated current sensor, and the amp-hour capacity of a battery changes with different discharge rates.

Using watt-hours does not depend on this.

If an end user flies his/her aircraft to voltage cutoff and lets the system measure the watt-hours to make the settings, it does not matter whether the calibration of the voltage and current sensors is precise, because the same calibration values used for the watt-hour test apply to all further energy calculations done by the system.
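
A tiny numeric illustration of why the calibration cancels: if the current sensor reads, say, 10% high, both the measured pack capacity and the in-flight consumption are scaled by the same factor, so the fraction remaining is unaffected.

```python
gain_error = 1.10                 # sensor reads 10% high
true_capacity_wh, true_used_wh = 100.0, 40.0

measured_capacity = true_capacity_wh * gain_error   # from the cutoff test
measured_used = true_used_wh * gain_error           # from this flight

print(measured_used / measured_capacity)   # 0.4 -- same as the true 40/100
```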

But this is still no different than what we’re doing now as far as battery capacity determination. The user hand-keys in the capacity. How the user determines that capacity doesn’t really matter. Whether they blindly enter 5000 mAh because that’s what is printed on the battery, or they test it in flight and enter a real mAh value, the user interface is no different. The user keys in the battery capacity. We can’t force the user to measure it in flight, and we also can’t assume they did. All we can do is give the user the opportunity to enter the information, which we already do.

Today, it is measured in mAh. I would be adding a parameter for the user to enter the capacity in watt-hours. Ergo, this is not a huge change in how the battery capacity is determined. Literally, it’s changing the unit of measure. That’s all.

The estimation, notification, and failsafe drivers can work off of mAh or watt-hours, because again it is just a unit of measure. Watt-hours are obviously more accurate, but the concept will work with either. Even the old capacity-remaining failsafe can work the same way. Again, just the unit of measure changing.

In other words, I think the description of what to accomplish here is being made way more complicated than what we’re actually accomplishing :slight_smile:

Right. But that measurement is only valid for that particular battery, at that duty cycle, at that ambient temperature, etc… Assuming you have more than one battery, you will have to re-calibrate the system regularly with each battery, and if you forget or fail to do it correctly for each battery, you have introduced potential for failure.

Also, just to mention it: while Ah are not a measure of energy, they are a measure of capacity, and are better than Wh for determining SOC as a %. Check out Coulomb Counting…

Without a good SOC estimator that works well for all your batteries, and in every situation in which you intend to use it, an SOC-based FS is asking for trouble. The only way DJI gets away with it is that they have a BMS in every battery that gauges cell health and actual capacity, and even that is still less than perfect.

If you want to allow ~10% margin in your SOC estimate, ~10% margin in your RTL energy estimate, and to land with 10% reserves, you already need 30% SOC minimum. So you might as well RTL at 50% and land with 30% reserves, all else being equal. To cut it any finer than that, you need to provide the system with additional information it does not have (SOC, cell health, temp, wind, air density, etc…). Anyone using the existing capacity-based system to actually make it home alive performs all of these calculations mentally: how good is this battery, is it 100% charged, how far away will I possibly be, how strong are the winds, how cold is it, how fast does my vehicle cruise, what is cruise power, etc… Then we set an appropriate capacity reserve to overcome all potential challenges and make it home.

Allowing people to believe they can rely on a system which is essentially blind to the necessary information for these calculations is just asking for Murphy’s law to be applied in a vigorous manner.

Don’t get me wrong, I love the general concept, and I might even use it. But in the end it would only allow me to cut a slightly finer estimate than what I do mentally, it does not obviate the need to understand all the finer details to use it properly, and incorrect usage is likely to cause more problems than it solves.

Maybe. I’ve been looking for a system that works with fuel helicopters, which is what I primarily fly, as well as electric. I have a timer on the radio that tells me how much time I’ve got until the heli goes into autorotation. But it doesn’t do the time-over-distance calculations that would be nice to have - sort of like an RTL time that displays in a dropdown on the distance-from-home readout in Tower.

Tower already has a nice flight timer that agrees with my radio timer, and it downloads the RTL speed from the params prior to flight. Right there I’ve got speed, distance, and time. It could be done in the ground station software too, and that’s what I’ve actually looked at. I’ve downloaded and installed the Android SDK, intending to look at it at some point and see if I can figure out how to do it.

The only problem is, Ah cannot be used to determine flight time left. It is only a measure of amps over time. When the battery is fully charged, fewer amp-hours are used per minute of flight because the voltage is higher. As the voltage sags while the battery discharges, more amp-hours are used per minute. The watts required to maintain powered flight stay the same regardless.
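
That is the nice property of watt-hours: since the power needed to stay airborne is roughly constant while the voltage sags, remaining energy maps directly to remaining time. A one-liner sketch:

```python
def minutes_remaining(remaining_wh, avg_power_w):
    return remaining_wh / avg_power_w * 60.0

print(minutes_remaining(remaining_wh=50.0, avg_power_w=300.0))  # 10.0 minutes
```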

This is the standard in all EVs. Look up the specs on the Tesla Model 3: battery capacity 65 kWh Li-ion, optional 75 kWh.

I fully understand, which is why I said “determining SOC as a %”. You cannot derive battery SOC % or remaining energy from Wh consumed, except in very controlled situations. Instead, it is industry standard to calculate SOC as a %, then apply that to the battery’s assumed capacity at the last 100% charge to determine remaining energy.

Just because EV manufacturers list the advertised capacity of the battery in kWh does NOT mean that is the number they are using for SOC and range calculation, because an EV’s battery capacity is NEVER that advertised figure, and it changes up and down on a daily basis, though with an always-downward trend obviously :slight_smile: Please trust me when I tell you modified Coulomb Counting is the industry standard. Of course, every manufacturer closely guards their specific SOC algorithms.
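
For anyone curious, a bare-bones Coulomb counter looks something like the sketch below. The real, industry versions layer on open-circuit-voltage corrections, temperature compensation, and capacity re-learning, which is where the hard work lives:

```python
class CoulombCounter:
    """Minimal SOC-by-charge-integration sketch, not a production BMS."""
    def __init__(self, capacity_ah, soc=1.0):
        self.capacity_ah = capacity_ah   # assumed capacity at last full charge
        self.soc = soc                   # 1.0 == 100%

    def update(self, current_a, dt_s):
        # Subtract the charge used this interval as a fraction of capacity.
        self.soc -= (current_a * dt_s / 3600.0) / self.capacity_ah
        return self.soc

cc = CoulombCounter(capacity_ah=5.0)
for _ in range(1800):                    # 30 min at a steady 5 A draw
    cc.update(current_a=5.0, dt_s=1.0)
print(round(cc.soc, 2))                  # 0.5 -> 50% SOC
```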

For a specific example:

Let’s say we have a 6S 5000 mAh battery, and at 70 °F ambient it can provide 108 Wh down to an 18 V (3 V/cell) cutoff point. Assume a 1C discharge rate and 3.6 V/cell nominal voltage. On a very cold day (40 °F) we might see a 0.1 V/cell (or more!) drop in nominal voltage. This means we only get 105 Wh, almost 3% less energy. If you are using Wh to calculate your SOC, you will be almost 3% off. In the same scenario the current goes up by 3%, but the Ah capacity remains nearly identical, and thus integrating amps still yields an accurate SOC %.
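
Spelling that example out with the same assumed numbers:

```python
cells, capacity_ah = 6, 5.0
warm_v, cold_v = 3.6, 3.5                # ~0.1 V/cell sag on the 40 F day

warm_wh = cells * warm_v * capacity_ah   # 108.0
cold_wh = cells * cold_v * capacity_ah   # 105.0
print((warm_wh - cold_wh) / warm_wh)     # ~0.028 -> almost 3% less energy
# The Ah capacity is nearly unchanged, so integrating amps (Coulomb
# counting) still gives an accurate SOC % on the cold day.
```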

To be clear, when a battery is colder, the voltage sags and the Wh goes down, but the available Ah capacity remains (very nearly) the same. The same applies when slamming a battery at higher C-rates: it’s the voltage sag that reduces total energy, not fewer Ah. (And yes, you do lose some Ah capacity, but it is generally less of an effect than the voltage sag.) Ah capacity tends to correlate with cell health. Voltage tends to be a more variable day-to-day fluctuation; Ah capacity tends to follow a long-term downward trend.

I don’t have any cell discharge curves on me, but there’s plenty on the interwebs to support this.

So if you want to set your FS by using SOC %, then you should use the most accurate SOC method you have available. If you want to set your FS using a consumed amount of energy, you can do that also, but it does not guarantee a specific energy reserve. Both methods require characterization of the battery’s capacity (Ah or Wh) at a duty cycle similar to the actual application.

Using the previous example of a 0.1 V/cell drop due to cold temps:

Using the Wh reserve method would yield ~3% less than the intended reserve.

Using an SOC % reserve method would yield <1% less than the intended reserve.

In extreme conditions or with a very unhealthy battery, these differences can become even more exaggerated.

Sorry, it’s late and I’m rambling; hope this makes sense. I suggest you google SOC estimation and Coulomb Counting; there’s lots of better info out there than what I’m providing here.

Guys,
development is an iterative process. It is impossible to build such a complex feature in a single try.

Estimating by SOC % could be a candidate for a future improvement. I believe there is no sense in debating it a lot at the moment :slight_smile:

Estimating by watt-hours was chosen just because it ensures acceptable estimation accuracy, and it is relatively simple.

Another point was about how to manage the estimator if you fly different batteries. I created this ticket some time ago. Please share your thoughts :slight_smile:

My idea has always been to have a system like the regular FAA flight plan used by full-size pilots. For the non-pilots in the discussion, this is what one looks like. Most of it doesn’t apply, but note Box 10, enroute time, and Box 12, hours and minutes of fuel on board.

In the real world, pilots are required to know the fuel burn of their aircraft. If the tanks are full, you know how many gallons you have. If they are partially full, you stick ’em, refer to the chart in the flight manual, and calculate it. You NEVER rely on the gauges.

And enter these values into the flight plan in the ground station. From that point it would be pretty simple to set off an alarm for low fuel vs. RTL time and distance, etc…
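
As a sketch of that check (the function name and the 1.5× reserve factor are made up for illustration), the ground station already has everything it needs: fuel time from the flight plan, distance from telemetry, RTL speed from the params:

```python
def should_alarm(fuel_minutes_left, distance_home_m, rtl_speed_ms,
                 reserve_factor=1.5):
    """Alarm when fuel time approaches the time needed to fly home."""
    rtl_minutes = distance_home_m / rtl_speed_ms / 60.0
    return fuel_minutes_left <= rtl_minutes * reserve_factor

print(should_alarm(fuel_minutes_left=4.0,
                   distance_home_m=2400, rtl_speed_ms=12.0))   # True
```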

I think the problem comes in where RC pilots want to be more dependent on automation to do everything for them so they don’t really have to know what they’re doing. So if the aircraft runs out of fuel and crashes 1/4 mile from home it wasn’t the pilot’s fault - it was bad software.

For fuel RC pilots, we have to know how to do it because we don’t have any sort of PM measuring fuel flow and tank capacity. I think electric pilots should learn how to do it too, because it’s not that complicated. And the percentage of energy left in LiPos is pretty well known from their at-rest voltage in a cool state, if you take off with partially charged packs (like two subsequent flights without changing batteries).
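
For reference, a rough resting-voltage-to-SOC lookup might look like the sketch below. These numbers are ballpark values from memory and vary with cell chemistry, health, and temperature; treat them as illustrative only:

```python
# Approximate LiPo resting voltage (cool pack, no load) vs. SOC %.
REST_V_TO_SOC = [(4.20, 100), (4.10, 90), (4.00, 75), (3.90, 60),
                 (3.85, 50), (3.80, 40), (3.75, 30), (3.70, 20)]

def soc_from_resting_voltage(v_per_cell):
    for v, soc in REST_V_TO_SOC:
        if v_per_cell >= v:
            return soc
    return 10   # below the table: treat as nearly empty

print(soc_from_resting_voltage(3.82))   # ~40%
```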

You can try to automate everything, but it becomes more complex. And the more complex it is, the more problems it has. So the question is, can we expect the hordes of multi pilots to take responsibility for calculating the flight time of their aircraft for a flight plan? Probably not. DJI set the bar for the whole industry to where people who can’t fly a paper airplane without wrecking it are flying drones. And 99% of them probably don’t have a single clue how to calculate it, even if you put the numbers in front of them and showed them how. They want the drone to do it all. The sad part is that automation is not 100% reliable.

Most of the power modules I have seen use a time constant of about 2 seconds. Even though the analog outputs from the power module are frequently sampled (1 Hz or faster), the information reported to the flight controller already has a lot of low-pass filtering applied.

Would a smart battery solution help in all of this?

What information would the smart battery need to provide to the flight controller (via I2C most likely) ?

I am playing with the idea of developing the necessary electronics with a friend over the winter to convert a regular battery into a smart battery… not sure yet if we will do so… thus looking to see how helpful the project could be…

Christian

@apache405 I thought all of the current/voltage PM outputs were analog, but I haven’t investigated them a lot.
If the current is sampled (either by the PM or by the FC) at 1 Hz or faster, that should be OK, as long as you aren’t changing your throttle up and down much faster than the sampling rate.

They are analog. The first filter the current signal encounters is purely analog in the form of a resistor-capacitor low pass filter.

The hardware signal flow is typically this:

Shunt → amplifier (with low pass) → wire harness → ADC (on the FC).

I am not sure what the precise flow is once the ADC makes a value available to software.
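
For anyone wanting to see what that filtering does to the numbers, here’s an illustrative first-order low pass in software with the ~2 s time constant mentioned above (the real filter on the PM is the analog RC stage; this is just the equivalent math):

```python
class LowPass:
    """First-order low pass: y += alpha * (x - y)."""
    def __init__(self, tau_s, dt_s):
        self.alpha = dt_s / (tau_s + dt_s)
        self.y = 0.0

    def update(self, x):
        self.y += self.alpha * (x - self.y)
        return self.y

lp = LowPass(tau_s=2.0, dt_s=0.1)
for _ in range(20):            # 2 s of a 10 A step input
    lp.update(10.0)
print(round(lp.y, 1))          # ~6.2 A -- one time constant reaches ~63%
```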

It should, or at least get to the 80% mark. Many of the fuel gauge data items can be accessed by extending the smart battery driver to be a full implementation of the SMBus Smart Battery Specification. For any items missing from it, adding a specific call to the fuel gauge system would be suitable/reasonable.
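
As a sketch of what polling those items could look like (the bus number and 0x0B address are assumptions, 0x0B being the usual smart battery address), using a few registers straight from the Smart Battery Data Specification:

```python
from smbus2 import SMBus

SBS_ADDR = 0x0B
REGS = {
    "voltage_mv":            0x09,
    "current_ma":            0x0A,   # signed in the spec; decode accordingly
    "relative_soc_pct":      0x0D,
    "remaining_capacity":    0x0F,
    "full_charge_capacity":  0x10,
    "run_time_to_empty_min": 0x11,
}

with SMBus(1) as bus:                 # bus 1 is an assumption
    for name, reg in REGS.items():
        print(name, bus.read_word_data(SBS_ADDR, reg))
```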

Also, I have drawings for doing the hardware side of a dumb-to-smart battery conversion kit; however, I don’t have the user-facing configuration software problem handled, and asking the end user to figure out bqStudio (TI’s tool for smart battery/fuel gauge development) seems like a high-risk idea to me.

This won’t work on multis, but on helicopters there are no decent power modules that can handle 50 V and 10 kW power peaks. I figured out how to get the telemetry data out of a Castle Phoenix ICE2 HV-series helicopter ESC. It has built-in data logging; here’s a snippet of what it logs, taken from my Synergy 626:

I can get the power readings, including ESC operating temp and headspeed, out to the RC radio telemetry, and that is the direction I’m going. Trying to do “smart batteries” and/or having the software do everything just muddies the code, which is complicated enough already, and introduces more points of failure. All I need is the gauges to give me the information.

@ChrisOlson support for CastleLiveLink would be a great feature!

I will note that the current readings logged by ICE/Edge controllers are incredibly inaccurate at low current ranges (i.e. <10 A), although this is probably partially related to the sampling frequency and the controller not having enough logging space to store full flight logs.

It’s still worthwhile though, as it’s data that is freely available if you are using that family of controllers, and you get fundamental temp and RPM monitoring, which is also valuable.