I did set it to 17.9, but I don't think it's even remotely correct. The site says 1/10th mV/A, which would be 179.
What should the ArduCopter value be then?
Has anybody measured actual consumption using this PDB? With 17.9 it says my six motors on a Tarot use 5 A to loiter, and while that is flattering, it is not true.
1/10th of a millivolt per amp is a strange way to specify that, but try using 179 instead of 17.9.
I'd have to test one and measure it, or see the circuit diagram, to know more.
Anyway, the best way to set the current scale: hover for a while to use up a good proportion of the battery pack, land, immediately charge the battery, and note how many mAh the charger put back in.
Examine the logs and see what ArduCopter thought you used at 179 amps/volt. New amps/volt = old amps/volt × charger mAh ÷ logged mAh.
You can fine-tune this over several flights, and this method works really well when you don't have a 100 A current meter that weighs nothing.
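The update step above is just one multiplication; here it is as a tiny sketch (the function name and the example numbers are my own, not from ArduPilot):

```python
def new_amps_per_volt(old_amps_per_volt, charger_mah, logged_mah):
    """Rescale the current-sensor gain so the logged mAh matches
    what the charger put back into the pack."""
    return old_amps_per_volt * charger_mah / logged_mah

# Example: flew at 17.9, charger put back 2400 mAh, log showed 2100 mAh.
print(new_amps_per_volt(17.9, 2400, 2100))  # ≈ 20.46
```

Repeat over a few flights and the correction factor converges toward 1.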
This is the exact method I use to calibrate current sensors, and in 2 or 3 flights you can easily get to within ±50 mAh of total current consumed.
It’s a simple ratiometric calculation of the amps/volt factor based on the mAh you put back into the battery on charge compared to the total mAh shown in the flight log.
AFAIK everyone has been re-posting the same information found here since 2014. Not rocket science, but before this I was using a current meter and running the props for upward thrust. A head-slapper for sure.
They simplified it a bit: there is also a step where you need to get a correct value of BATT_AMP_OFFSET. For that you need an ammeter, to match the metered idle consumption (perhaps with a minor consumer such as a VTX attached) to the current shown on the Mission Planner screen.
The process is iterative, as it also requires an approximately correct value of BATT_AMP_PERVLT.
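For what it's worth, ArduPilot's analog current estimate is of the form (sensor volts − BATT_AMP_OFFSET) × BATT_AMP_PERVLT, so if you can meter two operating points (say idle and a heavier load) you can solve both parameters at once rather than iterating. A rough sketch, with made-up helper names:

```python
def solve_offset_and_gain(v_idle, a_idle, v_load, a_load):
    """Fit amps = (sensor_volts - offset) * amps_per_volt through
    two metered points: (v_idle, a_idle) and (v_load, a_load)."""
    amps_per_volt = (a_load - a_idle) / (v_load - v_idle)  # BATT_AMP_PERVLT
    offset_volts = v_idle - a_idle / amps_per_volt         # BATT_AMP_OFFSET
    return offset_volts, amps_per_volt
```

In practice the load point is hard to meter in flight, which is why the charge-back method above is the usual fallback for the gain.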
Sure, but all I’m interested in is a good approximation of mAh consumed. I have been flying to 85% of pack capacity by logged mAh with this method for years and it hasn’t let me down once.
Same here. I’ve been using ScottFly’s method for a very long time, but I only go to 80% discharged. After a couple of iterations you can get the discharged vs. charged numbers to well within ±100 mAh.
Thanks for the information. I’ve been using this method for years and it worked well until last week, when I changed my charger. I went from a Turnigy Reaktor to an ISDT, and now my logged mAh is 30% to 40% higher than what the new charger reports. So my question is: how do I know which is correct, the logged readings, which were within 5% to 10% of my old charger, or the mAh reported by the new charger?