We’ve been facing an issue with the power monitor module used in our drone for current and voltage sensing. We are using the Radiolink Power Monitor module with ArduCopter V4.1.5 (1002fd6e). The problem lies in calibration: we’ve observed inconsistent readings between ground tests and in-flight performance.
Here’s the situation:
Ground Calibration:
We calibrate the module at 16V with a load of 1.7 ohms.
We validate it by testing with 2.5 ohms and 5 ohms, and the amp/volt factor is around 14, giving accurate current readings on the ground.
In-Flight Behavior:
While flying, the amp/volt factor needs to be adjusted to values like 35, 38, 40, 45, or even 50 for accurate readings.
This means we have to recalibrate the module every time before a flight, which is far from ideal.
We suspect the issue could be related to factors like electromagnetic interference, vibrations during flight, temperature variations, or maybe some limitation in the module’s firmware or hardware.
Has anyone else encountered a similar issue with power modules or current sensors? If yes, how did you resolve it? Any troubleshooting tips or recommendations would be greatly appreciated.
Looking forward to your suggestions!
This is the Amp per Volt when we calibrated Radiolink power module on ground:
We used a generic power module before this, which showed an average current of 23 A. Then we switched to the Radiolink power module, which was inaccurate at current sensing; once we adjusted BATT_AMP_PERVOLT, we got the expected current readings.
This is the current measured using generic power module:
I don’t know how good the Radiolink product is, but in general I doubt the calibration method. Calibration must be carried out at a realistic current that corresponds at least to the average current of the system, so I consider a 1.7-ohm load borderline, and 2.5 or even 5 ohms far too low a load. During the load test, the exact current and the measured voltage must be sampled at the same time; a value calculated purely from the battery voltage and the resistance is not sufficient.
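To make the point concrete, here is a quick check of the currents those resistive loads imply, assuming a 16 V supply and ignoring lead and shunt resistance:

```python
# Rough check of the calibration currents implied by the resistive loads
# mentioned above, assuming a 16 V supply (leads/shunt ignored).
SUPPLY_V = 16.0

for load_ohms in (1.7, 2.5, 5.0):
    current_a = SUPPLY_V / load_ohms  # Ohm's law: I = V / R
    print(f"{load_ohms:>4} ohm load -> {current_a:.1f} A")
```

That is roughly 9.4 A, 6.4 A and 3.2 A — all well below the ~23 A average the generic module reported in flight, which is why a calibration that looks good on the ground can be far off in the air.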
The calibration can be determined better over a longer test period using the following method:
Set BATT_AMP_PERVLT to the default value.
Charge your battery to 100%.
Make a logged test flight/drive. The battery should go down by a significant amount (e.g. 50%).
Recharge your battery to 100% and note the mAh the charger puts back in.
Calculate: New BATT_AMP_PERVLT = Default BATT_AMP_PERVLT * Recharged mAh / Logged mAh
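The steps above boil down to one correction factor. A worked example with hypothetical numbers (default factor 17.0, log reports 2100 mAh consumed, charger puts back 2900 mAh):

```python
# Worked example of the recalibration formula above; all values hypothetical.
default_amp_pervlt = 17.0   # BATT_AMP_PERVLT before the test flight
logged_mah = 2100.0         # consumed mAh reported in the dataflash log
recharged_mah = 2900.0      # mAh the charger put back in afterwards

# If the charger put back more than the log claims was used, the sensor
# under-reads, so the amps-per-volt factor must be scaled up.
new_amp_pervlt = default_amp_pervlt * recharged_mah / logged_mah
print(f"New BATT_AMP_PERVLT = {new_amp_pervlt:.2f}")
```

Because it integrates over a whole flight, this averages out the short-term noise that makes single-point ground calibration unreliable.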
The internal resistance of the battery changes, so the difference between the resting voltage (measured with no load) and the voltage under load (measured while current is flowing) varies considerably. This is simply Ohm’s law, and the varying internal resistance makes it very hard (practically impossible) to get this right using constant parameters.
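A small illustration of that sag, with hypothetical numbers (16 V resting voltage, 20 mΩ internal resistance):

```python
# Illustration of voltage sag under load, purely from Ohm's law.
# Both values below are hypothetical example figures.
resting_v = 16.0    # open-circuit battery voltage
internal_r = 0.020  # internal resistance in ohms (grows as the pack ages)

for current_a in (0.0, 10.0, 40.0):
    loaded_v = resting_v - current_a * internal_r
    print(f"{current_a:>5.1f} A -> {loaded_v:.2f} V at the terminals")
```

The same pack reads 16.0 V at rest but only 15.2 V at 40 A, and that 0.8 V gap shifts again as the internal resistance changes with temperature and age.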
The battery internal resistance also changes from battery to battery and increases as the battery ages.
ArduPilot has the ability to measure the current and estimate the resting voltage even while current is being drawn from the battery. It dynamically estimates the varying battery internal resistance. This gives you a consistent voltage measurement, as if there were no internal resistance at all.
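Conceptually, the sag compensation works by fitting the relationship V = V_rest − R·I from logged voltage/current pairs and adding the estimated sag back. This is a simplified sketch of the idea with hypothetical samples, not ArduPilot’s actual filter:

```python
# Simplified sketch of sag compensation, NOT ArduPilot's actual implementation:
# fit V = V_rest - R*I from (voltage, current) samples, then add the
# estimated sag back to recover a resting-voltage estimate.
def estimate_internal_resistance(samples):
    """Least-squares slope of voltage vs current; R is the negated slope."""
    n = len(samples)
    mean_i = sum(i for _, i in samples) / n
    mean_v = sum(v for v, _ in samples) / n
    num = sum((i - mean_i) * (v - mean_v) for v, i in samples)
    den = sum((i - mean_i) ** 2 for _, i in samples)
    return -num / den  # voltage falls as current rises, so slope is negative

# Hypothetical (voltage, current) pairs as they might appear in a flight log
samples = [(15.9, 5.0), (15.6, 20.0), (15.3, 35.0)]
r_est = estimate_internal_resistance(samples)

v_now, i_now = 15.3, 35.0           # latest measurement under load
v_resting = v_now + i_now * r_est   # add the estimated sag back
print(f"R ~ {r_est * 1000:.0f} mOhm, resting voltage ~ {v_resting:.2f} V")
```

With these samples the fit recovers about 20 mΩ, and the compensated voltage stays near 16 V regardless of the instantaneous load.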
Needless to say, the ArduPilot Methodic Calculator software configures the vehicle to use this method; had you used it, you would never have had this issue in the first place.