Hi all! I'm running a Pixhawk 6X with Copter-4.6.2 firmware on my drone.
Goal: Hold the drone's position over the ground using computer vision (CV) in the absence of a GPS signal.
Method: A companion computer with a camera estimates the drone's position and sends it to the flight controller (FC) as a virtual GPS via the MAVLink GPS_INPUT message over Ethernet.
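For concreteness, here is a minimal sketch of how a position estimate can be packed into GPS_INPUT fields on the companion side (illustrative helper names; scaling follows the MAVLink common dialect: lat/lon in degE7, altitude in metres, time as GPS week + ms-of-week). The actual send call, e.g. pymavlink's `master.mav.gps_input_send(...)`, is omitted:

```python
import time

GPS_EPOCH_UNIX = 315964800        # 1980-01-06 00:00:00 UTC in Unix time
GPS_UTC_LEAP_SECONDS = 18         # current GPS-UTC offset (assumption)
WEEK_SECONDS = 7 * 24 * 3600

def gps_input_fields(lat_deg, lon_deg, alt_m, unix_time=None):
    """Pack a position estimate into GPS_INPUT-style fields
    (field names and units per the MAVLink common dialect)."""
    if unix_time is None:
        unix_time = time.time()
    gps_seconds = unix_time - GPS_EPOCH_UNIX + GPS_UTC_LEAP_SECONDS
    return {
        "time_usec": int(unix_time * 1e6),
        "gps_id": 0,
        # Not supplying velocities here, so tell the EKF to ignore them:
        # VEL_HORIZ (8) | VEL_VERT (16) | SPEED_ACCURACY (32)
        "ignore_flags": 8 | 16 | 32,
        "time_week": int(gps_seconds // WEEK_SECONDS),
        "time_week_ms": int((gps_seconds % WEEK_SECONDS) * 1000),
        "fix_type": 3,                     # 3D fix
        "lat": int(round(lat_deg * 1e7)),  # degE7
        "lon": int(round(lon_deg * 1e7)),  # degE7
        "alt": float(alt_m),               # metres, AMSL (float, not mm)
        "hdop": 1.0,
        "vdop": 1.0,
        "satellites_visible": 12,
    }
```

These fields map directly onto the arguments of pymavlink's `gps_input_send`; if velocities are not provided, the corresponding `GPS_INPUT_IGNORE_FLAG_VEL_*` bits should be set, otherwise zeros get fused as measured velocity.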
Companion’s Operation Scheme:
- Option A (ArUco marker): Calculates the drone’s coordinates relative to a stationary marker with known real-world coordinates.
- Option B (Optical flow): Integrates frame-to-frame displacements to estimate position.
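Both options boil down to a local North/East offset in metres from a known origin (the marker for Option A, the takeoff point for Option B), which then has to be converted to WGS-84 lat/lon for GPS_INPUT. A minimal flat-earth sketch of that conversion (illustrative helper, valid only for small offsets):

```python
import math

R_EARTH = 6378137.0  # WGS-84 equatorial radius, metres

def ne_offset_to_latlon(lat0_deg, lon0_deg, north_m, east_m):
    """Convert a local North/East offset (metres) from a known origin
    into WGS-84 lat/lon using an equirectangular approximation.
    For Option B, north_m/east_m are the integrated per-frame
    optical-flow displacements."""
    dlat = math.degrees(north_m / R_EARTH)
    dlon = math.degrees(east_m / (R_EARTH * math.cos(math.radians(lat0_deg))))
    return lat0_deg + dlat, lon0_deg + dlon
```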
Problem: Stabilization in Loiter mode does not work. After one or two corrective iterations, the drone begins to fly away uncontrollably. The CV algorithms and the MAVLink link themselves have been tested and work correctly in Precision Loiter/Land modes when a real GPS is present.
Conducted Experiments and Their Results:
In all scenarios, takeoff was performed in Stabilize/AltHold, followed by a transition to Loiter. When the drone “wandered off,” it was recovered using manual control.
- Experiment with ArUco marker:
- Configuration: Two GPS receivers in the system.
- GPS1: Virtual, calculated from the marker (sent via MAVLink).
- GPS2: Physical sensor on the drone (used only for discrepancy monitoring).
- Result: The drone does not hold position, drifts away after a couple of iterations.
- log: https://drive.google.com/file/d/1cDjqb92v_nqWxsPeFu7DRC0NeDM1QL_8/view?usp=drive_link
- Experiment with Optical Flow:
- Configuration: Same as Experiment 1.
- GPS1: Virtual, calculated by integrating optical flow displacements.
- Result: Behavior similar to experiment 1.
- log: https://drive.google.com/file/d/1cXcU_2fz2Rf0si6RLjBUUewakupFlke0/view?usp=drive_link
- Control Experiment (Signal Chain Check):
- Configuration: Two GPS receivers.
- GPS1: Virtual, data (latitude, longitude, altitude) is directly copied from the telemetry of GPS2 and relayed back to the flight controller.
- Expectation: The system should work stably, as it essentially duplicates the real GPS signal.
- Result: Behavior is similar to the first two cases — stabilization does not work, the drone drifts away. This points to a problem not with the CV algorithms, but with the transmission chain or processing of the virtual GPS signal.
- log: https://drive.google.com/file/d/1Sx_dXluQZv40uLDdlEDIM9s9hogr_dvj/view?usp=drive_link
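One subtle point in this relay is that GPS_RAW_INT (what the companion reads from GPS2) and GPS_INPUT use different units for several fields. A sketch of the remapping I have in mind (units per the MAVLink common dialect; the pymavlink receive/send calls are omitted):

```python
import math

def raw_to_gps_input(raw):
    """Remap GPS_RAW_INT fields (from the real GPS2) to GPS_INPUT fields.
    GPS_RAW_INT: alt in mm, vel in cm/s ground speed, cog in centidegrees.
    GPS_INPUT:   alt in metres (float), vn/ve NED velocities in m/s."""
    speed = raw["vel"] / 100.0               # cm/s -> m/s
    cog = math.radians(raw["cog"] / 100.0)   # cdeg -> rad
    return {
        "fix_type": raw["fix_type"],
        "lat": raw["lat"],                   # degE7, same in both messages
        "lon": raw["lon"],
        "alt": raw["alt"] / 1000.0,          # mm -> m
        "vn": speed * math.cos(cog),         # decompose ground speed + course
        "ve": speed * math.sin(cog),
        "vd": 0.0,                           # GPS_RAW_INT has no vertical vel
        "ignore_flags": 16,                  # GPS_INPUT_IGNORE_FLAG_VEL_VERT
        "satellites_visible": raw["satellites_visible"],
    }
```

A unit mismatch here (e.g. forwarding the mm altitude as metres), or forwarding zero velocities without the matching ignore flags, would corrupt the EKF input in exactly this kind of relay even though the raw lat/lon values look correct.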
Conclusion from the experiments: The problem most likely lies not in the accuracy of the computer vision, but in the method of integrating the virtual GPS signal into the flight controller’s navigation loop. The control experiment (3) confirms this.
I would like to understand whether it is even possible in principle to solve this task using GPS_INPUT, or whether ArduPilot fundamentally cannot process GPS_INPUT the same way it does a real GPS sensor. Thank you for any advice!