Hi everyone,
I wanted to share some recent field test results from our “BO1 Quester” project. Our goal was to build a reliable inspection drone capable of operating in GPS-denied, unstructured environments (like dense forests) where standard visual-inertial odometry (VIO) often fails due to lighting changes or textureless surfaces.
We implemented a Lidar-Visual-Inertial fusion running on a Companion Computer, which handles the state estimation and sends position setpoints to the flight controller.
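For context on the companion-computer side, here is a minimal sketch of how a setpoint from our estimator maps onto a MAVLink SET_POSITION_TARGET_LOCAL_NED message. This is illustrative, not our production code: the bit values come from the MAVLink common dialect, and in a real stack you would hand these fields to something like pymavlink's `set_position_target_local_ned_send(...)` rather than build a dict.

```python
# Sketch of the setpoint side of the pipeline (stdlib only; in a real
# stack the fields below would be sent with pymavlink, e.g.
# master.mav.set_position_target_local_ned_send(...)).
from dataclasses import dataclass

# POSITION_TARGET_TYPEMASK bits from the MAVLink common message set:
# setting a bit tells the autopilot to IGNORE that field.
IGNORE_VX, IGNORE_VY, IGNORE_VZ = 8, 16, 32
IGNORE_AX, IGNORE_AY, IGNORE_AZ = 64, 128, 256
IGNORE_YAW, IGNORE_YAW_RATE = 1024, 2048

# Position-only control: keep x/y/z and yaw, ignore velocity,
# acceleration, and yaw rate.
POSITION_TYPEMASK = (IGNORE_VX | IGNORE_VY | IGNORE_VZ |
                     IGNORE_AX | IGNORE_AY | IGNORE_AZ |
                     IGNORE_YAW_RATE)

@dataclass
class LocalSetpoint:
    """Position setpoint in the local NED frame (metres / radians)."""
    north: float
    east: float
    down: float   # NED convention: positive down, so altitude is negative
    yaw: float

def to_message_fields(sp: LocalSetpoint, time_boot_ms: int) -> dict:
    """Fields for a SET_POSITION_TARGET_LOCAL_NED message."""
    return {
        "time_boot_ms": time_boot_ms,
        "coordinate_frame": 1,  # MAV_FRAME_LOCAL_NED
        "type_mask": POSITION_TYPEMASK,
        "x": sp.north, "y": sp.east, "z": sp.down,
        "yaw": sp.yaw,
    }
```

The flight controller then runs its own position controller against these targets, so the companion computer only has to close the slow outer loop.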
Here is how the system performs in four extreme scenarios. I’d love to hear your thoughts on this approach compared to native EKF3 optical-flow setups!
1. The “Stress Test”: Autonomous Flight in Dense Forest
Flying in a forest is tricky: thin branches are hard to detect, and light and shadow shift constantly under the canopy. We tuned the fusion to weight the Lidar point cloud more heavily whenever visual feature tracking becomes unstable.
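To make the “trust Lidar more” idea concrete, here is a toy sketch of that kind of adaptive weighting. The thresholds and the tracked-feature-count proxy are purely illustrative, not our actual tuning:

```python
def lidar_weight(num_visual_features: int,
                 min_features: int = 30,
                 good_features: int = 150) -> float:
    """Blend weight for the Lidar pose estimate, in [0.5, 1.0].

    Illustrative only: as the number of tracked visual features drops
    from `good_features` to `min_features`, trust shifts linearly from
    a balanced blend toward Lidar-only.
    """
    lidar_floor = 0.5  # never trust Lidar less than the visual stream
    if num_visual_features <= min_features:
        return 1.0            # visuals unusable: Lidar only
    if num_visual_features >= good_features:
        return lidar_floor    # visuals healthy: balanced blend
    span = good_features - min_features
    frac = (good_features - num_visual_features) / span
    return lidar_floor + (1.0 - lidar_floor) * frac

def fuse(lidar_pose, visual_pose, num_visual_features):
    """Weighted blend of two (x, y, z) position estimates."""
    w = lidar_weight(num_visual_features)
    return tuple(w * l + (1.0 - w) * v
                 for l, v in zip(lidar_pose, visual_pose))
```

A real tightly-coupled fusion adjusts per-measurement covariances inside the estimator rather than blending poses after the fact, but the failure-driven reweighting intuition is the same.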
2. Indoor-to-Outdoor Transition & Precision Docking
Transitioning from a structured indoor environment to an open outdoor space usually causes GPS glitches. Here, we demonstrate a seamless transition and a precision landing into an automatic charging dock.
3. Low-Light Operations (Night Inspection)
Visual SLAM struggles at night. We tested the system at 7:18 PM to verify whether the Lidar integration could compensate for the scarcity of visual features.
4. Full Workflow Demo
A complete look at an automated mission, showing the stability of the entire system from takeoff to task execution.
Discussion: We are currently running the fusion on a high-performance companion board. For those using ArduPilot’s EKF3, have you found success feeding 3D Lidar SLAM output directly into it? Or do you prefer keeping the heavy SLAM processing on a separate companion computer, as we do?
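For anyone who wants to try the companion-computer route with EKF3, the parameter set for accepting external navigation looks roughly like the fragment below. This is my understanding of the relevant knobs; please double-check the values against the ArduPilot docs for your firmware version:

```
VISO_TYPE        1   # accept vision/odometry over MAVLink
EK3_SRC1_POSXY   6   # ExternalNav
EK3_SRC1_VELXY   6   # ExternalNav
EK3_SRC1_POSZ    6   # ExternalNav
EK3_SRC1_YAW     6   # ExternalNav
```

For the indoor-to-outdoor case you could keep GPS on a second source set (EK3_SRC2_*) and switch between them in flight, which is part of what we wanted to avoid by handling the transition in the companion-side fusion instead.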