Navigating unfamiliar spaces while avoiding obstacles is of utmost importance for any autonomous robot. In many applications for ground and aerial robots, there is a growing need for accurate navigation and localization in areas where GPS is not available. One technology capable of providing accurate, robust GPS-denied localization and navigation is visual inertial odometry (VIO), which combines information from a camera and an inertial sensor to estimate where the drone is in relation to its environment.
In this proposal, I would like to contribute new functionality to the ArduPilot codebase to better utilize the Intel® RealSense™ Tracking Camera T265 for accurate localization and navigation, freeing up the companion computer's resources for other high-level tasks. I will also produce documentation with step-by-step hardware and software integration procedures for real-life experiments, so that anybody can follow along and even more amazing applications can be developed in the future.
2. New features for the codebase
Below is the list of new features that I wish to contribute to the ArduPilot codebase in this project.
A. Add/improve support for ArduPilot to incorporate VIO data for external navigation (ROS and non-ROS).
Most open-source VIO algorithms are implemented in ROS (Robot Operating System), a standard framework for robotics research in academia. However, ROS has quite a steep learning curve that must be climbed before any real tests can be done. In this project, a general framework will be implemented, in both ROS and non-ROS environments, that lets ArduPilot incorporate VIO data for precise localization and navigation.
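To sketch the non-ROS path: the companion computer streams the camera's pose to ArduPilot as MAVLink VISION_POSITION_ESTIMATE messages. The helper below is hypothetical glue code (not ArduPilot or librealsense code) that packs one pose sample into the fields that message expects; with pymavlink, the resulting dict could then be passed to `vision_position_estimate_send(...)`.

```python
# Sketch: pack a VIO pose sample into VISION_POSITION_ESTIMATE fields.
# Field names follow the MAVLink common message set; the helper itself
# is a hypothetical example, not part of any existing library.

def pack_vision_position_estimate(t_sec, x, y, z, roll, pitch, yaw):
    """Convert a pose sample (NED frame, metres / radians) to message fields.

    t_sec: sample timestamp in seconds; MAVLink expects microseconds.
    """
    return {
        "usec": int(t_sec * 1e6),                  # timestamp (us)
        "x": x, "y": y, "z": z,                    # position, NED frame (m)
        "roll": roll, "pitch": pitch, "yaw": yaw,  # attitude (rad)
    }

# With pymavlink (connection string is an assumption for your setup):
#   from pymavlink import mavutil
#   master = mavutil.mavlink_connection("udpout:127.0.0.1:14550")
#   fields = pack_vision_position_estimate(1.5, 0.2, 0.0, -1.0, 0, 0, 0)
#   master.mav.vision_position_estimate_send(**fields)
```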
Moreover, most of the current support in ArduPilot is implemented in EKF2. Adding support for EKF3 would be of great benefit for future development. The key idea for this part is to implement in EKF3 the same methods and interfaces currently available in EKF2, before adding new features to both EKFs (if it ain't broke, don't fix it).
B. Add support for different VIO camera orientation setups.
VIO algorithms can only achieve robust performance if the camera can see a surface with as many distinctive features as possible. For that reason, different operating scenarios will only permit certain camera orientations on the UAV frame. For example, a UAV flying indoors at low altitude will likely need a forward-facing camera, while flying outdoors at high altitude will only allow the camera to face downward.
With different mounting angles, the data from the VIO camera requires a transformation to align its frame with the vehicle's frame before the position data can be used by ArduPilot. A frame-alignment method and associated parametrization will be developed, thus simplifying the setup process and making the system more flexible.
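As an illustration of the frame-alignment idea, the sketch below rotates a position reported in the camera frame into the vehicle body frame for a camera pitched down about the body Y axis. The rotation convention and the 90-degree downward example are assumptions made for the sketch; the actual parametrization and sign conventions will be worked out during the project.

```python
import numpy as np

def rot_y(pitch_rad):
    """Rotation matrix about the body Y axis (pitch)."""
    c, s = np.cos(pitch_rad), np.sin(pitch_rad)
    return np.array([[c,   0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s,  0.0, c]])

def camera_to_body(p_cam, mount_pitch_rad):
    """Rotate a camera-frame position into the vehicle body frame.

    Assumes the camera is tilted by mount_pitch_rad about the body
    Y axis; the sign convention must match the actual mount.
    """
    return rot_y(mount_pitch_rad) @ np.asarray(p_cam, dtype=float)
```

With a mount pitch of zero the transform is the identity, so the same code path covers the default forward-facing case with no special handling.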
C. Hardware + software setup guide for a quadcopter platform.
A step-by-step guide will help everyone replicate and validate the code quickly. The setup guide will include:
- Hardware setup: quadcopter platform with ArduPilot + VIO tracking camera (Intel RealSense T265) + companion computer.
- Software setup: ROS and non-ROS, RealSense SDK (on the companion computer) + ArduPilot firmware.
- Parameter configuration: on the companion computer and ArduPilot, to send and receive VISION_POSITION_ESTIMATE messages and apply the camera pose transformation.
- Ground testing: how to monitor and plot data messages on the GCS to ensure that everything is running according to plan.
- Flight testing: explore the operational envelope in different modes (Loiter, Guided, Auto) and environments.
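As a preview of the parameter-configuration step, the snippet below shows the kind of settings used to switch ArduPilot to non-GPS navigation with EKF2. The values follow the ArduPilot non-GPS navigation wiki; treat them as a starting point and verify them against your firmware version.

```
# Use EKF2 and feed it external navigation data instead of GPS
param set AHRS_EKF_TYPE 2
param set EK2_GPS_TYPE 3     # never use GPS in EKF2
param set GPS_TYPE 0         # no GPS attached
param set COMPASS_USE 0      # yaw comes from the vision system
```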
D. Experiments and flight test results with quadcopter.
Besides validating stability in real flights (by visually observing the quad), data logs will also be analyzed to verify:
- EKF2 and EKF3 outputs, states, and status data.
- Fusion with complementary state estimates such as Optical Flow.
- VIO accuracy for z estimates relative to the RangeFinder.
- VIO operation under different lighting conditions and vibration levels.
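For the VIO-vs-RangeFinder comparison above, one simple analysis is to resample the two z-estimate logs onto a common time base and compute their RMS difference. The sketch below is a generic helper under that assumption (extracting the series from dataflash logs, e.g. with pymavlink's log reader, is not shown):

```python
import numpy as np

def rms_difference(t_ref, z_ref, t_other, z_other):
    """RMS difference between two altitude time series.

    Resamples the second series onto the first one's timestamps
    (linear interpolation), then returns the root-mean-square error.
    """
    z_other_resampled = np.interp(t_ref, t_other, z_other)
    diff = np.asarray(z_ref, dtype=float) - z_other_resampled
    return float(np.sqrt(np.mean(diff ** 2)))
```

A constant bias between the sensors shows up directly as the RMS value; plotting the residual over time instead would reveal drift, which matters more for long flights.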
E. Add documentation for simple setup and testing.
The documents and reports created along the way will be added to the wiki, specifically:
- Blog posts on https://discuss.ardupilot.org/: reporting progress as well as gathering feedback from the community to adjust and improve the work.
- A new wiki page, "VIO for Non-GPS Navigation" (similar to the OptiTrack page), which will include hardware setup, software setup, configuration, ground testing, and real flights.
3. Expecting things to go wrong
With a nice tool like the T265, it might be tempting to try out all sorts of things. However, just like any other piece of equipment, the T265 has its own limits, and we should learn what they are, anticipate when things can go wrong, and prevent failures from happening in the first place.
The T265’s underlying algorithm is susceptible to vibration (as is the case for most VIO/SLAM algorithms), which can lead to divergence or complete loss of position tracking. Providing sufficient damping for the camera is therefore crucial for robust and reliable performance. Additionally, adding wheel odometry or other sensors (optical flow, LiDAR, etc.) can help the T265 perform better.
At longer distances, the output scale has been reported to be off by 20-30% of the actual scale.
I am certainly not the only one interested in this kind of project. There have been numerous works in this area before and my proposal is made possible only thanks to their awesome contributions. Just to name a few:
For this project, I will have the privilege of working with @ppoirier and @rmackay9. Hopefully, by the end of this summer the RealSense T265 tracking camera will have become a sort of plug-and-play “sensor” that anybody can get up and running on their vehicle. All of the progress will be documented in the coming blog posts, so stay tuned! As usual, your input is welcome and I look forward to hearing from you all.