Integration of ArduPilot and VIO tracking camera (Part 3): Indoor autonomous flights and performance tests

Introduction

Following part 1 and part 2, we now have a quadcopter capable of precise indoor localization and navigation using position data provided by the Intel RealSense T265. In this last part of the ROS-based portion of my ongoing series of labs, we will take a look at how to use Mission Planner or a Python program to send waypoints to ArduPilot, perform autonomous flights, and verify the tracking performance of the T265 under some challenging scenarios.

This blog is divided into the following sections:

  1. Prerequisite
  2. Checking the safety perimeter
  3. Autonomous flights
    a. Mission Planner - Sending waypoints in Guided mode
    b. ROS - Sending waypoints in Guided mode
  4. Performance tests
  5. Conclusions

1. Prerequisite

You need to follow the steps outlined in the previous part 1 and part 2, and have a vehicle that:

  • receives position data from the Intel RealSense T265,
  • has the EKF home/origin set, and
  • can hover stably indoors in Loiter mode using that data.

2. Checking the safety perimeter

This is an important safety measure for indoor flight tests. What we need to establish is the safe area on the map within which the vehicle can operate, so that we don’t accidentally send it into a wall.
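Besides marking the area on the map, you can also monitor the vehicle’s local position from ROS and print a warning whenever it leaves the box. Below is a minimal sketch (not part of the original setup), assuming mavros is publishing the local position on /mavros/local_position/pose; the box dimensions are placeholders for your own room:

#!/usr/bin/env python
# Minimal sketch: warn when the vehicle leaves a user-defined safety
# box, based on the local position published by mavros.
import rospy
from geometry_msgs.msg import PoseStamped

# Hypothetical safety box, in meters, in the EKF local frame
# (origin = where the T265 was initialized). Adjust to your room.
X_MIN, X_MAX = -2.0, 2.0
Y_MIN, Y_MAX = -2.0, 2.0
Z_MAX = 2.0

def pose_cb(msg):
    p = msg.pose.position
    if not (X_MIN < p.x < X_MAX and Y_MIN < p.y < Y_MAX and p.z < Z_MAX):
        rospy.logwarn("Vehicle outside safety box: x=%.2f y=%.2f z=%.2f",
                      p.x, p.y, p.z)

if __name__ == "__main__":
    rospy.init_node("safety_box_monitor")
    rospy.Subscriber("/mavros/local_position/pose", PoseStamped, pose_cb)
    rospy.spin()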

3. Autonomous flights

Once a safe working boundary is established, autonomous tests can be carried out. If you are using Mission Planner, you don’t need to set one of the flight modes to “Guided” (it will switch modes for you), but that might not be the case for other ground control stations.
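For reference, if your GCS does not handle the mode switch for you, Guided mode can also be commanded from ROS. A minimal sketch using the standard mavros set_mode service (assuming mavros is already connected to the flight controller):

# Minimal sketch: switch ArduPilot to Guided mode via mavros.
import rospy
from mavros_msgs.srv import SetMode

rospy.init_node("set_guided")
rospy.wait_for_service("/mavros/set_mode")
set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)
resp = set_mode(custom_mode="GUIDED")
rospy.loginfo("Mode change sent: %s", resp.mode_sent)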

a) Mission Planner - Sending waypoints in Guided mode

  • Make sure position data feedback from T265 is ready, EKF home set, and you have a good sense of the safety boundary.
  • Manual flight test: take off in alt-hold or loiter. Make sure that the vehicle can hover stably.
  • In the Mission Planner map, right-click on a nearby spot and select “Fly to Here”. Select a safe altitude for guided mode so that you do not hit the ceiling.
  • The vehicle should fly to the target location and hover there until further commands or the flight mode changes.
  • On subsequent flights, you can also try issuing a takeoff command from Mission Planner directly.

b) ROS - Sending waypoints in Guided mode

We will use the scripts provided by @anbello in the aruco_gridboard repo. If you have cloned my vision_to_mavros repo, two example scripts are located in the scripts folder.

  • Make sure position data feedback from T265 is ready, EKF home set, and you have a good sense of the safety boundary.
  • Manual flight test: take off in alt-hold or loiter. Make sure that the vehicle can hover stably.
  • Before you run a script: if the quadcopter is on the ground, make sure to put it back at the origin (where the T265 was initialized); otherwise it will not take off vertically.
  • To run the scripts:

# On RPi, run either:

rosrun vision_to_mavros mavros_control1.py

# Or:

rosrun vision_to_mavros mavros_control2.py

  • The first script, mavros_control1.py, commands the quadcopter to take off to 1.2m and maintain that height while flying along a 0.4m square.
  • The second script, mavros_control2.py, commands the quadcopter to take off to 1m and fly a circle of 1m radius while locking its heading toward the center of the circle (a simplified sketch of such a script follows below).
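To give an idea of what these scripts do internally, here is a simplified sketch of a Guided-mode waypoint script. This is an illustration written for this post, not the exact contents of mavros_control1.py; it assumes mavros is running and connected, and uses the standard mavros arming, takeoff, and setpoint interfaces:

#!/usr/bin/env python
# Simplified illustration (not the actual mavros_control1.py): arm,
# take off, then stream position setpoints along a small square.
import rospy
from geometry_msgs.msg import PoseStamped
from mavros_msgs.srv import CommandBool, CommandTOL, SetMode

rospy.init_node("square_demo")
setpoint_pub = rospy.Publisher("/mavros/setpoint_position/local",
                               PoseStamped, queue_size=10)
rospy.wait_for_service("/mavros/set_mode")
set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)
arming = rospy.ServiceProxy("/mavros/cmd/arming", CommandBool)
takeoff = rospy.ServiceProxy("/mavros/cmd/takeoff", CommandTOL)

set_mode(custom_mode="GUIDED")
arming(True)
takeoff(altitude=1.2)   # climb to 1.2 m above the origin
rospy.sleep(10)         # crude wait for the climb to finish

# Corners of a 0.4 m square at 1.2 m altitude, in the local frame.
corners = [(0.0, 0.0), (0.4, 0.0), (0.4, 0.4), (0.0, 0.4), (0.0, 0.0)]
rate = rospy.Rate(10)
for x, y in corners:
    sp = PoseStamped()
    sp.pose.position.x, sp.pose.position.y, sp.pose.position.z = x, y, 1.2
    sp.pose.orientation.w = 1.0
    end_time = rospy.Time.now() + rospy.Duration(5)   # ~5 s per corner
    while rospy.Time.now() < end_time and not rospy.is_shutdown():
        sp.header.stamp = rospy.Time.now()
        setpoint_pub.publish(sp)
        rate.sleep()

Each setpoint is streamed repeatedly because Guided mode simply flies toward the most recent target; tune the durations and coordinates to stay within your own safety boundary.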

Now you can explore other options for autonomous flights. Here are some references to help you get started:

4. Performance tests

One of the key aspects of my 2019 GSoC project is to analyze the performance of the T265 under various conditions. First, let’s see the results of a good flight:

To test the limits of the T265, I have conducted several tests in an indoor environment:

  • High dynamic range of luminosity:

In the video, you can see that the tracking “confidence level” of the T265 was “high” for most of the flight and briefly changed to “medium” when the light switched between on and off. The T265 was able to maintain good tracking even when the lights were turned off, or turned on and off repeatedly. Although the room was not completely dark, I believe not many SLAM algorithms would have coped that well with such lighting conditions.
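For reference, this confidence level can be read directly from the camera. A minimal sketch using pyrealsense2 (assuming librealsense and its Python bindings are installed as in part 1; tracker_confidence maps 0–3 to Failed/Low/Medium/High):

# Minimal sketch: print the T265 tracker confidence with pyrealsense2.
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)
pipe.start(cfg)
try:
    while True:
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if pose:
            data = pose.get_pose_data()
            # 0 = Failed, 1 = Low, 2 = Medium, 3 = High
            print("tracker confidence: %d" % data.tracker_confidence)
finally:
    pipe.stop()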

  • Frame with heavy induced vibration:

My quadcopter frame did not have much vibration (luckily). To test this, vibration was intentionally added to the system by swapping one motor for one with a different KV than the others, and by wrapping duct tape around the tip of one propeller. Even then, there were no problems for most of the tests.

However, as can be seen in the video below, position tracking did diverge in some cases, suggesting that with the “right” kind of vibration, problems can occur. The EKF2 was fusing both vision feedback and optical flow. At the end of the video, when the position feedback “jumped” back to the correct position, the local position did not jump and remained approximately where it was.

  • Environment with a lack of distinctive features and repetitive patterns:

To simulate this scenario, the T265 was positioned facing downward. In this configuration, the scale of the T265 is often not as accurate as when facing forward (e.g. a 1m lateral movement translates to a 1.2m or 0.8m change in position feedback). Even though optical flow was also fused, the local position shown in Mission Planner closely followed the T265 feedback. Occasionally, the position also drifted after several rounds of movement.

It is safe to say that, as of this writing, care should be taken when using the T265 in a downfacing configuration, specifically with regard to scale, initial yaw angle, and drift.
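If you still need a downfacing setup, one workaround is to measure the scale error over a known distance and compensate before the pose reaches ArduPilot. The sketch below is only an illustration (it is not part of vision_to_mavros, and it ignores the camera-to-body frame rotation that the real node performs): it republishes the T265 odometry with a hypothetical measured scale factor, using the default realsense-ros and mavros topic names:

#!/usr/bin/env python
# Illustrative sketch: scale the T265 position by a measured factor
# before forwarding it to ArduPilot through mavros.
import rospy
from nav_msgs.msg import Odometry
from geometry_msgs.msg import PoseStamped

SCALE = 1.0 / 1.2   # hypothetical: feedback reads 1.2 m per 1.0 m moved

def odom_cb(msg):
    out = PoseStamped()
    out.header = msg.header
    out.pose = msg.pose.pose
    out.pose.position.x *= SCALE
    out.pose.position.y *= SCALE
    out.pose.position.z *= SCALE
    vision_pub.publish(out)

rospy.init_node("t265_scale_relay")
vision_pub = rospy.Publisher("/mavros/vision_pose/pose", PoseStamped,
                             queue_size=10)
rospy.Subscriber("/camera/odom/sample", Odometry, odom_cb)
rospy.spin()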

5. Conclusions

With this, we can now close the first section of the lab series, which focuses on the integration of ArduPilot and the Intel RealSense Tracking Camera T265. I hope these 3 blog posts will serve as a getting-started guide for anyone interested in using visual inertial odometry (VIO) technology to build a robot that can navigate in GPS-denied environments with a plug-and-play tracking camera.

  1. Part 1
  2. Part 2
  3. Part 3 (this blog)

Any feedback is welcome and I look forward to hearing from you all.


Thanks for your “steady delivery” of consistent and detailed Labs.
We can appreciate the progress and discover the features on a weekly basis.
With this 4th lab we conclude the “academic” part of the project with a good set of tools and techniques that will be valuable for the next phases.


Really great detail.

By the way, I removed what I think was a copy-paste issue at the very top of the blog post. I think the video is now appearing correctly, but please feel free to correct the blog if it’s not quite what you wanted.


Very nice!
About the 20cm delta with the rangefinder: isn’t that the ground-distance offset for the rangefinder?

Thank you @khancyr for your comment!
Indeed, the 20cm is the ground-distance offset for the rangefinder. The RealSense T265, on the other hand, uses its starting point as the origin.

@rmackay9 that was in fact a copy-paste error in my original post. Thanks for the edit!

Hi @LuckyBird

Which rangefinder are you using?

Thanks

Hi @Nilantha_Saluwadana, so sorry for the late reply. If you are still interested, I used the TeraRanger One on the copter.


@LuckyBird, does the drone avoid obstacles in the same setup you tested?

I believe you also asked the same question on this post? Integration of ArduPilot and VIO tracking camera (Part 2): Complete installation and indoor non-GPS flights
Keeping the discussion in one place would make it easier for others to follow.

Hello LuckyBird

Could the VIO tracking camera be used for path recording and a sort of teach-and-repeat for drones?

Thanks. Regards.

Andrea

Hi Andrea,

The VIO tracking camera itself does not provide those features. Here’s a detailed description of what it can and cannot do.
Of course, you can use it as a localization and raw-data sensor for software that provides what you are looking for, for example this Teach-Repeat-Replan work from HKUST.

Hope this helps.

Hi @LuckyBird I’m new to this and I’m currently working on flying a drone indoors autonomously. As of now, the Intel RealSense T265 looks like a good alternative and I’m planning to get one to start working on my project. I have a Pixhawk 4 flight controller and a Raspberry Pi 4 as my other pieces of hardware. Any advice? My goal is to fly the drone indoors autonomously to achieve inventory scanning inside a warehouse.

Hi @gauravshukla914, sorry for the late reply. I suggest starting with Randy’s detailed blog post, which covers almost exactly the same setup as yours: Easier setup for Intel RealSense T265

Thanks @LuckyBird. I’ll check that. But I could still use these Python scripts, right? I mean sending waypoints through Python?

Yes, the other parts should work fine. You might want to test the waypoint sending separately from the other parts to make sure each one works.

Hi @LuckyBird. Another question - I have currently installed the latest APSync image provided by Randy, i.e. Ubuntu 18.04.4 LTS. For the installation of librealsense, it’s redirecting me to the installation page for librealsense on Ubuntu 16.04. Will these installation steps work for Ubuntu 18.04 as well?

I believe the installation on Ubuntu 16.04 and 18.04 differs mostly in the dependencies. Anyway, following the official instructions is always recommended: https://github.com/IntelRealSense/librealsense/blob/master/doc/installation.md

@LuckyBird I believe the official instructions are not for an RPi running Ubuntu. Currently, I have an RPi 4 running Ubuntu 18.04.4 (the latest APSync image provided by Randy).

I see. In that case I can only say you need to try them out and see what actually works. And if you don’t mind, commenting on @rmackay9’s blog post with your experience, for others to follow up on, would be greatly appreciated.

Hello,

I’m using Python + DroneKit with GUIDED_NOGPS mode to control the FCU over MAVLink.
I now have the T265 and would like to add it for better stability. With the T265 installed, can I switch to GUIDED mode and tell ArduPilot to use VIO while I send it movement commands over DroneKit/MAVLink?

Thanks,
Raz