
Integration of ArduPilot and VIO tracking camera (Part 3): Indoor autonomous flights and performance tests

(Thien Nguyen) #1

Introduction

Following part 1 and part 2, we now have a quadcopter capable of precise indoor localization and navigation using position data provided by the Intel RealSense T265. In this last part of the ROS-based portion of my ongoing series of labs, we will look at how to use Mission Planner or a Python program to send waypoints to ArduPilot, perform autonomous flights, and verify the tracking performance of the T265 under some challenging scenarios.

This blog is divided into the following sections:

  1. Prerequisite
  2. Checking the safety perimeter
  3. Autonomous flights
    a. Mission Planner - Sending waypoints in Guided mode
    b. ROS - Sending waypoints in Guided mode
  4. Performance tests
  5. Conclusions

1. Prerequisite

You need to follow the steps outlined in the previous part 1 and part 2, and have a vehicle that can hold its position indoors using the T265 as its position source.

2. Checking the safety perimeter

This is an important safety measure for indoor flight tests. What we need to know is the safety area on the map that the vehicle can operate in, so that we don’t accidentally send it to the wall.
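One simple way to enforce such a boundary in a companion-computer script is a bounding-box check before any waypoint is sent. The sketch below is illustrative only; the limits are placeholder values you would replace with measurements of your own room, and the function name is an assumption, not part of any of the repos used in this series.

```python
# Hypothetical helper: reject Guided-mode targets outside a safety box.
# The boundary values below are placeholders - measure your own room.

def within_safety_box(x, y, z,
                      x_range=(-1.5, 1.5),   # metres, local frame
                      y_range=(-1.5, 1.5),
                      z_range=(0.3, 2.0)):   # stay off the floor and ceiling
    """Return True if the target (x, y, z) lies inside the safe volume."""
    return (x_range[0] <= x <= x_range[1] and
            y_range[0] <= y <= y_range[1] and
            z_range[0] <= z <= z_range[1])
```

A script could call this on every candidate setpoint and simply refuse to publish targets that fall outside the box.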

3. Autonomous flights

Once a safety working boundary is established, autonomous tests can be carried out. If you are using Mission Planner, you don’t need to map one of the flight modes to “Guided” beforehand (it switches the vehicle to Guided automatically when you send a command), but that might not be the case for other GCSs.

a) Mission Planner - Sending waypoints in Guided mode

  • Make sure position data feedback from T265 is ready, EKF home set, and you have a good sense of the safety boundary.
  • Manual flight test: take off in alt-hold or loiter. Make sure that the vehicle can hover stably.
  • In the Mission Planner map, right-click on a nearby spot and select “Fly to Here”. Select a safe altitude for guided mode so that you do not hit the ceiling.
  • The vehicle should fly to the target location and hover there until further commands or the flight mode changes.
  • On subsequent flights, you can also try issuing a takeoff command directly from Mission Planner.

b) ROS - Sending waypoints in Guided mode

We will use the scripts provided by @anbello in the aruco_gridboard repo. If you have cloned my repo vision_to_mavros, two of the examples are located in the scripts folder.

  • Make sure position data feedback from T265 is ready, EKF home set, and you have a good sense of the safety boundary.
  • Manual flight test: take off in alt-hold or loiter. Make sure that the vehicle can hover stably.
  • Before you run the script: if the quadcopter is on the ground, make sure to put it back at the origin (its position when the T265 started), otherwise it will not take off vertically.
  • To run the scripts:

# On RPi, run either:

rosrun vision_to_mavros mavros_control1.py

# Or:

rosrun vision_to_mavros mavros_control2.py

  • The first script, mavros_control1.py, will command the quadcopter to take off to 1.2m and maintain that height while moving along a 0.4m square.
  • The second script, mavros_control2.py, will command the quadcopter to take off to 1m and move around a circle of 1m radius, while locking the heading to look at the center of the circle.
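The heading-lock geometry used by the second script can be sketched in plain Python. The function below is illustrative (its name and parameters are my own, not the actual API of mavros_control2.py); in a real flight each resulting tuple would be published as a MAVROS setpoint.

```python
import math

def circle_setpoints(radius=1.0, num_points=12, altitude=1.0,
                     center=(0.0, 0.0)):
    """Generate (x, y, z, yaw) setpoints around a circle, with yaw
    always pointing at the circle's center."""
    setpoints = []
    for i in range(num_points):
        theta = 2.0 * math.pi * i / num_points
        x = center[0] + radius * math.cos(theta)
        y = center[1] + radius * math.sin(theta)
        yaw = math.atan2(center[1] - y, center[0] - x)  # face the center
        setpoints.append((x, y, altitude, yaw))
    return setpoints
```

For example, the first setpoint sits at (1, 0) with yaw pointing back along -x toward the origin, which is exactly the “look at the center” behavior described above.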

Now you can explore other options for autonomous flights. Here are some references to help you get started:

4. Performance tests

One of the key aspects of my 2019 GSoC project is to analyze the performance of the T265 under various conditions. First, let’s see the results of a good flight:

To test the limits of the T265, I have conducted several tests in an indoor environment:

  • High dynamic range of luminosity:

In the video, you can see that the tracking “confidence level” of the T265 was “high” for most of the flight and briefly changed to “medium” when the lights switched between on and off. The T265 was able to maintain good tracking even when the lights were turned off, or turned off and on repeatedly. Although the room was not completely dark, I believe not many SLAM algorithms would have coped that well with such lighting conditions.

  • Frame with heavy induced vibration:

My quadcopter frame did not have much vibration (luckily). To test this, vibration was intentionally added to the system by swapping one motor for one with a different KV than the others, and wrapping duct tape around the tips of one propeller. Even then, there were no problems for most of the tests.

However, as can be seen in the video below, position tracking did diverge in some cases, suggesting that with the “right” kind of vibration, problems can occur. The EKF2 was fusing both vision feedback and optical flow. At the end of the video, when the position feedback “jumped” back to the correct position, the local position did not jump and remained approximately where it was.

  • Environment with a lack of distinctive features and repetitive patterns:

To simulate this scenario, the T265 was positioned facing downward. In this configuration, the scale of the T265’s output is often not as accurate as when facing forward (i.e. a 1m lateral movement translates to a 1.2m or 0.8m difference in position feedback). Even though optical flow was also fused, the local position shown on Mission Planner was similar to the T265 feedback. Occasionally, the position also drifted after a few rounds of movement.

It is safe to say that, as of this writing, care should be taken when using the T265 in a downfacing configuration, specifically with regard to scale, initial yaw angle, and drift.
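To make the scale issue concrete: if you command a known displacement and compare it with the camera’s report, you can estimate a correction factor. The sketch below is a simplified illustration (a single uniform scale applied to all axes); it is not part of vision_to_mavros, and the function names are my own.

```python
# Toy illustration of correcting a uniform scale error in position feedback.

def estimate_scale(true_distance, reported_distance):
    """Estimate a correction factor from a calibration move of known length,
    e.g. sliding the vehicle exactly 1m along a tape measure."""
    return true_distance / reported_distance

def correct_position(pos, scale):
    """Apply the scale correction to a reported (x, y, z) position."""
    return tuple(scale * p for p in pos)
```

For instance, if a true 1.0m move is reported as 1.2m, the factor is 1/1.2, and applying it maps a reported (1.2, 0, 0.6) back to (1.0, 0, 0.5). A real setup would also have to account for the initial yaw offset and for drift, which a single constant cannot fix.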

5. Conclusions

With this, we can now close the first section of the lab series, which focuses on the integration of ArduPilot and the Intel RealSense Tracking Camera T265. I hope that these three blog posts will serve as a getting-started guide for anyone interested in using visual inertial odometry (VIO) technology to build a robot that can navigate in GPS-denied environments, using a plug-and-play tracking camera.

  1. Part 1
  2. Part 2
  3. Part 3 (this blog)

Any feedback is welcome and I look forward to hearing from you all.

(ppoirier) #2

Thanks for your ‘‘steady delivery’’ of consistent and detailed labs.
We can appreciate the progress and discover the features on a weekly basis.
With this 4th lab we conclude the ‘‘academic’’ part of the project, with a good set of tools and techniques that will be valuable for the next phases.

(rmackay9) #3

Really great detail.

By the way, I removed what I think was a copy-paste issue at the very top of the blog post. I think the video is appearing now correctly but please feel free to correct the blog if it’s not quite what you wanted.

(Khancyr) #4

Very nice!
About the 20cm delta with the rangefinder, isn’t that the rangefinder’s ground-distance offset?

(Thien Nguyen) #5

Thank you @khancyr for your comment!
Indeed, the 20cm is the ground-distance offset for the rangefinder. The RealSense T265, on the other hand, uses its starting point as the origin.

@rmackay9 that was in fact a copy-paste error in my original post. Thanks for the edit!