
Integration of ArduPilot and VIO tracking camera (Part 2): Complete installation and indoor non-GPS flights

gsoc
gsoc19
t265
(Thien Nguyen) #1

Introduction

Continuing from part 1 and as part of the ongoing labs, this time we will:

  • Discuss how the system works, and which extra steps are required and why.
  • Finish installing all necessary packages to use the Intel Realsense T265 with ArduPilot in ROS.
  • Verify that everything is working with a ground test, followed by an actual flight test.

This blog is divided into the following sections:

  1. Prerequisites
  2. System overview
  3. ArduPilot parameters
  4. ROS packages installation
  5. Workflow process
  6. Ground test and flight test
  7. Conclusions and next steps

Let’s get started.

1. Prerequisites

Assuming you have followed the steps outlined in the first blog, your vehicle should have:

The steps below should work out of the box with a T265 camera facing forward. Other camera orientations can be used with appropriate modifications to the launch parameters. See section 4 for more details.

2. System overview

In this section, I will attempt to explain the overall system and the parts that are most subject to change, from a technical standpoint. This is not by any means an exhaustive explanation of how everything works, so if you are keen to learn more, additional references can be found at the end of this blog.

Let’s break the system down into more manageable units, from hardware to software:

  • Physical connection:

  • Data flow: The green boxes are ROS nodes.

Using mavros, there are a number of topics that can be used to transfer external navigation data to ArduPilot, of which the following have been proven to work in experiments:

Name                       Main difference
VISION_POSITION_ESTIMATE   Accepts Euler angles (roll, pitch, yaw)
ATT_POS_MOCAP              Accepts quaternions

We will convert the position data contained within /tf and send it to mavros via the topic /mavros/vision_pose/pose. Mavros will take care of the ENU to NED frame transformation and send the data to ArduPilot through MAVLink. A rough sketch of this conversion is shown below.
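To make the idea concrete, here is a minimal sketch (an illustration, not the actual vision_to_mavros source) of a node that looks up the transform published by realsense-ros and republishes it as a PoseStamped for mavros. The frame names camera_odom_frame and camera_pose_frame are the realsense-ros defaults for the T265; adjust them if your launch file differs.

#!/usr/bin/env python
# Sketch: republish the T265 transform from /tf as a PoseStamped on
# /mavros/vision_pose/pose, rate-limited to 30Hz.
import rospy
import tf
from geometry_msgs.msg import PoseStamped

rospy.init_node('tf_to_mavros_sketch')
listener = tf.TransformListener()
pub = rospy.Publisher('/mavros/vision_pose/pose', PoseStamped, queue_size=10)

rate = rospy.Rate(30)  # the FCU does not need the full 200Hz stream
while not rospy.is_shutdown():
    try:
        (trans, rot) = listener.lookupTransform(
            'camera_odom_frame', 'camera_pose_frame', rospy.Time(0))
    except (tf.LookupException, tf.ConnectivityException,
            tf.ExtrapolationException):
        rate.sleep()
        continue
    msg = PoseStamped()
    msg.header.stamp = rospy.Time.now()
    msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = trans
    (msg.pose.orientation.x, msg.pose.orientation.y,
     msg.pose.orientation.z, msg.pose.orientation.w) = rot
    pub.publish(msg)
    rate.sleep()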

Our approach is made possible thanks to the awesome contributions of community members. Just to name a few:

Experiment with Visual Odometry - ROVIO part 1 and part 2 by @ppoirier

Indoor flight with external navigation data by @chobitsfan

Successfully using motion capture system as indoor GPS by @chobitsfan

Indoor autonomous flight with Arducopter, ROS and Aruco Boards Detection by @anbello

  • Reference frames: The body {B} and world {W} frames used by librealsense, ROS and ArduPilot are all different. realsense-ros provides position feedback in the ENU frame; however, different camera orientations and frame alignments need to be taken into account before sending pose data to mavros, which is done by vision_to_mavros. More on this in the next part.


From left to right: T265 frame, ENU frame (ROS side), NED frame (FCU side). Whatever the orientation of the T265, we need the feedback to be in ENU before sending it to mavros.

  • vision_to_mavros package: this package has 3 main functions:
    • Convert ROS topics between realsense-ros and mavros: /tf is the transformation data which lets the user keep track of multiple coordinate frames over time. Many localization packages provide this type of data out of the box. /tf will be converted to /mavros/vision_pose/pose, which is now supported by ArduPilot.
    • Limit the rate of pose data: a 200Hz data stream seems too fast for the FCU to handle, and we don’t need the full 200Hz of data anyway. 10-15Hz localization data is the bare minimum required for testing, and 30Hz should be more than enough for most cases (relative to how fast the vehicle is moving). You can change the output rate with the output_rate param.
    • Frame alignment: The position feedback from realsense-ros is already in ROS convention (ENU). However, by default the T265 body’s X-axis will initially be aligned with the world’s X-axis, which means the FCU will interpret the heading as facing “East”, i.e. yaw will be off by 90 degrees (left image, below). This package will rotate the body frame by 90 degrees around the world’s Z-axis so that it conforms to ROS’ ENU convention (right image, below). A rough sketch of this step follows this list.
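For illustration, here is a minimal standalone sketch of the alignment step, assuming the whole pose is rotated about the world’s Z-axis by an angle gamma_world (the parameter name comes from the package; this function is a simplified stand-in, not the actual source):

import math
import tf.transformations as tft

def align_to_enu(position, quaternion, gamma_world=math.pi / 2):
    # Rotate a pose about the world Z-axis by gamma_world radians.
    # The sign and value of gamma_world depend on your camera orientation.
    x, y, z = position
    cg, sg = math.cos(gamma_world), math.sin(gamma_world)
    position_enu = (cg * x - sg * y, sg * x + cg * y, z)
    # A world-frame rotation is applied on the left of the body quaternion
    q_rot = tft.quaternion_about_axis(gamma_world, (0, 0, 1))
    quaternion_enu = tft.quaternion_multiply(q_rot, quaternion)
    return position_enu, quaternion_enu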

Note 1: there are hidden APIs that can make things much easier
According to this thread, there is a way to handle frame alignment within librealsense that allows users to define the forward direction in the T265’s body coordinate system and then define which direction in the world coordinates it should initially be lined up with. However, this functionality is not yet exposed for use outside of Intel. Once it becomes available, things can be simplified a great deal.

Note 2: you can do everything within realsense-ros if you want
You can also do everything (publish correct topics, align frames) by modifying the data and topic within realsense-ros, specificly by changing this file: https://github.com/IntelRealSense/realsense-ros/blob/2b3365f5c96fcafa53412881e8e4484a6648bc23/realsense2_camera/src/base_realsense_node.cpp#L1276

Note 3: you can do everything without ROS if you want
The green boxes in the data flowchart are ROS nodes. If you want to do away with ROS, write your own application that grabs pose data from librealsense’s API and sends it to ArduPilot, using your favorite language / framework (here’s the full list of supported wrappers). A rough Python sketch is shown below.
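As a starting point, here is a minimal sketch using the pyrealsense2 and pymavlink wrappers. The serial device, baud rate and, in particular, the T265-to-NED axis mapping are assumptions for a forward-facing camera and must be verified for your own mounting before any flight:

import math
import pyrealsense2 as rs
from pymavlink import mavutil

# Connect to the FCU (connection string is an assumption)
master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=921600)
master.wait_heartbeat()

# Start the T265 pose stream
pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)
pipe.start(cfg)

try:
    while True:
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if not pose:
            continue
        d = pose.get_pose_data()
        # T265 frame: x right, y up, z backward. One common NED mapping
        # for a forward-facing camera (verify for your setup):
        x, y, z = -d.translation.z, d.translation.x, -d.translation.y
        qw = d.rotation.w
        qx, qy, qz = -d.rotation.z, d.rotation.x, -d.rotation.y
        # Quaternion to Euler angles (aerospace ZYX convention)
        roll = math.atan2(2*(qw*qx + qy*qz), 1 - 2*(qx*qx + qy*qy))
        pitch = math.asin(max(-1.0, min(1.0, 2*(qw*qy - qz*qx))))
        yaw = math.atan2(2*(qw*qz + qx*qy), 1 - 2*(qy*qy + qz*qz))
        master.mav.vision_position_estimate_send(
            int(pose.get_timestamp() * 1000),  # ms -> us
            x, y, z, roll, pitch, yaw)
finally:
    pipe.stop()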

That’s all for the explanation. Let’s make it happen.

3. ArduPilot parameters

  • FW version: tested with ArduCopter-stable 3.6.9. The latest stable version can be downloaded from Mission Planner or here.

Note on FMU version (extracted from this comment by @rmackay9): The fmuv3 firmware is the one to use for the Pixhawk1 and Pixhawk2/Cube boards. The only real difference between “fmuv2” and “fmuv3” is that “fmuv2” has a few fewer features so that it can fit onto the older Pixhawks with the 1MB limit. The ChibiOS builds are all below 1MB, so even the fully featured firmware now fits on the older Pixhawks.

  • Which EKF is supported? I have tested with EKF2, and the following params are for EKF2. As of this writing, EKF3 does not handle external navigation data until this PR is merged.

  • Params:


# For EKF2
AHRS_EKF_TYPE = 2
EK2_ENABLE = 1
EK3_ENABLE = 0
EK2_GPS_TYPE = 3
EK2_POSNE_M_NSE = 0.1
EK2_VELD_M_NSE = 0.1
EK2_VELNE_M_NSE = 0.1

# Similar for both EKFs
BRD_RTC_TYPES = 2
GPS_TYPE = 0
COMPASS_USE = 0
COMPASS_USE2 = 0
COMPASS_USE3 = 0
SERIAL5_BAUD = 921 (the serial port used to connect to the Raspberry Pi)
SERIAL5_PROTOCOL = 1
SYSID_MYGCS = 1 (to accept control from mavros)
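If you prefer to set these parameters from the companion computer rather than a GCS, a minimal pymavlink sketch might look like the following (the connection string is an assumption; adjust it for your own wiring):

from pymavlink import mavutil

master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=921600)
master.wait_heartbeat()

params = {
    'AHRS_EKF_TYPE': 2, 'EK2_ENABLE': 1, 'EK3_ENABLE': 0,
    'EK2_GPS_TYPE': 3, 'EK2_POSNE_M_NSE': 0.1, 'EK2_VELD_M_NSE': 0.1,
    'EK2_VELNE_M_NSE': 0.1, 'BRD_RTC_TYPES': 2, 'GPS_TYPE': 0,
    'COMPASS_USE': 0, 'COMPASS_USE2': 0, 'COMPASS_USE3': 0,
    'SERIAL5_BAUD': 921, 'SERIAL5_PROTOCOL': 1, 'SYSID_MYGCS': 1,
}
for name, value in params.items():
    master.mav.param_set_send(
        master.target_system, master.target_component,
        name.encode('utf-8'), float(value),
        mavutil.mavlink.MAV_PARAM_TYPE_REAL32)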

Note for developers (many thanks to @anbello for pointing these out):

4. ROS packages installation

Installation:


# Navigate to catkin workspace
cd ~/catkin_ws/src

# Clone and build the repo
git clone https://github.com/hoangthien94/vision_to_mavros.git
cd ..
catkin_make

# Add the source command to .bashrc, if you have not done so
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc

  • Modify the parameters of vision_to_mavros for your T265’s orientation:
    The default parameters are set for a front-facing camera. Simply modify the launch file t265_tf_to_mavros.launch: change the params roll_cam, pitch_cam, yaw_cam and gamma_world to values corresponding to your setup:

    • whether the T265 is facing front or down
    • which side of the vehicle the T265’s USB port points to

Important note for down-facing configurations: there is an ongoing issue about inconsistent yaw angle when the T265 is facing down, and some workarounds have been discussed. As of this writing, here is what seems to work: the camera needs to be slightly tilted (i.e. not completely flat) when it starts streaming poses (when launching realsense-ros or calling librealsense’s API to invoke pose data); otherwise the yaw angle of the world coordinates might be randomly initialized, as shown below:
Left: T265 facing down, completely flat at launch - inconsistent yaw angle between runs.
Right: T265 facing down, slightly tilted at launch - consistent yaw angle.

Verify that each ROS node is working:

There are 3 nodes running in this setup. Launch them in 3 separate terminals on the RPi:

  • T265 node: roslaunch realsense2_camera rs_t265.launch
    • The topics /camera/odom/sample and /tf should be published at 200Hz.
  • MAVROS node: roslaunch mavros apm.launch (with fcu_url and other parameters in apm.launch modified accordingly).
    • rostopic echo /mavros/state should show that the FCU is connected.
    • rostopic echo /mavros/vision_pose/pose should show nothing yet, since this topic is not published at this point.
  • vision_to_mavros node: roslaunch vision_to_mavros t265_tf_to_mavros.launch
    • rostopic echo /mavros/vision_pose/pose should now show pose data from the T265.
    • rostopic hz /mavros/vision_pose/pose should show that the topic is being published at 30Hz.

Verify that ArduPilot is receiving data from MAVROS

Check that ArduPilot is receiving position data by viewing the MAVLink message VISION_POSITION_ESTIMATE on your GCS. For Mission Planner, press Ctrl+F and click on “Mavlink Inspector”; you should be able to see data coming in.

Note on viewing the VISION_POSITION_ESTIMATE message on QGC: check this thread if you have problems receiving this message on QGC.

Launching all nodes at once

Once all nodes can run successfully, next time you can launch all of them at once with roslaunch vision_to_mavros t265_all_nodes.launch, which will launch:

  • rs_t265.launch as originally provided by realsense-ros, no modification.
  • apm.launch modified with your own configuration.
  • t265_tf_to_mavros.launch with parameters according to your T265’s orientation.

5. Workflow process

With all the ROS packages working and ArduPilot parameters configured, we can finally start testing and flying. However, quite a number of steps need to be carried out in a specific order, enough so that I think a dedicated section is needed to clarify all the details involved.

To better understand the workflow, below is a diagram showing what a successful run might look like. This workflow can also apply to other external navigation systems (mocap, vicon, ArUco tags, AprilTags etc.), with the main difference being the green box. Specifically, for a different system you need to change:

  • ROS wrapper for sensors/devices.
  • Localization package.
  • Data bridge and frame alignment between the localization package and MAVROS.

For the Realsense T265, the ROS driver is realsense-ros, the localization package runs onboard the device itself, and vision_to_mavros bridges the topics between realsense-ros and mavros, lowers the data frequency and aligns the frames.

Note: this workflow is gathered from my reading on the forum and verified by experiments. Obviously, I might be wrong / missing some important details. Any feedback would be greatly appreciated!

Let’s start at the beginning, after you connect the battery:

The next section will discuss the actual operation (ground test, flight test etc.) of the vehicle, i.e. the yellow block in the flowchart.

6. Ground test and flight test

Follow the last section until you see the quadcopter icon appear on the map, then zoom in to maximum.

Ground test:

  • Hold the vehicle up, move it around and observe its trajectory on Mission Planner.
  • Move back and forth, left and right, in a square or any pattern you like. The trajectory shown on the map should reflect the vehicle’s movement without too much distortion or overshoot.

First flight test:

For the first few flight tests, I think it’s best to do an overall check:

  • Confirm that position feedback is running ok before switching to Loiter mode.
  • Verify tracking performance (scale, oscillation etc.).
  • Look out for the working boundary of your environment, i.e. where tracking might be lost due to a lack of features or fast translational/rotational movement.

Here is what I would suggest:

  • Take off in Stabilize or Alt-Hold and check that the vehicle is stable.
  • Move the vehicle around and observe the position on Mission Planner as well as rviz to see if tracking is stable.
  • Switch to Loiter, but always ready to switch back to Stabilize/Alt-Hold if anything goes awry.
  • Otherwise, the vehicle should hover stably and be able to keep its position.
  • Move the vehicle back and forth 2-3 meters, verify the scale (easier to view on rviz).
  • Move the vehicle around (translate, rotate) at varying speed, always ready to switch back to Stabilize/Alt-Hold.

Flight tests:

Once you feel confident, you can take off directly in Loiter.

7. Conclusions and next steps

In this blog we have taken a closer look at the overall system, installed the necessary packages, and configured ArduPilot and ROS params to use ArduPilot with the Intel Realsense T265 tracking camera. Ground tests and flight tests can be carried out once the user has become familiar with the working procedure.

Next time, in lab 4, we will explore some simple autonomous experiments. Since we are using ROS, numerous examples are available, but we will start with these scripts, many thanks to @anbello for this awesome work. Then in lab 5 we will discard the ROS framework and use the Python wrapper instead to capture data from the T265 and communicate with ArduPilot directly. You can refer to my first blog for an updated list of all the labs once they are out.

Hope this helps. Happy flying!

References

ROS frame convention: http://www.ros.org/reps/rep-0105.html

ENU-NED coordinate: https://dev.px4.io/en/ros/external_position_estimation.html#setup_specific_systems

Some informative papers:

(ppoirier) #2

Thanks again for this step-by-step and very detailed blog on how to implement the T265 with ROS on ArduPilot.

(Winston AZ) #3

Great, impressive work, congratulations and success.

(Andrea Belloni) #4

Congratulations @LuckyBird, the post is rigorous and easy to follow at the same time.

I see that you use ArduCopter-stable 3.6.9, so you can’t use the EK2_EXTNAV_DELAY parameter.
The relevant PR was merged into 3.7-dev.
Don’t you see problems related to the delay between the time at which you get the pose and the time at which the pose is fused in the EKF?

Another question: from what I know, EKF3 doesn’t handle external navigation messages; there is a PR for this that is waiting to be merged. Are you using a personal fork to test EKF3, or is this on a to-do list?

Cheers
Andrea

(Thien Nguyen) #5

Thank you @anbello for your reply.

  • I am working on the latest stable release so that anyone with the bare minimum hardware can immediately follow. The EK2_EXTNAV_DELAY parameter is definitely on my to-do list.
  • I do receive the messages “TM: RTT too high for timesync” and “TM: Wrong FCU time”. Correct me if I am wrong, but my understanding is that this behavior is to be expected and shouldn’t stop the EKF from working.
  • Regarding EKF3: it seems I have made a mistake. EKF3 was not using external navigation data, but if FLOW_ENABLE is 1 the quad icon still appears on the map when I click “set EKF home”. My flight test with EKF3 (not in the video) was probably with just optical flow. I will update the blog. Thank you for pointing out my mistake!
(Thien Nguyen) #6

@anbello I have tried with the 3.7-dev firmware and EK2_EXTNAV_DELAY = 10ms (default). The flights seem even more stable with this setting. Here are a few plots:

  • Position x, y, z:

  • Roll and pitch:
