Using MAVCONN library for sending non-GPS Navigation Messages in C/C++

The MAVCONN library (libmavconn) is the MAVLink connection and communication library used in MAVROS; it can also be used outside the ROS environment.

I used it in two projects related to non-GPS navigation systems based on visual position estimation, and I would like to share my experience with the ArduPilot community.

The first project (aruco_mavconn) is a port to (non-ROS) C/C++ of the positioning system based on ArUco boards described in this wiki page (Indoor autonomous flight with ArduCopter, ROS and Aruco Boards Detection — Dev documentation).

The intent was to learn how to use libmavconn, and a bit more OpenCV, while obtaining a lighter application to run on a Single Board Computer.

The second project (t265_mavconn) is a port to C/C++ of the Python script written by Thien Nguyen (@LuckyBird) that sends VISION_POSITION_ESTIMATE messages from the T265 VIO device to the Flight Controller after transforming the reference frame to NED. The work done by Thien Nguyen is documented in this wiki page (Intel RealSense T265 — Copter documentation).
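
For context, the core of that reference frame transformation is a fixed axis remapping. Here is a minimal sketch for a forward-facing camera, assuming the usual T265 convention (x right, y up, z backward); the function is illustrative, not the actual code of the script:

// T265 pose frame (forward-facing camera): x right, y up, z backward.
// NED frame: x north (forward), y east (right), z down.
struct Vec3 { double x, y, z; };

Vec3 t265_to_ned(const Vec3 &p)
{
    return { -p.z,   // north = camera forward = -z
              p.x,   // east  = camera right   =  x
             -p.y }; // down  = camera down    = -y
}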

Besides porting these two systems to (non-ROS) C/C++, I added the possibility to choose between two ways of sending the visual position estimation messages:

  1. using EKF2 + sending VISION_POSITION_ESTIMATE message;
  2. using EKF3 + sending GPS_INPUT message.

The GPS_INPUT message permits sending velocity information in addition to position and attitude. In adding this feature I used as a reference the Vicon MAVProxy module written by @tridge for the Vicon Indoor Positioning System documented here. From this module I took the parts related to time and coordinate transformation (from local NED to GPS), sketched below.
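
To give an idea of that coordinate transformation, here is a minimal sketch of turning a local NED position into latitude/longitude for a GPS_INPUT message using a flat-earth approximation; the origin values and the function name are illustrative, not taken from the Vicon module:

#include <cmath>

// Flat-earth approximation: NED offsets in metres from a fixed
// origin become degrees of latitude/longitude. Accurate enough
// for the small areas involved in indoor flight.
static constexpr double ORIGIN_LAT = 45.0;        // hypothetical origin
static constexpr double ORIGIN_LON = 9.0;         // hypothetical origin
static constexpr double EARTH_RADIUS = 6378137.0; // WGS84 equatorial, metres

void ned_to_gps(double north, double east, double &lat_deg, double &lon_deg)
{
    const double d2r = M_PI / 180.0;
    lat_deg = ORIGIN_LAT + (north / EARTH_RADIUS) / d2r;
    lon_deg = ORIGIN_LON + (east / (EARTH_RADIUS * std::cos(ORIGIN_LAT * d2r))) / d2r;
}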

Both projects have been tested on Copters with a Raspberry Pi 3 (with Ubuntu 16.04) used as Companion Computer. aruco_mavconn has also been tested on a desktop PC with Ubuntu 18.04, using a small (3 inch) Copter with a Raspberry Pi Zero W connected via WiFi to the PC. The configuration of this last system is explained in this post: Indoor autonomous flight with Arducopter, ROS and Aruco Boards Detection.

For the connection of the Companion Computer to the Flight Controller and the related settings, see the relevant information on the wiki pages:

I have to say that I am not so good at C++ object-oriented programming, so the programs consist mainly of one source file with all the functions and logic in there. I hope to find the time to refactor the sources into a better form.

aruco_mavconn

aruco_mavconn depends on OpenCV and MAVCONN.

To build OpenCV from source, follow the instructions at:

https://docs.opencv.org/master/d7/d9f/tutorial_linux_install.html

To build MAVCONN, follow the instructions in the relevant paragraph of this post.

After installing the prerequisites, it is possible to build the project in this way:

$ git clone https://github.com/anbello/aruco_mavconn
$ cd aruco_mavconn
$ mkdir build
$ cd build
$ cmake ..
$ make
$ cp ../data/* .

To start the program:

$ detect_board_mavconn --lp=../data/launch.yml

Here is an example of a launch file (lines beginning with '#' are ignored):

%YAML:1.0
dictionary: 16
camera_param: "calib-raspicam2.3.yml"
#video_pipeline: "udpsrc port=9000 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=JPEG,a-framerate=30,payload=26 ! rtpjpegdepay ! decodebin ! videoconvert ! appsink"
video_pipeline: ""
camera_id: 0
offset_x: 0.0
offset_y: 0.05
offset_z: 0.0
board_layout: "layout-my.yml"
detector_param: "detector_params.yml"
refined_strategy: 0
show_rejected: 0
mavconn_url: "tcp://192.168.10.16:2000"
#mavconn_url: "/dev/ttyACM0:115200"
vision_gps_msg: 2 # 0 -> no msg | 1 -> vision msg | 2 -> gps msg
pose_msg_rate: 15
show_image: 1

The video source can come from a GStreamer pipeline (if the video_pipeline string is not null) or from a v4l2 device.

In the data directory there are some examples of camera calibration, board layout, and detector parameter files.
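
The %YAML:1.0 header suggests the launch file is read with OpenCV's cv::FileStorage; here is a minimal sketch of reading a few of these keys that way (a sketch of the idea, not the actual parsing code of the program):

#include <opencv2/core.hpp>
#include <string>

int main()
{
    cv::FileStorage fs("launch.yml", cv::FileStorage::READ);
    if (!fs.isOpened())
        return 1;

    int dictionary, vision_gps_msg, pose_msg_rate;
    std::string camera_param, mavconn_url;

    fs["dictionary"]     >> dictionary;      // ArUco dictionary id
    fs["camera_param"]   >> camera_param;    // calibration file name
    fs["mavconn_url"]    >> mavconn_url;     // connection URL
    fs["vision_gps_msg"] >> vision_gps_msg;  // 0/1/2, see above
    fs["pose_msg_rate"]  >> pose_msg_rate;   // Hz

    fs.release();
    return 0;
}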

t265_mavconn

t265_mavconn depends on librealsense, OpenCV and MAVCONN.

To build librealsense, follow the instructions at:

https://github.com/IntelRealSense/librealsense/blob/master/doc/installation.md

To build OpenCV from source, follow the instructions at:

https://docs.opencv.org/master/d7/d9f/tutorial_linux_install.html

To build MAVCONN, follow the instructions in the relevant paragraph of this post.

After installing the prerequisites, it is possible to build the project in this way:

$ git clone https://github.com/anbello/t265_mavconn
$ cd t265_mavconn
$ mkdir build
$ cd build
$ cmake ..
$ make
$ cp ../data/* .

To start the program:

$ t265_mavconn --lp=launch.yml

Here is an example of a launch file (lines beginning with '#' are ignored):

%YAML:1.0
camera_orientation: 0
pose_msg_rate: 15
vision_gps_msg: 1 # 0 -> no msg | 1 -> vision msg | 2 -> gps msg
offset_x: 0.10
offset_y: 0.0
offset_z: 0.0
scale_factor: 1.0
#mavconn_url: "tcp://192.168.10.16:2000"
mavconn_url: "/dev/ttyACM0:115200"

Useful parameters

In both programs the vision_gps_msg parameter (in the launch file) permits choosing between VISION_POSITION_ESTIMATE (1) and GPS_INPUT (2) messages. With vision_gps_msg: 2, both pose and velocity are sent to ArduCopter. In aruco_mavconn the velocity is calculated from the current and previous positions (see the sketch below); in t265_mavconn the velocity comes from the T265 device through librealsense.
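
As an illustration of the finite-difference velocity used in aruco_mavconn, a minimal sketch (types and names are illustrative):

// Estimate NED velocity from two successive position samples.
// dt_s must be the real elapsed time between the two detections.
struct Vec3 { double x, y, z; };

Vec3 velocity_from_positions(const Vec3 &prev, const Vec3 &curr, double dt_s)
{
    return { (curr.x - prev.x) / dt_s,
             (curr.y - prev.y) / dt_s,
             (curr.z - prev.z) / dt_s };
}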

When using the VISION_POSITION_ESTIMATE message, the following parameters should be set in ArduCopter:

AHRS_EKF_TYPE = 2
EK2_ENABLE = 1
EK3_ENABLE = 0
EK2_GPS_TYPE = 3
EK2_POSNE_M_NSE = 0.1
EK2_VELD_M_NSE = 0.1
EK2_VELNE_M_NSE = 0.1
GPS_TYPE = 0
COMPASS_USE = 0
COMPASS_USE2 = 0
COMPASS_USE3 = 0

When using the GPS_INPUT message, the following parameters should be set in ArduCopter:

AHRS_EKF_TYPE = 3
EK2_ENABLE = 0
EK3_ENABLE = 1
EK3_ALT_SOURCE = 2
EK3_GPS_TYPE = 0
EK3_MAG_CAL = 5
EK3_POSNE_M_NSE = 0.1
EK3_VELD_M_NSE = 0.1
EK3_VELNE_M_NSE = 0.1
GPS_TYPE = 14
GPS_DELAY_MS = 50
COMPASS_USE = 0
COMPASS_USE2 = 0
COMPASS_USE3 = 0

Building MAVCONN library on Linux (Ubuntu 16.04 / 18.04)

$ sudo apt-get install catkin libconsole-bridge-dev libboost-dev libboost-system-dev python-dev python3-dev python-future

$ git clone https://github.com/mavlink/mavlink-gbp-release.git
$ cd mavlink-gbp-release
$ git checkout release/melodic/mavlink
$ mkdir build
$ cd build
$ cmake ..
$ make

$ cd ../..

$ export mavlink_DIR=./mavlink-gbp-release/build

$ git clone https://github.com/mavlink/mavros
$ cp -r mavros/libmavconn .
$ cd libmavconn
$ mkdir build
$ cd build
$ cmake ..
$ make
$ sudo make install

$ cd ../..

$ sudo mkdir /usr/local/include/mavlink
$ sudo cp -r mavlink-gbp-release/build/include/v2.0/ /usr/local/include/mavlink
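
To give an idea of how the installed library is used, here is a minimal sketch that opens a connection and sends a single VISION_POSITION_ESTIMATE message. It is based on the libmavconn API as I understand it from the MAVROS sources; treat the exact signatures as assumptions and check them against the headers:

#include <mavconn/interface.h>
#include <chrono>
#include <cstdio>
#include <thread>

using namespace mavconn;

int main()
{
    // Same URL syntax used by MAVROS (serial, tcp://, udp://).
    MAVConnInterface::Ptr fcu = MAVConnInterface::open_url("/dev/ttyACM0:115200");

    // Print the id of every message received from the Flight Controller.
    fcu->message_received_cb =
        [](const mavlink::mavlink_message_t *msg, const Framing framing) {
            std::printf("got msgid %u\n", msg->msgid);
        };

    mavlink::common::msg::VISION_POSITION_ESTIMATE vpe{};
    vpe.usec = std::chrono::duration_cast<std::chrono::microseconds>(
        std::chrono::system_clock::now().time_since_epoch()).count();
    vpe.x = 0.0f;    // north (m)
    vpe.y = 0.0f;    // east (m)
    vpe.z = 0.0f;    // down (m)
    vpe.roll = vpe.pitch = vpe.yaw = 0.0f;

    fcu->send_message_ignore_drop(vpe);

    // Give the I/O thread time to flush before exiting.
    std::this_thread::sleep_for(std::chrono::seconds(1));
    return 0;
}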

We want videos now! :stuck_out_tongue_winking_eye:


This was made using aruco_mavconn sending GPS_INPUT messages with pose and velocity.
On the 3 inch Copter there are a revo-mini, a Raspberry Pi Zero and a Raspberry Pi Cam.
The video is streamed via WiFi to a PC where aruco_mavconn is running, so, because of the relatively high latency, position control is not optimal.

The landing at the end of the video is due to the battery failsafe.

Next tests:

  • another 3 inch Copter with a Raspberry Pi 3 and a Raspberry Pi Cam, so that the position control loop is closed on board (no WiFi latency problem);
  • another 5 inch Copter with a Raspberry Pi 3 and a T265.

Very cool! I'm glad it works!


Another video, this time with a 5 inch Copter with a Pixracer, a Raspberry Pi 3 and a T265. Here I used t265_mavconn sending VISION_POSITION_ESTIMATE messages.

Local position x, y, z from the log of this flight:

This is really interesting; it would be nice if you could compare the results from both T265 methods.

Thanks @anbello for your efforts to make vision odometry a viable option for navigation systems.

Thanks to you @ppoirier.

Regarding the two methods, I have to say that while in the system with ArUco I see an improvement using the GPS_INPUT message with velocity, this is not true with the T265. I have to do more testing and parameter optimization.

Maybe it could be useful to filter the velocity before sending it to ArduPilot. I tried with a first-order filter but without success; I see that @tridge uses a second-order filter for velocity with the Vicon system.

The T265 gives pose and velocity messages at 200 Hz; we send these to ArduPilot at a maximum of 30 Hz, so we could benefit from filtering, maybe also on the pose. I will do this kind of test; a sketch of what I have in mind is below.
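
For the kind of filter I have in mind, here is a minimal sketch of a second-order low-pass built as two cascaded first-order stages (the cutoff value and the class are mine, not @tridge's actual filter):

#include <cmath>

// Two cascaded first-order low-pass stages give a simple
// second-order response. alpha comes from the cutoff frequency
// and the sample period.
class LowPass2 {
public:
    LowPass2(double cutoff_hz, double sample_hz)
    {
        const double rc = 1.0 / (2.0 * M_PI * cutoff_hz);
        const double dt = 1.0 / sample_hz;
        alpha_ = dt / (dt + rc);
    }

    double apply(double sample)
    {
        s1_ += alpha_ * (sample - s1_);  // first stage
        s2_ += alpha_ * (s1_ - s2_);     // second stage
        return s2_;
    }

private:
    double alpha_ = 0.0;
    double s1_ = 0.0, s2_ = 0.0;
};

// Example: one filter per T265 velocity component at 200 Hz,
// e.g. LowPass2 vx_filter(5.0, 200.0);  // 5 Hz cutoff, hypothetical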


I added the possibility to send the VISION_POSITION_DELTA message to t265_mavconn in this branch:

In the launch file the parameter vision_gps_msg should be set to 3.

In ArduCopter I used these params (I list only the important ones or those different from default):

AHRS_EKF_TYPE = 3
EK2_ENABLE = 0
EK3_ENABLE = 1
EK3_ALT_SOURCE = 0
EK3_GPS_TYPE = 3
EK3_POSNE_M_NSE = 0.2
EK3_VELD_M_NSE = 0.2
EK3_VELNE_M_NSE = 0.2
GPS_TYPE = 0
COMPASS_USE = 0
COMPASS_USE2 = 0
COMPASS_USE3 = 0
VISO_TYPE = 1
VISO_ORIENT = 0 # facing forward
VISO_POS_X, VISO_POS_Y, VISO_POS_Z # camera offset

The results are mixed: I can take off in Loiter and the Copter has good attitude and position control; if I move the Copter only with roll, pitch and thrust the control continues to be good, but if I turn the Copter with yaw I get toilet bowling.

It is the first time I have tried to control position with the VISION_POSITION_DELTA message, so I don't know if the problem described comes from some error in my program or from wrong or non-optimal parameters. Maybe I have to filter the deltas before sending them in the message.

Here is the log of a short indoor flight; I would like some suggestions to improve the system.


I merged two improvements into the master of t265_mavconn.

One comes from @LuckyBird (thanks) and is a fix to the code that sends the VISION_POSITION_DELTA message; now it works well.

The other is a fourth method that sends both GPS_INPUT and VISION_POSITION_DELTA messages, enabled by setting the launch parameter vision_gps_msg to 4.
In this way we simulate fusing positioning data coming from a GPS with data coming from visual odometry, using the T265 for both pose and velocity. I think this could be of interest to @rmackay9.

In the short indoor test flight I did, I saw good position and attitude control.
Here is a log.


@anbello thank you for this. I was able to compile your test_mavlink.cpp and confirm communication with my APM/Copter. I am also working on a computer vision application and needed a C++ connection to MAVLink. I have been struggling with implementing a command instead of a message. For example, I would like to send MAV_CMD_DO_SET_SERVO, which is in common.h as 183, but the compiler keeps stating that MAV_CMD_DO_SET_SERVO cannot be found. Have you had success with RC overrides / set servo?

I was trying:

mavlink_command_long_t set_servo = {0};
set_servo.target_system = 1;
set_servo.target_component = 0;
set_servo.command = int(mavlink::common::MAV_CMD_DO_SET_SERVO);
set_servo.Instance = 7;
set_servo.PWM = 1400;
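
Based on the C++11-style generated headers, I think it should look more like the following sketch, using the MAV_CMD enum class and param1/param2 instead of Instance/PWM, but I am not sure:

#include <mavconn/interface.h>

// Sketch only: per the MAVLink docs, DO_SET_SERVO takes the servo
// instance in param1 and the PWM value in param2.
mavlink::common::msg::COMMAND_LONG cmd{};
cmd.target_system = 1;
cmd.target_component = 0;
cmd.command = static_cast<uint16_t>(mavlink::common::MAV_CMD::DO_SET_SERVO);
cmd.confirmation = 0;
cmd.param1 = 7;     // servo instance
cmd.param2 = 1400;  // PWM

// fcu->send_message_ignore_drop(cmd);  // fcu from MAVConnInterface::open_url()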

I appreciate any input.

I get an error at the first step:

catkin : Depends: python-catkin-pkg but it is not going to be installed
E: Unable to correct problems, you have held broken packages.

I searched for this error but couldn't solve it. I tried installing the other libs while skipping the first step, but then the second cmake .. command failed with an error about missing ament_cmake configs. ament_cmake belongs to ROS 2 only, while I am using ROS Melodic on Ubuntu 18.04 LTS, which depends on catkin.
I will try to use it as a ROS package.