Integration of ArduPilot and VIO tracking camera (Part 1): Getting started with the Intel Realsense T265 on Raspberry Pi 3B

Hi, I'm a complete beginner. Will I be able to do this project?

Hello

If you already have a flying vehicle with ArduPilot, you need knowledge of these three fundamental systems:

  • The ArduPilot flight controller: how to set advanced parameters, read the logs, and set up communication using serial devices
  • Companion computers, like a Raspberry Pi: how to configure a Linux environment and install packages or build from source
  • The Python programming language

You need to go step by step and start with a fully functional flying platform before trying to implement a project like this. I suggest you start by reading the excellent ArduPilot wiki, as most of the stuff you need to know is already there. In fact, this series of blog posts will become part of the wiki.

Thank you, I will look at the link. But now I have a problem with pyrealsense2. I know that sudo pip3 install pyrealsense2 doesn't work, so I tried to rebuild librealsense 2.24.0 with -DBUILD_PYTHON_BINDINGS=TRUE in cmake, but I have a problem when I run make:

make[1]: *** [wrappers/python/CMakeFiles/pyrealsense2.dir/all] Error 2

Hi @Tarek-H, I read above that you are using an Odroid XU4, correct? In that case, you might need to follow this instruction page.
It looks like Intel does not officially support the Odroid line of devices. Furthermore, there are several known issues with running librealsense on Odroid, so you will have to try some patches and see how it goes.

Hi, yes, I use an Odroid XU4. It works very well for me, but I use C++, not Python, so I rebuilt the library to get pyrealsense2, but I have an error for pyrealsense2 during make. I will try to clean and completely uninstall the library and redo the install instructions.

Just want to make sure: are you building the main CMakeLists.txt or the one inside the python wrapper folder?

You should navigate to the librealsense root directory and run the top-level CMake command with the additional flag -DBUILD_PYTHON_BINDINGS, for example:

cd /path/to/librealsense
mkdir build
cd build
cmake ../ -DBUILD_PYTHON_BINDINGS=bool:true

I am building in the build folder inside the librealsense folder. My librealsense folder is in catkin_ws/src because before I had a problem with the link between ROS and librealsense. Now I want to use pyrealsense2, so when I run cmake ../ -DBUILD_PYTHON_BINDINGS=bool:true (in the build folder) and then make, I get an internal error because of pyrealsense2, and sometimes during the make execution the Odroid freezes!

I believe that means the Odroid runs out of memory during the build. Can you try again with make -j1 instead? Note that the build can take quite some time to complete.

Yes, I use make -j1. The error that I have is:
make[2]: *** [wrappers/python/CMakeFiles/pyrealsense2.dir/python.cpp.o] Error 4
CMakeFiles/Makefile2:233: recipe for target ‘wrappers/python/CMakeFiles/pyrealsense2.dir/all’ failed
make[1]: *** [wrappers/python/CMakeFiles/pyrealsense2.dir/all] Error 2
Makefile:127: recipe for target ‘all’ failed

Maybe it is about librealsense 2.24.0 and I must change the version, or something is missing in Python.

Unfortunately, I have tested with Odroid XU4 and did not encounter your issue. Maybe you can delete the build folder and try again.

Below are the steps that worked for me.

Install librealsense and pyrealsense2 on Odroid XU4 running Ubuntu 16.04:

  1. Upgrade the kernel to 4.14: I followed the instructions here. After that, my Odroid's kernel is:
$ uname -a
Linux odroid 4.14.111-139 #1 SMP PREEMPT Tue Apr 16 17:31:00 UTC 2019 armv7l armv7l armv7l GNU/Linux
  2. Build librealsense from source:
# Make Ubuntu Up-to-date: update Ubuntu distribution, including getting the latest stable kernel:
sudo apt-get update && sudo apt-get upgrade && sudo apt-get dist-upgrade 

# Download the complete source tree with git
cd 
git clone https://github.com/IntelRealSense/librealsense.git

# Install the core packages required to build librealsense binaries and the affected kernel modules:
sudo apt-get install git libssl-dev libusb-1.0-0-dev pkg-config libgtk-3-dev 

# Distribution-specific packages, for Ubuntu 16:
sudo apt-get install libglfw3-dev

Important: Unplug any connected Intel RealSense camera before continuing:

# Navigate to librealsense root directory to run the following scripts.
cd librealsense

# Run Intel Realsense permissions script located from librealsense root directory:
./scripts/setup_udev_rules.sh

# Patch script for Odroid XU4 with the Ubuntu 16.04 / 4.14 image, based on the custom kernel provided by Hardkernel
./scripts/patch-realsense-ubuntu-odroid.sh

# Install Python and its development files via apt-get (Python 2 and 3 both work)
sudo apt-get install python python-dev && sudo apt-get install python3 python3-dev

# Run the top level CMake command with the following additional flag -DBUILD_PYTHON_BINDINGS=bool:true:
cd ~/librealsense
mkdir build
cd build
cmake ../ -DBUILD_PYTHON_BINDINGS=bool:true
make -j4
sudo make install
sudo ldconfig

sudo reboot
  3. Check installation and test pyrealsense2:
$ ls /usr/local/lib

You should see librealsense2.so.2.24 and pyrealsense2.cpython-35m-arm-linux-gnueabihf.so. Then you can test Python code:

# update your PYTHONPATH environment variable to add the path to the pyrealsense library
export PYTHONPATH=$PYTHONPATH:/usr/local/lib

# Navigate to python example folder
cd ~/librealsense/wrappers/python/examples/

# Run examples:
python3 t265_example.py

Then you will see a stream of data coming from the T265, for example:

Frame #49
Position: x: -3.80651e-05, y: -0.000489848, z: 6.75867e-05
Velocity: x: -0.000446847, y: 0.000521675, z: -0.000202993
Acceleration: x: -0.00395986, y: 0.0395705, z: 
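
For reference, t265_example.py essentially opens a pose stream and prints each frame; here is a minimal pyrealsense2 sketch along those lines that you can adapt (the 50-frame loop is arbitrary):

import pyrealsense2 as rs

# Open a pose stream from the T265
pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)
pipe.start(cfg)

try:
    for _ in range(50):
        # Wait for the next set of frames from the camera
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if pose:
            data = pose.get_pose_data()
            print('Frame #{}'.format(pose.frame_number))
            print('Position: {}'.format(data.translation))
            print('Velocity: {}'.format(data.velocity))
            print('Acceleration: {}'.format(data.acceleration))
finally:
    pipe.stop()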

Let me know if this works for you.

Thank you so much, I fixed the pyrealsense2 problem. Before, I didn't use git clone https://github.com/IntelRealSense/librealsense.git (master branch); I downloaded the zip files for a specific version (2.24.0), which matches a specific realsense-ros version too. Now, with the master branch, I managed to build the realsense library with pyrealsense2. I tested t265_example.py and it works.

Hello, I'm following this guide, but in my case I'm giving the Intel Up board with Ubuntu 16.04 a shot. The step that is giving me some trouble is the installation of APSync to connect the board to ArduPilot using MAVROS. Can you help me with this?

Can you give us more details about your installation process and what errors you are having?

Yes, I was able to set up everything until APSync. The images provided are only for the TX1, TX2, and Raspberry Pi 3, but I'm using the Intel Up board. I'm using the TX/RX pins from the GPIO to connect to my Pixhawk, but the APSync step is a little bit confusing. Thanks for your response.

Since the Up board is not supported yet in APSync, I think you have to follow the normal process to set up the connection with MAVROS. Here's a rough guideline from this very detailed blog post:

  • Set up the flight controller: Connect the FCU to Mission Planner. Go to Config/Tuning > Full Parameter List and modify the parameters for the serial connection, for example:
SERIAL2_PROTOCOL = 1	to enable MAVLink on the serial port.
SERIAL2_BAUD = 57		57600; the baud rate can be set as high as 921600.

The port can be checked by typing the following command in a terminal:

ls /dev/tty*

/dev/ttySx or /dev/ttyAMAx should be used to communicate with the FCU. In my case, it's /dev/ttyS0.

  • Connect to the companion computer (the Up board in your case) via USB or GPIO. If you have a USB-to-microUSB cable, simply connect the Up board and the FCU; the Up board should be able to recognize it. Run the following code snippet in a terminal to check for connected devices:
for sysdevpath in $(find /sys/bus/usb/devices/usb*/ -name dev); do
    (
        syspath="${sysdevpath%/dev}"
        devname="$(udevadm info -q name -p $syspath)"
        [[ "$devname" == "bus/"* ]] && continue
        eval "$(udevadm info -q property --export -p $syspath)"
        [[ -z "$ID_SERIAL" ]] && continue
        echo "/dev/$devname - $ID_SERIAL"
    )
done
  • Launch MAVROS

Open a terminal. Type:

roslaunch mavros apm.launch fcu_url:=/dev/ttyS0:57600

You may change the launch settings according to your own setup:

roslaunch mavros apm.launch fcu_url:=<tty port>:<baud rate>

If a "permission denied" error is shown after roslaunch, type:

sudo chmod 666 /dev/<tty port>
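
If you want to sanity-check the serial link before involving ROS, a quick pymavlink heartbeat test also works. This is a minimal sketch, assuming pymavlink is installed (sudo pip3 install pymavlink) and the FCU is on /dev/ttyS0 at 57600 baud; adjust the port and baud rate to your setup:

from pymavlink import mavutil

# Open the serial connection to the flight controller
master = mavutil.mavlink_connection('/dev/ttyS0', baud=57600)

# Block until the first HEARTBEAT arrives, which proves the link works
master.wait_heartbeat()
print('Heartbeat from system %u, component %u'
      % (master.target_system, master.target_component))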

Hi!
What do you think about using it on a rover? Actually, I made a boat running Rover firmware, and I want to run the boat in a non-GPS environment. I have considered the differences between a rover and a copter:
1. A rover has no height.
2. A rover pivot-turns, and during a sharp turn the T265 camera may lose tracking.

I have seen some implementations of the T265 on rovers, so it's definitely possible. Keep in mind that this is a vision-based device; here are my 2 cents:

  • Make sure that the T265 is not located too close to the ground, so that the field of view can cover as much of the environment as possible.

  • The tracking quality and the USB connection are heavily affected by vibration, so damping might be necessary to limit the amount of vibration exerted on the camera. The less vibration, the better.

  • Under rapid movement, tracking can be lost temporarily. However, the SLAM algorithm on the T265 can relocalize using a previously built map of the environment, so tracking can recover when the movement slows down. If you are running on the same track/environment, you can build a detailed map during testing, save (export) it, and reuse (import) it later on; see the sketch after this list.

  • Another potential problem is the acceleration range. The datasheet for the Bosch BMI055 IMU used in the D435i and T265 says the component supports ±2g, 4g, 8g, or 16g, but according to the device datasheet the range is ±4g. My guess is that if you accelerate/decelerate outside of this range, the IMU will saturate and the SLAM might fail.
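
For the map save/reuse mentioned above, here is a minimal sketch of the export/import flow. I am assuming your pyrealsense2 build exposes the T265 pose-sensor map calls (first_pose_sensor, export_localization_map, import_localization_map); the file name map.raw is arbitrary:

import pyrealsense2 as rs

ctx = rs.context()
dev = ctx.query_devices()[0]
pose_sensor = dev.first_pose_sensor()

# Import a previously saved map BEFORE starting the pipeline
try:
    with open('map.raw', 'rb') as f:
        pose_sensor.import_localization_map(bytearray(f.read()))
        print('Map imported')
except FileNotFoundError:
    print('No saved map found, starting fresh')

pipe = rs.pipeline(ctx)
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)
pipe.start(cfg)
# ... drive the vehicle / run the mission here ...
pipe.stop()

# Export the (updated) map AFTER stopping the pipeline
with open('map.raw', 'wb') as f:
    f.write(bytearray(pose_sensor.export_localization_map()))
    print('Map exported')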


The T265 requires a feature-rich visual environment nearby with low reflection. Water is a big challenge for visual odometry, as it has no distinctive features and produces lots of sporadic light flashes from the sun reflecting off waves.


I appreciate your advice and LuckyBird's suggestion. Maybe it is not suitable for use on a boat, just as you say:
1. The lack of distinctive feature points, and the flashes.
2. The vibration, and the need for some distance to the ground.
Thank you again.

Working on a long-range visual triangulation system, just like the early sailors used, could be an interesting project. You could use a long-range camera installed on a precise azimuth control system to identify and triangulate from different shore features; this way you could localize and map (SLAM).