Integration of ArduPilot and VIO tracking camera (Part 1): Getting started with the Intel Realsense T265 on Raspberry Pi 3B

Unfortunately, I have tested with the Odroid XU4 and did not encounter your issue. Maybe you can delete the build folder and try again.

Below are the steps that worked for me.

Install librealsense and pyrealsense2 on Odroid XU4 running Ubuntu 16.04:

  1. Upgrade the kernel to 4.14: I followed the instructions here. Afterwards, my Odroid’s kernel was:
$ uname -a
Linux odroid 4.14.111-139 #1 SMP PREEMPT Tue Apr 16 17:31:00 UTC 2019 armv7l armv7l armv7l GNU/Linux
  2. Build librealsense from source:
# Make Ubuntu Up-to-date: update Ubuntu distribution, including getting the latest stable kernel:
sudo apt-get update && sudo apt-get upgrade && sudo apt-get dist-upgrade 

# Download the complete source tree with git
cd 
git clone https://github.com/IntelRealSense/librealsense.git

# Install the core packages required to build librealsense binaries and the affected kernel modules:
sudo apt-get install git libssl-dev libusb-1.0-0-dev pkg-config libgtk-3-dev 

# Distribution-specific packages, for Ubuntu 16:
sudo apt-get install libglfw3-dev

Important: Unplug any connected Intel RealSense camera before continuing:

# Navigate to librealsense root directory to run the following scripts.
cd librealsense

# Run the Intel RealSense permissions script from the librealsense root directory:
./scripts/setup_udev_rules.sh

# For Odroid XU4 with the Ubuntu 16.04 / kernel 4.14 image, based on the custom kernel provided by Hardkernel:
./scripts/patch-realsense-ubuntu-odroid.sh

# Install Python and its development files via apt-get (Python 2 and 3 both work)
sudo apt-get install python python-dev && sudo apt-get install python3 python3-dev

# Run the top level CMake command with the following additional flag -DBUILD_PYTHON_BINDINGS=bool:true:
cd ~/librealsense
mkdir build
cd build
cmake ../ -DBUILD_PYTHON_BINDINGS=bool:true
make -j4
sudo make install
sudo ldconfig

sudo reboot
  3. Check installation and test pyrealsense2:
$ ls /usr/local/lib

You should see librealsense2.so.2.24 and pyrealsense2.cpython-35m-arm-linux-gnueabihf.so. Then you can test Python code:

# update your PYTHONPATH environment variable to add the path to the pyrealsense library
export PYTHONPATH=$PYTHONPATH:/usr/local/lib

# Navigate to python example folder
cd ~/librealsense/wrappers/python/examples/

# Run examples:
python3 t265_example.py

Then you will see a stream of data coming from the T265, for example:

Frame #49
Position: x: -3.80651e-05, y: -0.000489848, z: 6.75867e-05
Velocity: x: -0.000446847, y: 0.000521675, z: -0.000202993
Acceleration: x: -0.00395986, y: 0.0395705, z: 
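
For reference, the core of t265_example.py boils down to a loop like the sketch below. This is a minimal sketch based on the pyrealsense2 pipeline API, not a substitute for the actual example file in the repo:

import pyrealsense2 as rs

# Declare the RealSense pipeline and request only the pose stream
pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)
pipe.start(cfg)

try:
    for _ in range(50):
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if pose:
            data = pose.get_pose_data()
            print("Frame #{}".format(pose.frame_number))
            print("Position: {}".format(data.translation))
            print("Velocity: {}".format(data.velocity))
            print("Acceleration: {}".format(data.acceleration))
finally:
    pipe.stop()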

Let me know if this works for you.

Thank you so much, I fixed the pyrealsense problem. Before, I didn’t use git clone https://github.com/IntelRealSense/librealsense.git (master branch); I had downloaded the zip files for a specific version (2.24.0), which matches a specific realsense-ros version too. Now with the master branch I managed to build the RealSense library with pyrealsense, and I tested t265_example.py: it works.

Hello, I’m following this guide, but in my case I’m giving the Intel Up board with Ubuntu 16.04 a shot. The step that is giving me some trouble is the installation of APSync to connect the board to ArduPilot using MAVROS. Can you help me with this?

Can you give us more details about your installation process and what errors you are having?

Yes, I was able to set up everything until APSync. The images provided are only for TX1, TX2, and Raspberry Pi 3, but I’m using the Intel Up board. I’m using the TX/RX pins from the GPIO to connect to my Pixhawk, but the APSync step is a little bit confusing. Thanks for your response.

Since the Up board is not yet supported by APSync, I think you have to follow the normal process to set up the connection with MAVROS. Here’s a rough guideline from this very detailed blog post:

  • Set up the flight controller: Connect the FCU to Mission Planner. Go to Config/Tuning > Full Parameter List and modify the parameters for the serial connection, for example:
SERIAL2_PROTOCOL = 1	to enable MAVLink on the serial port.
SERIAL2_BAUD = 57	57600; the baud rate can be up to 921600.

The port can be checked by typing the following command in a terminal:

ls /dev/tty*

/dev/ttySx or /dev/ttyAMAx should be used to communicate with the FCU. In my case, it’s /dev/ttyS0. (See the pymavlink sketch at the end of this post for a quick way to verify the link.)

  • Connect to the companion computer (the Up board in your case): via USB or GPIO. If you have a USB-to-micro-USB cable, simply connect the Up board and the FCU. The Up board should be able to recognize the FCU. Run the following code snippet in a terminal to check for connected devices:
for sysdevpath in $(find /sys/bus/usb/devices/usb*/ -name dev); do
    (
        syspath="${sysdevpath%/dev}"
        devname="$(udevadm info -q name -p $syspath)"
        [[ "$devname" == "bus/"* ]] && continue
        eval "$(udevadm info -q property --export -p $syspath)"
        [[ -z "$ID_SERIAL" ]] && continue
        echo "/dev/$devname - $ID_SERIAL"
    )
done
  • Launch MAVROS

Open a terminal. Type:

roslaunch mavros apm.launch fcu_url:=/dev/ttyS0:57600

You may change the launch settings according to your own setup:

roslaunch mavros apm.launch fcu_url:=<tty port>:<baud rate>

If “permission denied” is shown after roslaunch, type:

sudo chmod 666 /dev/<tty port>
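
As a side note (not part of the original guide), you can also sanity-check that MAVLink traffic is actually arriving on the port with a few lines of pymavlink before launching MAVROS. The port and baud rate below are just examples; adjust them to your own setup:

from pymavlink import mavutil

# Quick serial-link check (requires: pip install pymavlink)
# Adjust the port and baud rate to match your SERIALx_* settings
master = mavutil.mavlink_connection('/dev/ttyS0', baud=57600)
master.wait_heartbeat()  # blocks until the FCU's heartbeat arrives
print("Heartbeat from system %u component %u" %
      (master.target_system, master.target_component))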

Hi!
What do you think about using it on a rover? I actually made a boat running Rover firmware, and I want to run the boat in a non-GPS environment. I have considered the differences between rover and copter:
1. A rover has no height.
2. A rover makes pivot turns, so the T265 camera will experience sharp turns and may lose tracking.

I have seen some implementations of the T265 on rovers, so it’s definitely possible. Keep in mind that this is a vision-based device; here’s my 2 cents:

  • Make sure that the T265 is not located too close to the ground, so that the field of view can cover as much of the environment as possible.

  • The tracking quality and the USB connection are heavily affected by vibration, thus damping might be necessary to limit the amount of vibration exerted on the camera. The less vibration the better.

  • Under rapid movement, tracking can be lost temporarily. However, the SLAM algorithm on the T265 can relocalize using a previously built map of the environment, so tracking can recover when the movement slows down. If you are running in the same track/environment, you can build a detailed map during testing, save (export) it, and reuse (import) it later on (see the sketch after this list).

  • Another potential problem is the acceleration range. The datasheet of the Bosch BMI055 IMU used in the D435i and T265 says the component supports ±2g, ±4g, ±8g or ±16g, but according to the device datasheet the T265’s range is ±4g. My guess is that if you accelerate/decelerate outside of this range, the IMU will saturate and the SLAM might fail.
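
To illustrate the save/reuse idea above, here is a rough sketch using the localization map API that recent pyrealsense2 builds expose on the pose sensor. Treat the exact calls as something to verify against your librealsense version, and note that map.raw is just an example file name:

import pyrealsense2 as rs

# Rough sketch of localization map import/export on the T265
pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)

# Grab the pose sensor before starting the pipeline
profile = cfg.resolve(rs.pipeline_wrapper(pipe))
pose_sensor = profile.get_device().first_pose_sensor()

# Import a map saved from a previous run (must be done before streaming)
with open('map.raw', 'rb') as f:
    pose_sensor.import_localization_map(bytearray(f.read()))

pipe.start(cfg)
# ... run your mission, let the SLAM refine the map ...
pipe.stop()

# Export the refined map for the next run
with open('map.raw', 'wb') as f:
    f.write(bytearray(pose_sensor.export_localization_map()))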

The T265 requires a rich visual environment nearby with low reflection. Water is a big challenge for visual odometry as it has no specific features and lots of sporadic light flashes caused by sun reflection on waves.

I appreciate your advice and LuckyBird’s suggestion. Maybe it is not suitable for use on a boat, just as you say:
1. the lack of distinct feature points, and the light flashes;
2. the vibration, and the need to keep some distance from the ground.
Thank you again.

Working on a long-range visual triangulation system, just like the early sailors did, could be an interesting project. You could use a long-range camera installed on a precise azimuth control system to identify and triangulate from different shore features; this way you could localize and map (SLAM).

Thank you! It worked well using the USB method. However, I also tried with the serial pins and there is no connection. I’m using the GND, TX and RX pins, but I’m not sure about the tty port assigned on the Up board.

Using USB, I can establish the connection and start sending the pose estimation from the T265 camera. However, in the QGroundControl interface on Ubuntu there is no widget button to open the MAVLink Inspector and check whether ArduPilot is receiving position data by viewing the VISION_POSITION_ESTIMATE topic. In Mission Planner, Ctrl+F is not working either. I downloaded both from the official sites. On Windows I am able to open the MAVLink Inspector, but since I’m using USB I cannot connect both the companion computer and my laptop. Any advice?

For QGroundControl you can use the MAVLink Analyzer widget, see here.
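
If the widget is not available in your build, one alternative (my suggestion, not an official tool) is to listen for the message directly with pymavlink from a spare MAVLink endpoint, e.g. a UDP output added in MAVProxy with output add 127.0.0.1:14551:

from pymavlink import mavutil

# Print every VISION_POSITION_ESTIMATE message seen on the link
master = mavutil.mavlink_connection('udpin:127.0.0.1:14551')
master.wait_heartbeat()
while True:
    msg = master.recv_match(type='VISION_POSITION_ESTIMATE', blocking=True)
    print(msg)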

@LuckyBird I believe widgets were removed in the latest versions (I’m on master).

@LuisVale Thanks for the info. Do you know of any alternative functions in the latest version for it?

I just downloaded the executable file from here and the widgets are still there, so @Eduardo92 you can still give it a try.

Thank you! It is working now! I was also able to configure the Bluetooth HC-05 module! The estimated pose is being received by ArduPilot, so I’m ready to fly! Just one more thing: I’m starting the modules over SSH from my laptop. Is there a way to get the RViz window through SSH, similar to the Python windows?

I am trying to use the T265 camera with a Raspberry Pi 3 for a project similar to the one described in this post, but I get the error below after a few minutes:

19:57:13.679 [2601] [E] Device-63B0: FW crashed - got error in interrupt endpoint thread function: status = -1 (LIBUSB_ERROR_IO), actual = 0
19:57:13.932 [2592] [E] Device-63B0: State [ACTIVE_STATE] got event [ON_ERROR] ==> [ERROR_STATE]
19:57:13.932 [2592] [E] Device-63B0: Entered state [ERROR_STATE]

I tried with Ubuntu MATE and Ubuntu Server, as well as updating the RealSense library (I used v2.23, v2.24 and v2.27), but got the same error.

Did anybody face the same issue?

Hi @jcgarciaca, this might be a USB cable issue.

  • Are you using the original USB cable that came with the T265?
  • Are you using a USB hub? Can you try connecting directly to the computer’s USB 3.0 port and see how it goes?

Hi @LuckyBird, thank you for your answer.

  • Yes, I am using the original cable
  • I am not using any hub; I’m connecting directly to the Raspberry Pi’s USB port. I tried with my PC (USB 3.0 port) and it doesn’t show this error. However, the Raspberry Pi only has USB 2.0 ports.

@jcgarciaca, are you running applications that require images? If so, USB 2.0 is not enough, since USB 3.0 is mandatory to access the camera’s image streams, and the RPi will crash. For the use case described in this blog we are not using images, only the pose stream, hence the RPi is sufficient.
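
In other words, the stream configuration decides the bandwidth requirement. As a small illustration (stream indices as in the librealsense T265 examples):

import pyrealsense2 as rs

cfg = rs.config()
cfg.enable_stream(rs.stream.pose)        # pose only: fine over USB 2.0
# Uncomment to also request the fisheye images; per the discussion above,
# these streams need a USB 3.0 connection:
# cfg.enable_stream(rs.stream.fisheye, 1)
# cfg.enable_stream(rs.stream.fisheye, 2)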