The goal of this project is to build a small, affordable, easy-to-set-up VIO system that can be carried by a small drone. A VIO system usually requires a powerful CPU or GPU for the image-processing and feature-tracking work. In this project, I offload the computer-vision jobs to the OAK-D and let the Raspberry Pi focus on pose estimation. It is inspired by Sara Lucia Contreras Ojeda’s thesis “Robot pose calculation based on Visual Odometry using Optical flow and Depth map” and “SLAM with OAK” from the Luxonis web site.
Hardware:
Luxonis OAK-D (I think the OAK-D Lite is lighter and cheaper and therefore more suitable, but I only have a second-hand OAK-D here)
Raspberry Pi 4
Software:
ArduCopter 4.4.4 is used here
A slightly modified VINS-Fusion
A small program that processes feature-tracking results from the OAK-D
Another small program that sends VIO data to ArduPilot
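To give an idea of what the host-side feature-tracking program has to do: the OAK-D runs the feature tracker on-device and streams out per-feature IDs and pixel positions, so the Raspberry Pi only keeps per-ID track histories to feed VINS-Fusion. The sketch below is just an illustration of that bookkeeping, not the actual code; the `TrackedFeature` class is a stand-in for DepthAI's real message type.

```python
from dataclasses import dataclass, field

@dataclass
class TrackedFeature:
    """Stand-in for DepthAI's per-feature message: a stable ID plus pixel position."""
    id: int
    x: float
    y: float

@dataclass
class TrackBook:
    """Remembers the last position of each feature ID so per-frame motion can be computed."""
    tracks: dict = field(default_factory=dict)  # id -> (x, y) from the previous frame

    def update(self, features):
        """Return {id: (dx, dy)} pixel motion for features also seen in the previous frame."""
        motion = {}
        current = {}
        for f in features:
            current[f.id] = (f.x, f.y)
            if f.id in self.tracks:
                px, py = self.tracks[f.id]
                motion[f.id] = (f.x - px, f.y - py)
        self.tracks = current  # IDs that disappeared are dropped (track lost)
        return motion
```

For example, a feature with ID 1 seen at (10, 10) and then at (12, 11) yields a motion of (2, 1) pixels; brand-new IDs produce no motion entry until their second frame.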
ArduPilot setup
SERIAL1_PROTOCOL 2 (MAVLink 2)
SERIAL1_BAUD 1500 (i.e. 1.5 Mbaud)
EK3_SRC1_POSXY 6
EK3_SRC1_VELXY 6
EK3_SRC1_VELZ 0 (you can set it to 6 once test results are stable enough)
EK3_SRC1_POSZ 1
EK3_SRC1_YAW 6
VISO_TYPE 1
VISO_POS_M_NSE 0.5 (you can lower it once test results are good enough)
VISO_VEL_M_NSE 0.5
VISO_YAW_M_NSE 0.3
VISO_DELAY_MS 60
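For anyone wiring this up: the bridge program essentially just converts the VINS-Fusion pose (ROS ENU convention) into ArduPilot's NED frame and sends VISION_POSITION_ESTIMATE over SERIAL1. A minimal sketch using pymavlink; the port name, the zeroed roll/pitch, and the exact frame handling are assumptions you should check against your own setup:

```python
import math

def enu_to_ned(x, y, z, yaw):
    """Map an ENU pose (ROS: x=East, y=North, z=Up, yaw CCW from East)
    to NED (ArduPilot: x=North, y=East, z=Down, yaw CW from North)."""
    yaw_ned = math.pi / 2.0 - yaw
    # wrap to (-pi, pi]
    yaw_ned = math.atan2(math.sin(yaw_ned), math.cos(yaw_ned))
    return y, x, -z, yaw_ned

def send_pose(master, usec_ts, x, y, z, yaw):
    """Send one VISION_POSITION_ESTIMATE over an open pymavlink connection.
    Roll/pitch are zeroed here for brevity; real code should pass the full attitude."""
    master.mav.vision_position_estimate_send(usec_ts, x, y, z, 0.0, 0.0, yaw)

# Typical wiring (needs hardware, so only shown as a comment):
#   from pymavlink import mavutil
#   master = mavutil.mavlink_connection("/dev/serial0", baud=1500000)
#   master.wait_heartbeat()
#   send_pose(master, timestamp_usec, *enu_to_ned(px, py, pz, pyaw))
```

The timestamp should come from a monotonic clock so ArduPilot can apply VISO_DELAY_MS consistently.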
Thanks again for this, it’s very interesting. I think maybe we should consider putting this on the wiki as an alternative/replacement for the T265 support we have, now that the T265 is no longer available.
One of the key issues of the T265 (and also the ModalAI VOXL) is they can behave very badly when they start losing their position estimate. Any opinions on how this system works when things go wrong?
I guess this uses the distances from the OAK-D camera, meaning that it only works well if there are objects that it can track within about 15m or so?
BTW, I have been thinking about how to create a VIO system that works at high altitudes for cases where the GPS is lost. I have been thinking of using a downward facing camera gimbal (the Xacti in particular) and then run an optical flow algorithm (running on an RPI4/5) on the video. The idea of using segmentation or feature tracking to improve the optical flow had also crossed my mind. Any advice is greatly appreciated.
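As a rough sketch of the kind of displacement estimation I have in mind for a downward-facing camera, plain phase correlation between consecutive frames gives the image translation (pure NumPy, just an illustration; a real pipeline would more likely use a pyramidal Lucas-Kanade tracker, and must scale the pixel shift by height over focal length to get metres):

```python
import numpy as np

def estimate_shift(prev, curr):
    """Estimate the (dy, dx) circular translation taking prev to curr via phase correlation."""
    cross = np.conj(np.fft.fft2(prev)) * np.fft.fft2(curr)
    cross /= np.abs(cross) + 1e-12            # keep phase only (whitened cross-power spectrum)
    corr = np.real(np.fft.ifft2(cross))
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    size = np.array(prev.shape, dtype=float)
    peak[peak > size / 2] -= size[peak > size / 2]   # wrap large indices to negative shifts
    return tuple(peak)
```

For a frame that is simply a rolled copy of the previous one, the correlation peak lands exactly on the shift; on real video the peak is blurred by rotation and parallax, which is where feature tracking or segmentation could help.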
I think maybe we should consider putting this on the wiki as an alternative/replacement for the T265 support we have, now that the T265 is no longer available.
Thank you, I will try to improve this post.
One of the key issues of the T265 (and also the ModalAI VOXL) is they can behave very badly when they start losing their position estimate. Any opinions on how this system works when things go wrong?
I think it is because the T265 keeps sending poses even after it has lost visual feature tracking. I am trying to modify the VINS-Fusion source code to avoid this problem.
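The rough idea of the change I am trying (illustrative only, the real change lives inside VINS-Fusion's C++ estimator and the thresholds here are guesses): stop publishing poses when the tracked-feature count stays low for several frames, so ArduPilot's EKF can reject the source instead of being fed a diverging estimate.

```python
class PoseGate:
    """Forward a pose only while feature tracking looks healthy.

    Sketch of the gating logic; min_features and max_bad_frames are
    placeholder values, not tuned numbers from the actual patch.
    """
    def __init__(self, min_features=12, max_bad_frames=5):
        self.min_features = min_features
        self.max_bad_frames = max_bad_frames
        self.bad_frames = 0

    def healthy(self, num_tracked):
        """Call once per frame; returns False once tracking has been bad for too long."""
        if num_tracked < self.min_features:
            self.bad_frames += 1
        else:
            self.bad_frames = 0
        return self.bad_frames <= self.max_bad_frames
```

Brief dropouts are tolerated, but a sustained loss of features suppresses the pose output until tracking recovers.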
I guess this uses the distances from the OAK-D camera, meaning that it only works well if there are objects that it can track within about 15m or so?
Yes.
BTW, I have been thinking about how to create a VIO system that works at high altitudes
Great job with the pose estimation; I’ve been following your previous work with VINS-Fusion and ArduCopter. Did you manage to attenuate the episodes of the pose estimation losing track through changes in the source code?
I’m having trouble with occasional inconsistencies when using a monocular camera (20 Hz) and RAW_IMU from APM at 100 Hz. Perhaps the time synchronization or the low IMU frequency is affecting the system.
Hello, I want to make a drone that can fly both indoors and outdoors without GPS. Two questions: 1. I want to use VINS, but I can’t do anything because I don’t understand what to do. 2. Could I use optical flow instead?
Hello
I installed a Luxonis OAK-D Pro camera and a Raspberry Pi 5 on the copter.
I also installed a Matek optical flow sensor and a TeraRanger Evo 60m rangefinder.
I configured data reception and transmission on the Raspberry Pi as described in chobitsfan’s documentation. Thanks!
In open space, everything works fine.
But for its tasks, my drone must fly close up to a wall.
Right now, when it approaches the wall, the drone loses its position from the OAK-D camera and its coordinates very quickly fly away.
Please look at the logs: what is wrong with the settings, and how can this be fixed?
The OAK-D Pro camera has active stereo with an IR dot projector. I turned it (and the LED) on in oak_d_vins_cpp; it now runs at 80% power.
When I carry the drone up to the same wall in my hands, everything works fine. Why do the coordinates fly away so quickly only in flight?
The camera has an IMU. Does VINS-Fusion use the IMU data to adjust the position?
The copter has a CUAV X7 plus optical flow.
How can I configure the EKF so that it detects a problem with the OAK and switches to using, for example, optical flow?
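For reference, my current understanding (please correct me if wrong) is that the EKF does not fall back automatically; instead you can define a second source set that uses optical flow and toggle between the sets with an RC auxiliary switch, something like:

EK3_SRC2_POSXY 0
EK3_SRC2_VELXY 5 (OpticalFlow)
EK3_SRC2_VELZ 0
EK3_SRC2_POSZ 1 (Baro)
EK3_SRC2_YAW 1 (Compass)
RC7_OPTION 90 (EKF source set selector; channel 7 is just an example)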
When the acc_n and acc_w parameters are increased, the drone stops holding its position in Loiter; it drifts along all three axes.
How else do you think this problem can be solved?
Is it possible to add another mono camera pointing downwards?
VINS-Fusion provides for calibration of the camera and the IMU.
Can you tell me whether camera/IMU calibration is necessary with your version, and how to do it? There is no information about calibration in your documentation.
Thanks
So I found that my workplace had an OAK-D Lite just lying around, and I was thinking of trying this project out, but I have some questions.
Would this project work with the OAK-D Lite?
I have an RPi 5; it does not support Ubuntu 20.04, which is required for ROS Noetic. Would this project work with ROS 2? This question was already asked in this thread but was not answered.
A possible solution could be to run ROS in a Docker container on a newer Ubuntu version, but I am still curious whether it can run on ROS 2.
Can you comment on how you are powering the OAK-D camera? Are you using both of the RPi’s 5V pins or just one? I am using only one pin, and the RPi seems to crash whenever the OAK-D turns on. I’d like to avoid using a separate power source (battery) just for the OAK-D + RPi.