Experiment with Visual Odometry - ROVIO

The reason the T265 has outclassed the other options is its embedded processing power. The ASIC & Movidius combo can output a 6 DoF pose at 200 Hz; to achieve that on any other system you would need a TX2 or an Intel Pentium-class companion computer.

On the monocular ROVIO experiments I used a $350 processor that weighs over 200 grams and requires 15 watts of power, making the 450-size Flamewheel quadcopter frame quite overloaded. On the T265 experiments I just needed a Banana Pi Zero, making integration on a 330-size frame a piece of cake.

Yes, you are right about the computations, but for now I don’t care about it :slight_smile:

BTW, the DJI Tello also uses a Movidius Myriad 2

This might be interesting
https://devtalk.nvidia.com/default/topic/1070644/jetson-projects/real-time-monocular-vision-based-slam-with-nvidia-jetson-cnn-and-ros/

FastDepth improves on FCRN-ResNet50 with skip connections. In general, I think people working on this topic should study the recent semantic segmentation work.
And this is without IMU fusion

@ppoirier, how about using the Pixhawk IMU and the camera shutter configuration (https://ardupilot.org/copter/docs/common-camera-shutter-with-servo.html)?

See also https://docs.px4.io/v1.9.0/en/peripherals/camera.html#camera-imu-sync-example-vio

Yes, this option works fine; it has been successfully demonstrated by Kabir Mohammed: https://youtu.be/67jV7G2rQrA

Is it possible to do the same with Ardupilot? If so how do I configure Ardupilot to do this?

I’ve stumbled across this while searching for Allan variance information. Could you provide more details on the parameters you used to log the data? I’m getting the same data for the Orange and Black Cubes.

I have not characterized the IMU onboard the FC. I used an MPU9250 and captured IMU data for an extended period once it was warm. Then I processed the bagfile in order to generate the curves and numbers, and extracted these to feed the EKF filter.
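For anyone wanting to reproduce this step, here is a minimal sketch of the Allan deviation computation, assuming you have already extracted the gyro samples from the bagfile into a NumPy array at a fixed sample rate (the function name and interface are illustrative, not from the thread):

```python
import numpy as np

def allan_deviation(data, fs, taus):
    """Non-overlapped Allan deviation of a rate signal.

    data: 1-D array of gyro samples (e.g. rad/s) at fixed rate fs [Hz]
    taus: iterable of averaging times [s] to evaluate
    """
    theta = np.cumsum(data) / fs              # integrate rate -> angle
    adevs = []
    for tau in taus:
        m = int(tau * fs)                     # samples per cluster
        if m < 1 or 2 * m >= len(theta):
            adevs.append(np.nan)              # tau outside usable range
            continue
        # second difference of the integrated signal over clusters of m samples
        d = theta[2 * m:] - 2 * theta[m:-m] + theta[:-2 * m]
        avar = np.sum(d ** 2) / (2.0 * tau ** 2 * (len(theta) - 2 * m))
        adevs.append(np.sqrt(avar))
    return np.array(adevs)
```

Plotting the result on log-log axes gives the familiar Allan deviation curve; the -1/2 slope region gives the angle random walk and the flat region the bias instability, which are the numbers ROVIO's EKF wants.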

That makes way more sense than trying to get ArduPilot to do it. I’ve managed to get samples written to the SD card at 1kHz (out of 8kHz from fast sampling), but that still doesn’t make the cut for inputs needed for Allan Variance. There might be a trick I am missing to get the full data written to the SD card.

Maybe you could ask @tridge on the dev channel: https://gitter.im/ArduPilot/ardupilot
I know he characterized some new FC IMUs recently

Very nice work! Thank you for sharing. Any additional advice on tuning the prediction noise parameter? In my case ROVIO seems to be stable, but the trajectory has a scaling issue depending on how fast I move. Your help is appreciated!

Thanks

With monocular odometry, scale is dependent on the IMU, and with the cheap MEMS units we are using, device parametrisation is an art…

Depending on the use case, you can get good (…better) results when using the APM EKF, as vision_position_estimate will be fused with the other states.
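For reference, feeding the EKF looks roughly like this with pymavlink. This is a sketch under my own assumptions: the VIO output is taken to be in an ENU-style frame, while ArduPilot's EKF works in NED, so a conversion is needed first (the actual frame of your VIO output depends on the camera mounting).

```python
# Sketch: converting a VIO position to NED before sending it to ArduPilot
# as a VISION_POSITION_ESTIMATE message.

def enu_to_ned(x_e, y_e, z_e):
    # north = ENU y, east = ENU x, down = -ENU up
    return (y_e, x_e, -z_e)

# Hardware part, shown for context only (requires a connected FC):
# from pymavlink import mavutil
# master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
# master.wait_heartbeat()
# x, y, z = enu_to_ned(vio_x, vio_y, vio_z)
# master.mav.vision_position_estimate_send(t_usec, x, y, z, roll, pitch, yaw)
```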

Thanks ppoirier for the quick answer. You set the bar for parameterization pretty high. Any recommendation for the depthType setting, considering the cheaper IMU in my case?

May I suggest that you ask directly on the ROVIO GitHub?
It’s been a long time and I don’t have a working setup atm.

Regards


Hello there. Can I use the Intel T265 as the camera for testing various VIO algorithms? And do I still need to calibrate the T265 manually? Thank you

Yes you can; in that case you extract the video and IMU data using the RealSense API.
You can look at @LuckyBird’s blog on AprilTag for an example
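As a rough sketch, the extraction side could look like the following. The CSV layout is the EuRoC-style IMU format that many VIO pipelines accept; the pyrealsense2 stream setup is my assumption and should be checked against the RealSense docs:

```python
# Sketch: turning T265 IMU samples into EuRoC-style CSV rows
# (timestamp [ns], w_x, w_y, w_z [rad/s], a_x, a_y, a_z [m/s^2])
# for offline testing of VIO algorithms.

def euroc_imu_row(t_ns, gyro, accel):
    return '%d,%.9f,%.9f,%.9f,%.9f,%.9f,%.9f' % ((t_ns,) + tuple(gyro) + tuple(accel))

# Capture side, for context only (needs a T265 plugged in):
# import pyrealsense2 as rs
# pipe, cfg = rs.pipeline(), rs.config()
# cfg.enable_stream(rs.stream.gyro)
# cfg.enable_stream(rs.stream.accel)
# pipe.start(cfg)
```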

Why not use the IMU in the FC directly? In fact, I ran into some trouble when using the FC’s IMU: the IMU frequency can’t be raised to 200 Hz.

A directly connected IMU is easier to sync as it is interrupt driven and just needs a simple decoding routine with a fixed delay. For an FC-based IMU you need to decode the MAVLink message and its associated timestamp to sync properly.

Another advantage is the tight, fixed integration of the VIO unit, making it much easier to calibrate.
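To illustrate the sync problem with an FC-based IMU: the HIGHRES_IMU time_usec field is on the FC’s own clock, so it has to be mapped onto the companion computer’s clock before fusing with camera frames. A minimal offset estimator is sketched below (the minimum-latency heuristic is a common trick, not something described in this thread):

```python
# Sketch: estimating the FC-to-companion clock offset from paired timestamps.
# Taking the minimum observed (local - fc) gap rejects transport jitter,
# because the smallest gap is the one closest to the true fixed offset.

def estimate_offset(fc_times, local_times):
    return min(l - f for f, l in zip(fc_times, local_times))

def to_local_time(fc_time, offset):
    return fc_time + offset
```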
