GSoC 2018: Realtime Mapping and Planning for Collision Avoidance

There is some hope here - RTAB-Map 3D SLAM with a RealSense R200 on an Odroid XU4: https://www.youtube.com/watch?v=m4c8NlFKYqA

This is a little outdated: http://www.ros.org/news/2014/12/ardrone-autonomous-indoor-navigation-by-3d-slam.html

But there is an active maintainer here:


But this is so awesome!

It does look promising. I hope we will soon be able to borrow some great ideas from @Voidminded's work for the hardware. I think he is exploring some of the onboard computational resources for this project too.

The video does look great. I think in that case they are using a 2D occupancy map, and they might have projected the depth image to laser-scan data, which is considerably less expensive to fuse into the occupancy map. I also suspect the LSD-SLAM demo is running on the ground station; still, if we manage to do it onboard like that, then ours would also look quite awesome.
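For reference, the depth-image-to-laser-scan projection mentioned above is cheap because it collapses one image row into a planar scan via the pinhole model (in ROS the `depthimage_to_laserscan` package does this in optimized C++). Below is a minimal pure-Python sketch of the idea; the intrinsics `fx` and `cx` are hypothetical values standing in for a real camera calibration.

```python
import math

def depth_row_to_scan(depth_row, fx, cx):
    """Project one row of a depth image (metres) into (range, bearing)
    pairs using a pinhole camera model.

    fx: focal length in pixels, cx: principal point x-coordinate
    (both would come from the camera's intrinsic calibration).
    """
    scan = []
    for u, z in enumerate(depth_row):
        if z <= 0.0:               # skip invalid / missing depth
            continue
        x = (u - cx) * z / fx      # lateral offset in metres
        scan.append((math.hypot(x, z), math.atan2(x, z)))
    return scan

# Example: a flat wall 2 m in front of a 640-px-wide camera
ranges = depth_row_to_scan([2.0] * 640, fx=570.0, cx=319.5)
```

Fusing 640 such rays into a 2D occupancy grid is far cheaper than ray-casting a full dense point cloud into a 3D octree, which is presumably why their demo runs so smoothly.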

I ran some tests on the Odroid today, and octomap cloud insertion took ~0.4 s, so I guess we can run it at 2 Hz for now. I'll also look into how to parallelize point-cloud insertion in octomap, which should improve the performance further.
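Besides parallelizing, one common way to cut octomap insertion time is to voxel-downsample the cloud first, since insertion cost scales with the number of rays cast. This is not what the original post proposes, just a complementary option; here is a minimal pure-Python sketch of the downsampling step (octomap itself is a C++ library):

```python
def voxel_downsample(points, voxel=0.05):
    """Reduce a point cloud by keeping one representative point per
    voxel (here: the first point seen in each cell). Inserting the
    smaller cloud into an octree is proportionally cheaper, at the
    cost of some spatial detail below the voxel size."""
    seen = {}
    for (x, y, z) in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        if key not in seen:
            seen[key] = (x, y, z)
    return list(seen.values())

# 1000 points along a 10 m line at 1 cm spacing -> roughly 200 survive
cloud = [(0.01 * i, 0.0, 1.0) for i in range(1000)]
sparse = voxel_downsample(cloud, voxel=0.05)
```

Downsampling to the octree's own leaf resolution (often 5 cm) loses essentially nothing, since finer points would be merged into the same leaf anyway.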


@ppoirier I think I made a blunder. While benchmarking, I wasn't focusing on the time taken for disparity computation, since on the desktop octomap was the bottleneck. When I was working on the Odroid, I ignored the lag in disparity computation, assuming it was caused by X forwarding. But today, when benchmarking all the functions after adding compiler optimizations to my code, I noticed that disparity computation itself takes ~2.5 s on the Odroid, while octomap generation now takes only ~0.2 s.
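The mis-attributed bottleneck above is exactly what per-function instrumentation catches. A simple sketch of that kind of benchmark, with accumulated wall-clock time per function (the function name below is illustrative, not from the actual codebase):

```python
import time
from functools import wraps

timings = {}  # function name -> accumulated seconds

def timed(fn):
    """Accumulate wall-clock time per function, so the true
    bottleneck (disparity vs. octomap insertion) is visible
    instead of assumed."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            name = fn.__name__
            timings[name] = timings.get(name, 0.0) + time.perf_counter() - t0
    return wrapper

@timed
def compute_disparity():
    time.sleep(0.01)   # stand-in for the expensive stereo matching

compute_disparity()
```

Printing `timings` after a run gives a per-stage breakdown, which avoids attributing lag to the wrong stage (e.g. to X forwarding).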

If someone has a clue on improving disparity computation on ARM-based hardware, please suggest it to me. I am thinking there might be some way for OpenCV to utilize the onboard Mali GPU or something. Are there any other SDKs for doing the same on similar mobile hardware?
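For anyone unfamiliar with why disparity is so expensive: stereo block matching searches, for every pixel, over many candidate horizontal shifts, comparing image windows each time (OpenCV's `StereoBM` does this over full images with heavy SIMD optimization). A minimal pure-Python sketch of SAD block matching along a single scanline, just to show the cost structure, with made-up window and search parameters:

```python
def disparity_sad(left, right, window=2, max_disp=8):
    """Naive SAD block matching along one scanline: for each pixel in
    the left image, find the horizontal shift into the right image
    that minimises the sum of absolute differences over a small
    window. Cost per pixel is O(max_disp * window), which is why
    full-image disparity is so slow on a small ARM CPU."""
    w = len(left)
    disp = [0] * w
    for x in range(window, w - window):
        best_cost, best_d = float("inf"), 0
        for d in range(min(max_disp, x - window) + 1):
            cost = sum(abs(left[x + k] - right[x - d + k])
                       for k in range(-window, window + 1))
            if cost < best_cost:
                best_cost, best_d = cost, d
        disp[x] = best_d
    return disp

# Synthetic pair: the right view sees the scene shifted 3 px,
# so the true disparity at the bright block is 3.
left = [0] * 10 + [255] * 5 + [0] * 10
right = left[3:] + [0] * 3
disp = disparity_sad(left, right)
```

The per-pixel inner loop is what GPU or NEON backends parallelize; whether OpenCV's OpenCL path actually dispatches to the Mali GPU on the Odroid is exactly the open question here.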

I think there are not many options for the Odroid other than getting the disparity/point cloud from a dedicated ASIC, like in the Intel RealSense. Hao-Chih Lin's videos, like this one,

and the related work https://github.com/jim1993/realsense_camera_odroid should be considered a good start.

Personally, I would not try to implement the Mali GPU stuff unless you want to get into deep trouble. If in doubt, you can ask @fnoop for comments…


Sir, I have got a Float solution in RTK here, but we are not able to get a Fixed solution; it is taking a long time. Please give me any idea for it.