ArduPilot + OpenKai + ZED for non-GPS navigation

These are the first couple of test flights of ArduPilot (Copter-3.6-dev) paired with LAB enRoute’s open source (AGPLv3) vision system called “OpenKai”, which uses the ZED stereo camera to allow the vehicle to control its position without requiring a GPS.

More testing is required, of course, but it seems to hold position quite well, especially when there are objects within 5m ~ 10m in front of the vehicle. Even when objects are further away it works, although we see movement of a few meters, especially when the vehicle is yawing. We may be able to improve this with better camera calibration and other software improvements.

You’ll see from the picture below that I’m using my regular enRoute EX700 development copter, which has a StereoLabs ZED camera mounted facing forward. On the back of the vehicle is an NVIDIA TX1 mounted on an Auvidea J120 carrier board and connected to a Pixhawk1 as shown on our developer wiki. There is a Here+ RTK GPS on the vehicle but it’s disabled (see the blue LEDs); it’s only there because I wanted to continue using the external compass. There’s also an (optional) downward-facing LightWare lidar.

This is a bit of a breakthrough because we are taking the 3D visual odometry information from the ZED camera at 20Hz and pushing it into ArduPilot’s EKF3, which consumes it much like it consumes optical flow data. The advantage over optical flow is that the position estimate is 3-dimensional instead of just 2-dimensional, and because the camera is stereo we do not need a range finder. It’s also possible to simultaneously use the stereo camera’s depth information for object avoidance, although during these early tests we ran into performance problems on the TX1 when trying to do both visual odometry and object avoidance at the same time, so we may need a TX2 for that :-).
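
To make that data path concrete, below is a minimal sketch (not OpenKai’s actual code) of a companion computer feeding the camera’s frame-to-frame motion to the flight controller using the VISION_POSITION_DELTA MAVLink message via pymavlink. The serial device and the get_zed_delta() helper are hypothetical placeholders for the stereo tracking code.

```python
# Minimal sketch: send stereo-camera visual odometry deltas to ArduPilot's EKF3.
# NOT OpenKai's actual code; get_zed_delta() and the serial device are placeholders.
import time
from pymavlink import mavutil

# connect to the flight controller (e.g. the TX1's serial link to the Pixhawk)
master = mavutil.mavlink_connection('/dev/ttyTHS1', baud=921600)
master.wait_heartbeat()

last_us = int(time.time() * 1e6)
while True:
    # hypothetical helper: rotation (rad) and translation (m) since the previous
    # frame, expressed in body frame (x=forward, y=right, z=down)
    angle_delta, position_delta, confidence = get_zed_delta()

    now_us = int(time.time() * 1e6)
    master.mav.vision_position_delta_send(
        now_us,            # time_usec: timestamp of this sample
        now_us - last_us,  # time_delta_usec: time since the previous sample
        angle_delta,       # [roll, pitch, yaw] change (rad)
        position_delta,    # [x, y, z] change (m)
        confidence)        # tracking confidence, 0..100
    last_us = now_us
    time.sleep(0.05)       # roughly the 20Hz rate mentioned above
```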

Because the camera’s data is integrated into the EKF, all the existing flight modes work just as they do now. Also, this can be used together with a GPS, which we hope will improve position reliability and allow flying seamlessly from a GPS environment to a non-GPS environment. Imagine autonomously flying a vehicle up over some hills to a railway tunnel, then flying right through it and out the other side.
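
For reference, the sketch below shows the kind of parameter changes that route the camera’s data into EKF3, sent with pymavlink. The parameter names reflect my reading of the Copter-3.6-era setup (the setup wiki page mentioned later in this thread is the authoritative source), and the connection string is a placeholder.

```python
# Hedged sketch: enable EKF3 and MAVLink visual odometry via parameter writes.
# Parameter names are an assumption based on Copter-3.6; confirm against the wiki.
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')  # placeholder link
master.wait_heartbeat()

params = {
    'AHRS_EKF_TYPE': 3,  # use EKF3 as the active estimator
    'EK2_ENABLE':    0,  # disable EKF2
    'EK3_ENABLE':    1,  # enable EKF3
    'VISO_TYPE':     1,  # accept visual odometry over MAVLink
    'GPS_TYPE':      0,  # optional: no GPS for purely non-GPS flight
}

for name, value in params.items():
    master.mav.param_set_send(
        master.target_system, master.target_component,
        name.encode('utf-8'), float(value),
        mavutil.mavlink.MAV_PARAM_TYPE_REAL32)
```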

In this case the camera was pointed forward but once we push this to “master” it will support at least 6 orientations (up, down, left, right, forward, back).

Note that this approach is a little different from SLAM because we don’t create a 3D map to determine the vehicle’s absolute position in the environment - instead we integrate position changes. This is slightly simpler than a full SLAM solution, but in any case it allows flying indoors.
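
To make the distinction concrete, here is a toy illustration (not ArduPilot or OpenKai code) of what “integrating position changes” means: each frame’s body-frame delta is rotated by the current heading estimate and accumulated, so small per-frame errors drift over time instead of being corrected against a stored map.

```python
# Toy dead-reckoning example: accumulate per-frame position/yaw deltas.
import numpy as np

position = np.zeros(3)  # running position estimate (m), starting at the origin
yaw = 0.0               # running heading estimate (rad)

def integrate(delta_body_xyz, delta_yaw):
    """Accumulate one frame's body-frame position delta and yaw delta."""
    global position, yaw
    yaw += delta_yaw
    # rotate the body-frame delta into the local frame using the current yaw
    c, s = np.cos(yaw), np.sin(yaw)
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    position += rot @ np.asarray(delta_body_xyz)
    return position

# example: ten frames of 0.1m forward motion while yawing slightly
for _ in range(10):
    integrate([0.1, 0.0, 0.0], 0.01)
print(position)  # any per-frame error accumulates, unlike a SLAM absolute fix
```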

I plan to make a TX1 image available so people can get going quickly. Until then, you can do what I did: use APSync as the base image and then install and build OpenKAI as described here.

The code is currently here in my ardupilot and mavlink repos but will go into master once it has been peer reviewed. It should be officially released as part of Copter-3.6.

Thanks very much to Kai Yan and Paul Riseborough (EKF) for putting this together (I helped integrate).

Given that you are not using SLAM, what is the reason for using the expensive ZED stereo camera?

Hi Hattori-san,
The 3D camera basically works much like optical flow but in three dimensions. It holds position without a GPS and, interestingly, the camera does not need to be facing downwards like an optical flow sensor would. The stereo camera can also be used without a range finder; in fact, it provides depth information which can be used as if it were a range finder to stop before reaching an object.
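
As an illustration of that last point, here is a minimal sketch (an assumption on my part, not OpenKai’s actual code) of how a forward depth reading could be reported to ArduPilot as a range finder using the standard DISTANCE_SENSOR MAVLink message; read_forward_depth_cm() and the serial device are hypothetical placeholders.

```python
# Minimal sketch: report the stereo camera's forward depth as a range finder.
# NOT OpenKai's actual code; read_forward_depth_cm() and the device are placeholders.
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('/dev/ttyTHS1', baud=921600)
master.wait_heartbeat()

boot_time = time.time()
while True:
    distance_cm = read_forward_depth_cm()  # hypothetical: nearest obstacle ahead (cm)
    master.mav.distance_sensor_send(
        int((time.time() - boot_time) * 1000),      # time_boot_ms
        50,                                         # min_distance (cm)
        2000,                                       # max_distance (cm)
        int(distance_cm),                           # current_distance (cm)
        mavutil.mavlink.MAV_DISTANCE_SENSOR_LASER,  # sensor type
        1,                                          # sensor id
        mavutil.mavlink.MAV_SENSOR_ROTATION_NONE,   # forward-facing
        0)                                          # covariance (unknown)
    time.sleep(0.1)
```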

Thanks. It would be nice for this purpose if the affordable Intel RealSense R200 could replace the ZED camera. Actually, I have already got a TX2 from Arrow instead of a BeagleBone Blue, which means I’m ready to run the Kai platform on board :-)

Hattori-san,

Great. I’ll be making TX1 images in the near-ish future to help people replicate this.
Intel has sent a couple of us Intel Aeros to play with, so it’s possible we will make it work there. No promises, but I’m sure more solutions will be coming with lower price tags. Hardware always gets cheaper, right?

Please make the image compatible with the TX2. The RealSense needs some tricks to run on the NVIDIA platform; it should work comfortably on Intel platforms such as the Intel Aero and UP board. I could easily run it on an Intel NUC desktop. By the way, the Intel Aero is not available in Japan; again, we are not on the shipping list. I suspect it is some kind of TELEC issue in Japan.
Anyway, the UP board is very compact and offers a variety of CPU and storage options. I think APSync is the way to go.

I’ve created a wiki page detailing the setup. I haven’t produced a TX2 image but the TX1 image is available now.

Hi, may I know which JetPack version you flashed on the TX1?

Thank you.

Dennison,

I used JetPack 2.3.1. To use the pre-built image, the TX1 needs to have 2.3.1 flashed on it. By the way, in the image the forward-facing distance measurement is also used simultaneously for object avoidance.

-Randy

Is it possible to use the “fly to here” command with this setup (without GPS)? Also, does it support RTL?
And where can I get a TX2 image?

There is what looks like a beta APSync image for the TX2 at http://firmware.ardupilot.org, but I don’t know if that includes this ZED setup.

Is it possible to run the TX1 image on a TX2 machine (just like an OS can run on multiple hardware types)?

Hello @rmackay9. I am very curious about this ZED-camera visual odometry solution. I have a few questions, if you do not mind. Thank you very much for any suggestions, and thank you for all the hard work done on ArduPilot!

Currently I am using an Intel Aero RTF with ArduPilot 3.5.5. I am flying indoors using a PX4Flow with no GPS connected and the compass disabled (after some arming-related changes in the ArduPilot code). Ideally I would ultimately like to keep the compass disabled (which is currently possible only for PX4Flow flight, or AltHold flight with EKF3). I am planning to try:

  1. Visual odometry: can a solution like the ZED camera also give me heading info? (The PX4Flow is not giving the EKF the UAV’s heading, correct?)

  2. Two highly precise GPS antennas… or an alternative “GPS simulation” (Marvelmind ultrasonic beacons, but I think they are not precise enough to provide an accurate heading)

  3. Where should I start in the code if I would like to integrate the Intel R200 that is on the Intel Aero RTF?

Hello, I have an Intel RealSense D435 camera. Can I use OpenKai? If yes, what do I need to change in the build code? http://docs.openkai.xyz/tegrabuild.html

Tarek,

The Intel RealSense isn’t supported on the standard APSync image, but it may be possible to use it if you build OpenKai yourself. I’m not sure exactly how to publicly ask Kai questions; perhaps add a question to the OpenKai issues list?