Servers by jDrones

Navigation with Optical Flow Only

Hi Everyone,
I have been trying to get navigation working in a GPS denied setting.
The optical flow sensor PX4flow works fine with a range finder (TFmini in my case), Loiter is quite stable.

But when I try to fly missions, I feel like I'm going into uncharted territory. The wiki does not seem to mention anything, and while I have found a few threads here that talk about it, nothing has worked for me so far.

So here is what I tried. Using 3.5.5 and 3.6-dev, I disabled EK2 and enabled EK3:

EK2_ENABLE 0
EK3_ENABLE 1
AHRS_EKF_TYPE 3

I also turned off the GPS and set the range finder to be the main altitude source:

GPS_TYPE 0
EK3_ALT_SOURCE 1

The parameter I am not sure of is EK3_GPS_TYPE. Apparently it needs to be set to 0 as described here, but I'm not sure if this is still the recommended setting.
https://discuss.ardupilot.org/t/the-copter-does-not-takeoff-in-guided-mode-and-ekf-gps-type-3-optical-flow/16751
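For what it's worth, the parameter changes above can also be applied from a script instead of a GCS. This is only a sketch: it assumes DroneKit-Python is installed, and the connection string and baud rate are placeholders for whatever your setup actually uses.

```python
# Sketch: apply the flow-only EKF parameters from the posts above via
# DroneKit-Python. Connection string and baud are assumptions, not from
# the thread.

# Parameter values as discussed in the thread:
FLOW_ONLY_PARAMS = {
    "EK2_ENABLE": 0,
    "EK3_ENABLE": 1,
    "AHRS_EKF_TYPE": 3,
    "GPS_TYPE": 0,
    "EK3_ALT_SOURCE": 1,  # rangefinder as primary altitude source
}

def apply_params(vehicle, params):
    """Write each parameter to the vehicle's parameter table."""
    for name, value in params.items():
        vehicle.parameters[name] = value

if __name__ == "__main__":
    # Import here so the parameter table above is usable without dronekit.
    from dronekit import connect
    vehicle = connect("/dev/ttyAMA0", wait_ready=True, baud=921600)  # placeholder
    apply_params(vehicle, FLOW_ONLY_PARAMS)
    vehicle.close()
```

A reboot is generally needed before EKF-type changes take effect, so treat this as a one-time setup step rather than something to run mid-flight.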

Either way, I can't get the copter to fly a mission. It will arm, but takeoff fails. If I fly Loiter and switch to Guided, it will accept the mode but then switch to Smart RTL.

Another thing I find interesting is that the SITL behavior seems to be different from what I see on the drone… in the simulation I can't even get the drone to arm with just the flow sensor, but that might be something different entirely.

Any suggestions are very welcome.

So I haven't studied how optical flow works, so I'm going out on a limb here… but when you're using optical flow with no GPS, the flow sensor can't give you a lat/long position. It can only detect movement and orientation changes. I suppose if you had GPS, then lost it, the data from the flow sensor could be used by the Kalman filter to "dead reckon" your position while awaiting GPS to return, but your error will increase with each calculation (I'm not familiar with the accuracy of flow sensors). This is essentially what a GPS nav system does when you go under a bridge or through a tunnel: it guesses your location for a while based on the built-in accelerometers.

Again, I haven't read up on this yet, but my guess is that unless you started with a GPS lock and then temporarily downgraded to flow only, you won't have lat/long, and therefore there is no way for a mission to work. Missions are based on lat/long and altitude, not on the number of centimeters moved in the x/y/z direction (right?)

When skyviper finally gets around to shipping out their new Journey, I’ll be sure to do some extensive testing. :wink: @Matt_Morton

Thanks for the input.
For sure there will be no absolute location on the globe, but it is possible to navigate relative to my home position. So I can in theory take off, move 5 meters forward, turn 90 degrees, move 5 meters again, etc. This is what I am trying to achieve, and it should be possible with a script or in a ground control station as well (in the thread I linked it is described for Mission Planner). Yet my attempts with QGroundControl have been unsuccessful so far.

Oh OK, I see that it makes movements based on direction and distance. Good stuff, carry on.

I need to read up on optical flow capabilities more. You really ought to be able to track location (assuming you started from a known good location) with optical flow. This is basically the way old cruise missiles worked (built before GPS was a thing): they used cameras to "observe the scene" in order to know where they were on their path. They were working with 1960s hardware, so if they could do it, so can we :wink:


@moobsen In Mission Planner, you can set the EKF home position with a right click on the map. You will see a quadcopter appear, and if you zoom in all the way you can ''walk'' the vehicle and watch it move around on the map. You will notice a lot of drift when the vehicle is stationary; that is the IMU error accumulating within the EKF. Once you move, the velocity estimator (PX4Flow) can give a pretty good relative position if the ground texture is rich and bright.
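If you'd rather script it than right-click in Mission Planner, the EKF origin can also be set over MAVLink with a SET_GPS_GLOBAL_ORIGIN message. This is a hedged sketch: it assumes pymavlink is installed, the UDP endpoint is a placeholder, and the coordinates are dummies you would replace with your actual location.

```python
# Sketch: set the EKF origin from a script instead of the Mission Planner
# right-click. Connection string and coordinates are placeholders.

def to_deg_e7(deg):
    """MAVLink lat/lon fields are integers in 1e-7 degree units."""
    return int(round(deg * 1e7))

def to_mm(meters):
    """The altitude field of SET_GPS_GLOBAL_ORIGIN is in millimeters."""
    return int(round(meters * 1000))

if __name__ == "__main__":
    from pymavlink import mavutil
    master = mavutil.mavlink_connection("udp:127.0.0.1:14550")  # placeholder
    master.wait_heartbeat()
    master.mav.set_gps_global_origin_send(
        master.target_system,
        to_deg_e7(45.5017),   # placeholder latitude
        to_deg_e7(-73.5673),  # placeholder longitude
        to_mm(30.0),          # placeholder altitude
    )
```

Once the origin is set, the EKF can report a local position even with GPS_TYPE 0, which is what makes the vehicle icon "walkable" on the map.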

I am flying in Guided mode only, because there is too much drift for accurate positioning. The next logical step for GPS-denied flight is implementing SLAM. I am working with a monocular prototype running ROVIO on an ODROID XU4; it's not very expensive… but not very stable at the moment :wink:

Took me a while to set up Mission Planner under Linux, but yes, I see the drone moving when I walk around with it. Interestingly, it is at the very left of the map and only moves in the North/South direction there, but I guess that has something to do with my unfamiliarity with MP.

So if I understand you correctly, you are saying there is too much drift to really navigate with just the PX4Flow? All I really need to do is fly a specific pattern, where I need to be sure the distances over ground are correct (5 meters this way, 3 that way…). My plan was to implement something with DroneKit on a Raspberry Pi. My next step would be trying to navigate the drone in Loiter mode via scripts, while measuring the traveled distances via the PX4Flow. In theory that should work, right?
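Measuring traveled distance from the flow sensor boils down to integrating its velocity output over time. Here is a minimal dead-reckoning sketch under those assumptions; the field names flow_comp_m_x / flow_comp_m_y are from the MAVLink OPTICAL_FLOW message (flow-compensated velocity in m/s), while the class name and the listener wiring are mine.

```python
# Sketch: dead-reckon x/y travel by integrating flow velocity samples.
# Drift accumulates with every step, as noted elsewhere in this thread.

class FlowOdometer:
    """Integrate velocity samples (m/s) over time into an x/y position (m)."""

    def __init__(self):
        self.x = 0.0
        self.y = 0.0
        self._last_t = None

    def update(self, t, vx, vy):
        """Feed one timestamped velocity sample (t in seconds)."""
        if self._last_t is not None:
            dt = t - self._last_t
            self.x += vx * dt
            self.y += vy * dt
        self._last_t = t

    def distance(self):
        """Straight-line distance from the starting point."""
        return (self.x ** 2 + self.y ** 2) ** 0.5

# Hypothetical DroneKit wiring (requires a connected vehicle):
# odo = FlowOdometer()
# @vehicle.on_message('OPTICAL_FLOW')
# def handler(self, name, msg):
#     odo.update(msg.time_usec / 1e6, msg.flow_comp_m_x, msg.flow_comp_m_y)
```

Note that this is exactly the open-loop integration whose error grows without bound, so it can check "did I travel roughly 5 m" but not hold a position on its own.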

Your SLAM project sounds interesting, as do all of your projects.

Cool, then you can try to navigate in Guided using DroneKit-Python on the RPi.
Be ready to switch to Loiter in case the quad gets lost, but practice makes perfect, and you can probably make a mission similar to the Balloon Finder QuadCopter Object Tracking on a budget using just the PX4Flow.

Keep us updated on these experiments. I have flown optical-flow mode outdoors a couple of times in Loiter and Guided, and it could ''go somewhere and come back home'' with an acceptable error margin. Beware of the TFMini range: it's just 6 meters outdoors, and if you go beyond that the quad might jump in the air. In that case you need to switch back to Stabilize !!with throttle!! so it doesn't fall like a rock. Once again, practice the manual takeover; I generally practice it before I switch to Guided :wink:
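The 6 m rangefinder ceiling above is easy to encode as a guard in a DroneKit script, so a bad altitude request never reaches the vehicle. The 6 m figure is from the post; the 0.8 safety factor and the function name are my own assumptions.

```python
# Sketch: clamp commanded altitudes safely inside the rangefinder's
# usable range, per the TFMini warning above.

TFMINI_MAX_RANGE_M = 6.0  # outdoor limit mentioned in the post
SAFETY_FACTOR = 0.8       # assumption: stay well below the ceiling

def safe_target_alt(requested_alt_m,
                    max_range_m=TFMINI_MAX_RANGE_M,
                    safety=SAFETY_FACTOR):
    """Return the requested altitude, clamped below the rangefinder ceiling."""
    return min(requested_alt_m, max_range_m * safety)
```

With EK3_ALT_SOURCE set to the rangefinder, flying past its maximum range means the EKF loses its height reference, so clamping the target is cheaper than practicing the recovery.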

As for the SLAM, I am still working to make it stable enough so I can publish a new blog post, and hopefully have you experiment with it as well.

Hi ppoirier,
thank you for sharing your experience.
Coding with the DroneKit-Python API, you said that you fly the drone in Guided mode. Could you please tell me what navigation coordinate frame you use for that?
As the global and global-relative frames use lat and lon, which are absolute, should I use the local NED frame to work with the PX4Flow?

Hello
As explained above, once you have walked the flight area, you can issue commands from the Mission Planner map in Guided mode.


Thanks for the quick response.
Actually, I guess I didn't ask the question clearly.
As you know, in order to navigate from one place to another in DroneKit, there must be a destination coordinate, which can of course be expressed in one of three frames: 1. global, 2. global-relative, and 3. relative.
My question is: when flying with optical flow, which enables more precise relative-to-home positioning, which of the three coordinate systems should be chosen? (Should I only choose the relative one?)

Optical flow is a velocity estimator, so it cannot be used as a positioning system. For that you need something like a T265; just read the many blogs from @LuckyBird.
