BlueOS Extension for Optical Flow and Precision Landing

Here are a couple of videos taken recently of new BlueOS extensions for optical flow (wiki) and precision landing (wiki).

You’ll see in the optical flow video that it works quite well up to 17.7m (the limit of the Benewake TFS20-L lidar) although there’s a bit of drift which I suspect is caused by the EK3_FLOW_DELAY value of 80ms not matching the actual lag in consuming and processing each image.

A more serious problem is the slightly scary copter movement after yawing about 90deg. This could also be caused by the above-mentioned lag, or perhaps the camera gimbal’s yaw is lagging behind the vehicle. I think camera gimbals often have a soft yaw response to make the video smoother. I’m fairly confident we can resolve either/both of these issues.

Although performance is not yet perfect, I’m quite excited about the future of BlueOS + ArduPilot because it makes it quite easy for the AP dev team (and anyone else really) to roll out new features to companion computers in an organised, user-friendly way. We actually have a bit of a backlog of Python scripts we want to expose as BlueOS extensions. Next will be:

  1. Verifying that optical flow works at higher altitudes using SRTM data instead of my vehicle’s Benewake TFS20-L lidar. It should, because the camera gimbal’s resolution is better than most optical flow sensors’ and it’s roll + pitch stabilised, which removes the noise of the vehicle’s movement.
  2. Visual follow-me.

The code for these two extensions is open source (see the optical flow code here and precision landing here) and should make it relatively easy for any developer to write an extension that uses the camera gimbal’s RTSP stream and mavlink data.

BTW the setup that I used is detailed in this earlier blog post.

Special thanks to @snktshrma who helped me with the optical flow algorithm!

13 Likes

Encapsulating upcoming developments into BlueOS extensions sounds promising, looking forward to it!

2 Likes

That is an excellent use of a commercial camera for something useful for the drone… I have been talking about doing this for years but it was always too costly in video or processing…
Well, I guess times are changing and that is promising for more fun development!

That kind of project should be shown to universities and teachers; it should be easy to reproduce and a source of inspiration in robotics fields! Nice job!

1 Like

I’ve tested optical flow to 100m and it works pretty well. I don’t know what the altitude limit is but it’s higher than that!

1 Like

May I know if the position locking is based on GPS or non-GPS?

May I know where the Benewake TFS20-L lidar is connected? The RPi5 CM or the CubeOrange+?
Am I correct that the optical flow algorithm on BlueOS takes in the camera image, like the PX4Flow does into ArduPilot?

2 Likes

Hi @Jai.GAY,

Thanks for the questions.

In the optical flow test videos the position estimation is using optical flow. So for example in the 100m test the flight was done completely using optical flow only, no GPS.

The Benewake lidar is connected to the autopilot. The setup is pretty much described on the AP wiki but I’ve got some updates to the wiki page coming to show the wiring for some of the newer Benewake models that aren’t mentioned yet.

Yes, the BlueOS extension for optical flow grabs images from the ethernet-connected camera gimbal, runs the OpenCV optical flow algorithm and then sends the results to the autopilot using MAVLink’s OPTICAL_FLOW message.
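As a rough sketch of the core conversion involved (illustrative only, not the extension’s actual code; the function name and the 60-degree field of view are assumptions): a pixel displacement measured between two consecutive frames is turned into the angular flow rates (rad/s) that the OPTICAL_FLOW message’s flow_rate_x / flow_rate_y fields carry.

```python
import math

def pixel_shift_to_flow_rate(dx_px, dy_px, hfov_deg, width_px, dt_s):
    """Convert a pixel displacement between two frames into angular
    flow rates (rad/s), the units MAVLink's OPTICAL_FLOW extension
    fields flow_rate_x / flow_rate_y expect.

    Assumes a simple pinhole model: the focal length in pixels is
    derived from the horizontal field of view.
    """
    f_px = (width_px / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    # Small-angle approximation: angle ~ displacement / focal length
    rate_x = (dx_px / f_px) / dt_s
    rate_y = (dy_px / f_px) / dt_s
    return rate_x, rate_y

# Example: a 3-pixel shift at ~26 Hz on a 1920-wide frame, 60 deg HFOV
rx, ry = pixel_shift_to_flow_rate(3.0, 0.0, 60.0, 1920.0, 1.0 / 26.0)
```

In the real pipeline the pixel displacement would come from an OpenCV routine such as cv2.phaseCorrelate or cv2.calcOpticalFlowPyrLK, and the result would be sent with pymavlink’s optical_flow_send().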

But based on the specification, it is only capable of measuring up to 20m, so for ArduPilot to lock position by optical flow, my understanding is that it needs lidar + optical flow sensor. How does ArduPilot lock the height from 20.01m to 100m?

May I know what video resolution you used to conduct the test? For the ArduCopter optical flow parameters, what is different from a direct connection to an optical flow sensor?

Hi @Jai.GAY,

Sorry, I forgot to explain this important detail. It uses AP’s terrain database when the vehicle flies above the lidar’s range. The code is in this PR which I hope will be merged within the next couple of weeks.
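The terrain fallback can be illustrated with a toy function (a sketch of the idea only, not the PR’s actual implementation; the names are hypothetical and the 17.7m limit is just this thread’s lidar):

```python
def height_above_ground(rangefinder_m, vehicle_alt_amsl_m,
                        terrain_alt_amsl_m, rf_max_m=17.7):
    """Prefer the lidar while it is in range; above that, estimate
    height as the vehicle's altitude minus the terrain database's
    ground altitude at the current location."""
    if rangefinder_m is not None and 0.0 < rangefinder_m < rf_max_m:
        return rangefinder_m
    return vehicle_alt_amsl_m - terrain_alt_amsl_m

# Vehicle at 150m AMSL over ground at 100m AMSL, lidar out of range
agl = height_above_ground(None, 150.0, 100.0)  # → 50.0
```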

I think the camera gimbal’s resolution is 1920 x 1080 but we crop it so we only use the inner 60% of the frame. This chops out most of the distorted part of the image and improves the update rate.
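As a quick illustration of that crop (a sketch; the real extension’s code may differ):

```python
def center_crop(frame_w, frame_h, keep=0.6):
    """Return (x, y, w, h) of a centred crop keeping the inner `keep`
    fraction of each dimension, discarding the more distorted edges."""
    w = int(frame_w * keep)
    h = int(frame_h * keep)
    return (frame_w - w) // 2, (frame_h - h) // 2, w, h

# For a 1920x1080 frame the inner 60% is a 1152x648 region
x, y, w, h = center_crop(1920, 1080)  # → (384, 216, 1152, 648)
# With an OpenCV/numpy image this would be: cropped = img[y:y+h, x:x+w]
```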

1 Like

Amazing work, @rmackay9!

I’m curious—how are you planning to communicate with the companion computer while the drone is in flight? I’m working on a similar project using a Herelink system, and I’m considering using the Herelink’s Ethernet interface to connect directly to BlueOS onboard the drone.

1 Like

Hi @Fonseca,

Thanks!

During this test I used an mRobotics 900MHz SiK radio but there is also an LTE modem attached (see the picture at the bottom of the original post) which allows the mobile phone network to be used. This is attached to the companion computer directly.

Using a Herelink connected to the ethernet switch is also a fine idea and we’ve added an “Ethernet Connectivity” section to the AP Herelink wiki page. The only downside to using the Herelink’s ethernet for video streaming is that this bypasses the Herelink’s built-in feature that reduces the video’s quality to keep the frame rate up and not overwhelm the MAVLink telemetry data. Hopefully we will eventually get this feature added to BlueOS as well.

I guess your suggestion is to connect the Herelink to the companion computer directly instead of to an ethernet switch. This might work but I’m unsure if the autopilot and companion computer could communicate with each other over ethernet (through the Herelink’s ethernet)… I suspect that won’t work but I don’t know for sure.

1 Like

Thanks for the detailed response, @rmackay9!

I’m still in the research and testing phase, trying to figure out the best architecture. Your insights are really helpful.

Just out of curiosity—when you’re connected via the 900MHz SiK radio, are you able to access the BlueOS web UI or any specific ports/services running on the companion computer? Or is that link strictly for MAVLink telemetry?

I’ll sketch out a diagram to share what I’m planning—it’s mostly centered around using Herelink Ethernet to bridge video, telemetry, and access to the BlueOS interface.

Thanks again!

Hi @Fonseca,

When I use the mRo SIK radios, the radio is connected directly to the Cube’s serial port and the radios only support sending a single serial stream so communication with the companion computer is not possible.

The connection method you’re suggesting is OK but personally I prefer to connect everything using ethernet instead of a mixture of ethernet, serial and HDMI. Whatever works is fine of course.

1 Like

Thanks again, @rmackay9!

Sorry if I’m a bit slow to catch on—this is all pretty new to me, so I’m still wrapping my head around the different ways to set things up.

Just to clarify: is the LTE modem the only way you’re able to directly access the network on the aircraft (e.g., to reach BlueOS or the companion computer remotely), or are there any other options you’ve used successfully for that kind of access?

Appreciate your patience and the help—it’s been super valuable!

@Fonseca,

On my vehicle, to access BlueOS (which runs on the RPi5 companion computer) either the LTE network can be used or the RPi’s wifi (which is set up to automatically connect to my house’s wifi network).

1 Like

Hi @rmackay9

The altitude up to which it works also depends a lot on how many good features one is able to capture, right?
Also, do you ensure that the image frames are timestamped correctly? Is this necessary?
Also, did you try way-point navigation with this?

Hi @teju,

Yes, for optical flow to work well it needs to be able to see the ground clearly.

Re timing, I’m pretty sure that the timing is not perfect but it seems to be good enough. The optical flow update rate is about 26Hz and I’ve set the time lag to 100ms, but I’m sure this is not quite right because I do see the vehicle bouncing back a bit when it comes to a stop.

Re using waypoint navigation, yes, actually most of the videos are using Auto mode (precision landing using Auto, High altitude optical flow in Auto).

Great work! Is optical flow calibration on the ArduPilot side required for this setup?
I would like to know what happens internally in this calibration procedure. I believe it’s something related to computing time delays using flow and gyro data, but I am not able to find explanations of this.
Also, would this work with a fixed nadir camera using the same pipeline (sending flow_x, flow_y and flow_rate_x, flow_rate_y instead of compensated flows)?

Hi @Amith4504,

Thanks! The regular in-flight calibration will not work with this setup (which uses a gimbal-mounted camera). The regular in-flight calibration relies on the camera being rotated (with the vehicle) and then uses the known rotation rate (provided by the sensor or the autopilot’s IMU) and compares it to the flow sensor’s output to arrive at the correct scaling parameters.

The calibration for this setup would need to be done more manually and would likely involve flying the vehicle a short distance and comparing the estimated distance flown to the actual distance. FLOW_FXSCALE and FLOW_FYSCALE would then be adjusted accordingly.
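Assuming ArduPilot’s documented parts-per-thousand interpretation of FLOW_FXSCALE / FLOW_FYSCALE (effective rate = raw rate × (1 + scale/1000)), the manual adjustment described above reduces to simple arithmetic. The helper below is a hypothetical illustration, not part of the extension:

```python
def adjusted_flow_scale(current_scale, estimated_dist_m, actual_dist_m):
    """Sketch of the manual calibration described above.

    ArduPilot applies FLOW_FXSCALE / FLOW_FYSCALE as a parts-per-thousand
    correction: effective_rate = raw_rate * (1 + scale / 1000).
    If the flow-based estimate says the vehicle flew `estimated_dist_m`
    but it actually flew `actual_dist_m`, the effective scale factor
    should be multiplied by actual / estimated.
    """
    effective = 1.0 + current_scale / 1000.0
    corrected = effective * (actual_dist_m / estimated_dist_m)
    return round((corrected - 1.0) * 1000.0)

# Starting from 0: flow estimated 9 m but the vehicle actually flew 10 m
new_scale = adjusted_flow_scale(0, 9.0, 10.0)  # → 111
```

So a flight where the flow-based estimate came up about 10% short would suggest raising the scale parameter to roughly 111.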

Re using a fixed camera with BlueOS, yes I think it will work. You will just need to be sure NOT to check the FLOW_OPTIONS bit called “Stabilised”.

Hey, it looks great!

Is it possible to change the EK3_FLOW_DELAY and verify in the log if the optical flow graph is aligned with the IMU? How can we find the correct EK3_FLOW_DELAY value?

When I switch to optical flow mode with my own implementation, I get the toilet bowl effect, and after a few seconds, it switches to ALT_HOLD. I think it’s because of the EKF variance error caused by the EK3_FLOW_DELAY, but I’m not quite sure.

Similar to your case, I can’t be sure. How can we verify this? Thanks!

Hi @Roy_Amoyal,

Toilet bowling is normally caused by the yaw being incorrect by more than 45 degrees, so instead of the delay I suspect the orientation of the flow sensor is incorrect. Perhaps it’s rotated by 90 degrees.

Re measuring the flow sensor delay, yes, comparing it to the IMU (which has essentially no lag) is the best way. The trouble is that IMUs measure acceleration and the flow sensor measures (essentially) velocity so the comparison is difficult. I think the best way would be to:

  1. set the EKF to use the GPS (e.g. EK3_SRC1_POSXY = 3, EK3_SRC1_VELXY = 3)
  2. point the vehicle to the North
  3. move the vehicle smoothly left and right (e.g. West ↔ East)
  4. download the logs and graph the EKF’s velocity (look for a log message starting with NKxx and field VE) and the flow sensor’s raw rate (e.g. FLOW_xx). Both values should appear as a sine wave so it should be possible to find points on the two graphs that correspond to each other (e.g. both peak) and then find the individual NKxx and FLOW messages and compare their TimeUS values

Sorry to be slightly vague on the log field names.
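The peak-matching in step 4 could also be automated with a brute-force cross-correlation over candidate lags. Below is a sketch using a synthetic stand-in for the logged traces (the function and the signals are illustrative assumptions, not ArduPilot code):

```python
import random

def estimate_delay_s(ekf_vel, flow_vel, dt_s, max_delay_s=0.3):
    """Find the shift (seconds) of flow_vel relative to ekf_vel that
    maximises their cross-correlation, i.e. the flow sensor's lag.
    Both inputs must be sampled at the same fixed interval dt_s."""
    max_lag = int(max_delay_s / dt_s)
    n = len(ekf_vel) - max_lag  # same window length for every lag
    best_lag, best_score = 0, float("-inf")
    for lag in range(max_lag + 1):
        score = sum(ekf_vel[i] * flow_vel[i + lag] for i in range(n))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag * dt_s

# Synthetic stand-in for the logged traces: the "flow" velocity is the
# EKF velocity delayed by 5 samples (100 ms at a 50 Hz log rate)
random.seed(1)
dt = 0.02
ekf = [random.uniform(-1.0, 1.0) for _ in range(600)]
flow = [0.0] * 5 + ekf[:-5]
delay = estimate_delay_s(ekf, flow, dt)  # → 0.1
```

With real logs the two traces would first need resampling onto a common fixed-rate time base using their TimeUS values.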

2 Likes