I’m trying to use a Jetson Xavier to do optical flow using a USB camera. I have it computing the flow field at ~28fps, taking a 100ms rolling average and publishing MAVLink messages to the Pixhawk.
The Jetson reports its time to compute the optical flow as 22ms. The logs show OF.flowX lags behind OF.bodyX by about 250ms.
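For what it's worth, part of the latency is structural: a causal moving average delays its output by roughly half its window, so the 100ms rolling average contributes about 50ms by itself. A minimal sketch of the kind of time-windowed averager I mean (not my actual pipeline, just to illustrate):

```python
from collections import deque

class RollingAverager:
    """Average samples over a fixed time window.

    A causal moving average like this delays the signal by roughly
    half the window, so a 100 ms window adds ~50 ms of latency on
    its own, before compute and transport delays.
    """
    def __init__(self, window_s=0.100):
        self.window_s = window_s
        self.samples = deque()  # (timestamp_s, value)

    def add(self, t, value):
        self.samples.append((t, value))
        # drop samples older than the window
        while self.samples and t - self.samples[0][0] > self.window_s:
            self.samples.popleft()

    def mean(self):
        if not self.samples:
            return 0.0
        return sum(v for _, v in self.samples) / len(self.samples)
```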
I tried to set EK3_FLOW_DELAY to 250, but QGC said the max value was 127, so that’s what I set it to.
The drone drifts away as soon as I switch the EKF source set to 2, but it doesn't raise any EKF errors or anything (fuse velocities is disabled).
Some questions:
Is 25fps enough for a 1200mm, 15kg octocopter?
Is 250ms of latency catastrophic? Can EKF3 deal with the latency?
The flow/body curves don’t move when I change FLOW_DELAY. I take it that these plots are the raw data, not adjusted for the delay parameter. Is there some way I can know if a value of 127 is helping?
Is this latency real?
Any other tips to make it work?
The picture on the ArduCopter wiki has OF.flowX and OF.bodyX lining up perfectly in time. Does PX4Flow really capture, compute, average and transmit in such an infinitesimally small time that it appears to have zero latency, or is this somehow filtered out?
So you are using PX4Flow (CMOS) and not an optical flow sensor (e.g. Hereflow)?
If so, I didn't achieve good results with PX4Flow. The calibration is tough and good flight performance isn't guaranteed.
Oh, sorry for the confusion, I'm not using PX4Flow nor Hereflow. I'm using a USB camera and doing the optical flow calculation on the Jetson. My reference to PX4Flow applies equally to Hereflow. Does it really calculate the flow field and communicate it over MAVLink with near-zero latency, as this graph (from here, by @rmackay9) suggests?
I think the Hereflow (and other sensors) have very low lag (less than 100ms).
A few answers:
- 25fps is definitely fast enough
- 250ms is the maximum the EKF can handle. The lower you can get it the better.
- There's a request somewhere to change FLOW_DELAY so that it can accept 250ms; this is an easy change actually.
- Yes, the plots are the raw data and are not adjusted for lag. Normally it's these logs that are used to roughly estimate what the lag is.
- I'm surprised you're seeing 250ms of lag, but it's probably all on the Jetson. Maybe it's not flushing the serial buffers? I'm not sure, just a guess of course.
BTW, we’ve written a BlueOS extension to provide optical flow and it’s blogged about here.
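To put a number on the lag rather than eyeballing the plots, one option is to cross-correlate the logged OF.flowX and OF.bodyX series and take the shift with the peak correlation. A rough sketch, assuming both series have already been resampled to a common rate `rate_hz`:

```python
import numpy as np

def estimate_lag_s(flow, body, rate_hz):
    """Estimate how far `flow` lags `body`, both sampled at rate_hz.

    Returns the shift (in seconds) that maximizes the cross-correlation;
    positive means `flow` lags `body`.
    """
    flow = np.asarray(flow, dtype=float)
    body = np.asarray(body, dtype=float)
    # remove the mean so offsets don't dominate the correlation
    flow = flow - flow.mean()
    body = body - body.mean()
    corr = np.correlate(flow, body, mode="full")
    # index len(body) - 1 corresponds to zero shift
    shift_samples = int(np.argmax(corr)) - (len(body) - 1)
    return shift_samples / rate_hz
```

At a ~25Hz log rate, a 250ms lag would show up as a shift of about 6 samples.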
I found some calibration params for my copter with PX4Flow; the max height was 60m and the drift radius was 10-15m. I also found that some PX4Flow units don't send gyro data over MAVLink if you are using UART. I2C works correctly.
What are you using for this? OpenCV? When I tried this with OpenCV (python3, pymavlink) the latency was visible in the autopilot graphs, but it worked (latency 0.02-0.04s).
Also, PX4Flow had lower latency.
Thanks for the helpful answers Randy!
I wish I had seen the BlueOS blog 3 weeks ago -_-
I suspect mavlink-router might be introducing the latency. My Jetson Nano with a direct serial connection (USB) squeaks in just under 125ms of latency, taking 60ms to compute the optical flow on the CPU.
The Xavier, using the GPU, takes 22ms but is connected to mavlink-router over UDP before going to the Pixhawk.
Are there any known ways to reduce mavlink-router’s latency?
@VasilkovskiiVS yes I’m using OpenCV with Cuda GPU support in a docker container
Ok yeah it was mavlink-router introducing a lot of latency.
Connecting directly to the serial port, I got the latency down under 120ms.
Any idea why it is oscillating with increasing amplitude like this?
This plot shows flowX being a little smaller in amplitude than bodyX, but I adjusted it in a later log to match much more closely and it is still unstable.
Also @rmackay9 the parameters FLOW_FXSCALER and FLOW_FYSCALER don't seem to have any effect - I'm guessing they only apply to I2C connections, and not MAVLink?
I think FLOW_FXSCALER and FLOW_FYSCALER should have an impact even on sensor values received through MAVLink (the AP-4.6 code is here)
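My understanding from reading the driver code (treat this as an assumption, not a spec) is that the scalers are parts-per-thousand corrections applied to the flow rates, something like:

```python
def apply_flow_scaler(flow_rate_rad_s, scaler_ppt):
    # FLOW_FXSCALER / FLOW_FYSCALER appear to be parts-per-thousand
    # corrections (assumption from reading the AP_OpticalFlow driver):
    # a value of 100 scales the axis by 1.1, -100 by 0.9.
    return flow_rate_rad_s * (1.0 + 0.001 * scaler_ppt)
```

So small values (say under 10) would be hard to see in the plots; that could be worth checking before concluding they do nothing over MAVLink.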
By the way, the BlueOS extension requires that AP-4.7 be used because it supports flow sensors in a camera gimbal and also the more accurate flow_rate_x and flow_rate_y fields from the OPTICAL_FLOW message (the PR that added these features is here)