Optical flow in Pixhawk with ArduCopter

Hello All,

I’m trying to figure out the optical flow feature.
My goal is to be able to track a certain object using computer vision, something like this:

I have searched the forum/web and found a few sites that explain how to connect the PX4FLOW and a LIDAR to the Pixhawk in order to control the UAV (for example):

I’m trying to understand how the PX4FLOW communicates with ArduPilot and tells it how to operate. (I’m not asking about I2C/SPI/serial; I’m asking what data/frames the Pixhawk controller receives and what it does with them.)

I hope someone can help me with some info about the process.

Thanks in advance,

The PX4FLOW is used as a velocity sensor: it looks at features on the ground and sends vectors describing the X-Y movement. This information is then combined with the IMU and other sensors (if available) and fused within the EKF to control the drone’s attitude.
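To make the “velocity sensor” idea concrete, here is an illustrative sketch (not ArduPilot’s actual code) of how the fields of the OPTICAL_FLOW_RAD (#106) MAVLink message can be turned into a body-frame velocity estimate. The sensor reports integrated flow angles in radians plus the gyro rotation over the same window, so the EKF can remove apparent flow caused by the vehicle tilting and scale the rest by the rangefinder distance:

```python
# Illustrative sketch, not ArduPilot's exact implementation.
# Field names follow the MAVLink OPTICAL_FLOW_RAD (#106) message.

def flow_to_velocity(integrated_x, integrated_xgyro,
                     integration_time_s, distance_m):
    """Convert integrated optical flow (radians) to linear velocity (m/s).

    integrated_x       -- optical flow integrated over the window (rad)
    integrated_xgyro   -- gyro rotation over the same window (rad);
                          subtracted to remove flow caused by tilting
    integration_time_s -- length of the integration window (s)
    distance_m         -- height above ground from the rangefinder (m)
    """
    # Rotation-compensated angular flow rate (rad/s)
    flow_rate = (integrated_x - integrated_xgyro) / integration_time_s
    # Small-angle approximation: linear speed = angular rate * range
    return flow_rate * distance_m

# 0.02 rad of flow over 0.1 s at 2 m altitude -> roughly 0.4 m/s
print(flow_to_velocity(0.02, 0.0, 0.1, 2.0))
```

This is why a rangefinder (LIDAR/sonar) is needed alongside the flow sensor: without the distance to the ground, the angular flow rate cannot be scaled into a metric velocity.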

So that is the low-level attitude control loop. If you are looking for a high-level Guided-mode approach, you can read this link: QuadCopter Object Tracking on a budget

If you need more autonomy, you need to go with either SLAM or, as in the video above, a pretrained deep neural network. That is actually the state of the art, but it requires a high level of expertise in both AI and robotics.

Hope this helps


Thanks for the quick answer. Great video; kind of a shame that the balloon blew up in the end :slight_smile:

What I’m interested in is the operating commands to the FW (ArduCopter) that make it “decide” where to go next.
In the PX4FLOW I can see that it uses MAVLink #106 and #132, but I don’t see that in OpenMV (in your Python code I think you do it in send_nav_velocity). So how does OpenMV do it?

Is there any other way to give this information to the FW (analog voltage/current)?

Thanks in advance,

OpenMV sends the X-Y blob coordinates to the Python script, which translates them into MAVLink navigation commands.
This method uses a high-level control loop and gives excellent results.
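As a sketch of that high-level loop, here is a minimal, hypothetical example of turning a blob’s pixel offset from the image center into a velocity request. A real script would pack the result into a MAVLink SET_POSITION_TARGET_LOCAL_NED message in Guided mode; here the message is mocked as a dict so the control logic is visible. The resolution, gain, forward speed, and function names are all illustrative assumptions, not taken from the linked project:

```python
# Hypothetical P-controller sketch: blob pixel error -> velocity command.
# A real implementation would send SET_POSITION_TARGET_LOCAL_NED via
# pymavlink/DroneKit; the dict below just stands in for that message.

IMG_W, IMG_H = 320, 240   # QVGA, a common OpenMV resolution (assumed)
GAIN = 0.005              # proportional gain, m/s per pixel of error (assumed)
FORWARD_SPEED = 0.5       # constant closing speed toward the target (assumed)

def blob_to_velocity_cmd(blob_cx, blob_cy):
    """Map a blob's center pixel to a body-frame velocity request."""
    err_x = blob_cx - IMG_W / 2   # positive: blob is right of center
    err_y = blob_cy - IMG_H / 2   # positive: blob is below center
    return {
        "vx": FORWARD_SPEED,      # keep moving toward the target
        "vy": GAIN * err_x,       # strafe right when blob is right of center
        "vz": GAIN * err_y,       # NED: +z is down, so blob below -> descend
    }

# Blob 40 px right of center, vertically centered -> strafe right ~0.2 m/s
print(blob_to_velocity_cmd(200, 120))
```

The point is that the flight controller never sees pixels: the companion board (OpenMV plus the Python script) closes the vision loop and only hands ArduCopter velocity targets, which it already knows how to fly in Guided mode.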