Inner workings of optical flow based position hold


I’m setting up a system where an on-board computer (in this case, an Intel Aero Compute Board) is flying a Pixhawk. I’m trying to implement position hold using optical-flow sensors in a GPS-denied environment. My problem is that I don’t know what to pass to the Pixhawk: the x, y, z, w information, or roll, pitch and yaw.
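To clarify what I mean by x, y, z, w: I’m assuming it refers to an attitude quaternion, in which case it carries the same orientation information as roll/pitch/yaw. A minimal sketch of the conversion I have in mind (standard ZYX aerospace convention; the function name is mine, not from any Pixhawk library):

```python
import math

def euler_to_quaternion(roll, pitch, yaw):
    """Convert roll/pitch/yaw (radians, ZYX aerospace convention)
    to an attitude quaternion in (x, y, z, w) order."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    w = cr * cp * cy + sr * sp * sy
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    return (x, y, z, w)

# Level attitude should give the identity quaternion (0, 0, 0, 1).
print(euler_to_quaternion(0.0, 0.0, 0.0))
```

So if the autopilot accepts either form, the choice is just a matter of which message field it expects, not of extra information.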

Perhaps asking how position hold mode works on the Pixhawk is a good place to start. For example, if I were to use a PX4FLOW sensor, I can inspect the optical-flow output in the log by looking at OF.flowX, OF.bodyX and IMU.GyrX (and likewise for Y). Ideally, OF.bodyX and IMU.GyrX should match closely. Of these signals, which ones are provided by the PX4FLOW, and which ones are generated by the Pixhawk itself? If I were to feed in the x, y, z, w information from my computer instead, which signal should I replace?
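For what it’s worth, my current understanding of why OF.bodyX and IMU.GyrX should match is that vehicle rotation moves the image across the flow sensor without the vehicle actually translating, so the gyro rate has to be subtracted from the flow rate before scaling by height to get a velocity. A sketch of that idea (names and structure are mine, not the actual firmware code):

```python
def flow_to_velocity(flow_rate_x, gyro_rate_x, height_m):
    """Estimate body-frame X velocity (m/s) from optical flow.

    flow_rate_x: angular flow rate from the sensor (rad/s), roughly OF.flowX
    gyro_rate_x: body rotation rate (rad/s), roughly IMU.GyrX
    height_m:    height above ground (m), e.g. from a rangefinder

    Rotation moves the image without the vehicle translating, so the
    gyro rate is subtracted before scaling by height.
    """
    return (flow_rate_x - gyro_rate_x) * height_m

# Pure rotation (flow rate equals gyro rate) should yield zero velocity.
print(flow_to_velocity(0.3, 0.3, 5.0))
```

If that mental model is right, then during pure rotation the two log traces should overlap, and any sustained difference between them is what the controller treats as translational motion.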

Has anyone tried this type of setup? What are your thoughts and experiences?