Servers by jDrones

Position Hold with VISION_POSITION_ESTIMATE and ATT_POS_MOCAP for indoor localization

I am attempting to hold position indoors using a motion capture system. Following this thread, I can send VISION_POSITION_ESTIMATE and ATT_POS_MOCAP MAVLink messages containing localization information to the flight controller (Pixhawk 2.1, firmware compiled from master a few weeks ago). The messages appear to be properly received and processed, as per this recent update.
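For context on how such messages are typically built: motion-capture systems often report poses in an ENU world frame, while ArduPilot's EKF expects NED. A minimal conversion sketch (the helper name, and the assumption that your mocap outputs ENU with yaw measured counter-clockwise from East, are mine and not from this thread):

```python
import math

def enu_to_ned(x, y, z, yaw):
    """Convert an ENU position/yaw (a common mocap convention) to the
    NED frame ArduPilot's EKF expects. Illustrative helper; verify
    against your mocap system's actual frame definition."""
    # Swap x/y and negate z for ENU -> NED position
    n, e, d = y, x, -z
    # Yaw measured CCW from East becomes heading CW from North
    yaw_ned = (math.pi / 2 - yaw) % (2 * math.pi)
    return n, e, d, yaw_ned
```

The resulting values would fill the x/y/z/yaw fields of a VISION_POSITION_ESTIMATE message, with its usec field set to the capture time in microseconds.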

However, flight in position hold mode is unreliable. Inspecting the flight logs shows that the EKF’s position estimate generally follows the given location, but not very closely. The difference is especially notable in yaw.

Is there any way to get the EKF to “trust” the localization messages more, especially with respect to yaw? Has anyone else had success using these message types for position control?

log_2 (ATT_POS_MOCAP message)
log_3 (ATT_POS_MOCAP message)


I managed to fix the yaw issue by disabling the compass.
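For reference, fully disabling the compass typically means zeroing all of the COMPASS_USE parameters, e.g. in a param file (these are the same settings @chobitsfan lists later in this thread):

```
COMPASS_USE=0
COMPASS_USE2=0
COMPASS_USE3=0
```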
However, the response of the EKF still seems sluggish, as shown in this video (true position from the mocap system is on the small axes, EKF position on the large axes). When the copter is fairly still, the EKF pose estimate looks good. When it is moving, the EKF estimate lags behind the ground truth, which I suspect leads to poor position hold behavior.

Log: 99-12-31_19-24-58_1.bin
rosbag: 2018-06-06-09-48-41.bag (used to make the video)

@rmackay9 @jeff_null @Subodh_Mishra any ideas? Has anyone successfully used these external position messages for position hold/navigation?


Hi Vince,

This is really great feedback, love the video. Sorry for the slightly slow reply.

So I’m pretty sure what we are seeing here is the lag between the Vicon system capturing a measurement and the message reaching ArduPilot. This is something that we don’t take into account (yet), but we intend to add it. We actually already process the TIMESYNC messages, but we don’t use the results when pushing the external position estimate into the EKF.

I think there are two things we can do:

  • Short-term, add a parameter to let you hard-code the lag. This will let us confirm that lag really is the issue.
  • Longer-term, complete the integration of the lag estimator and replace the parameter mentioned above.

If this sounds good, I’ll whip up a new binary on Monday for you to test with. So I guess you’re using Copter-3.6 on a Pixhawk/Cube board?


Testing the lag sounds like a great idea: I’d be happy to test a new binary. Copter-3.6 and a Pixhawk/Cube board is correct.

To add a bit of complication, I was able to get marginally better performance by changing a few parameters, as shown in this new video (true position on small axes, EKF position on large axes). After playing with several other parameters that didn’t seem to have much of an effect, I set EK2_GPS_DELAY=0, EK2_POSNE_M_NSE=0.1, and EK2_POS_I_GATE=1000. The performance is better than before, but position hold still drifts as much as a meter, which is quite a bit worse than I would like.

Looking at the code, it seemed that at some point the position information I was sending was treated the same as position information from a GPS, which inspired the changes above. I’m not very familiar with the EKF code, however, so I could be wrong. If it is the case that the EK2_POS* parameters are involved in processing external navigation information, their descriptions should probably be updated, since they currently refer only to GPS.
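For intuition about what EK2_POS_I_GATE controls: an EKF only fuses a measurement if its innovation (measurement minus prediction) is consistent with the expected uncertainty. A simplified 1-D sketch of that idea follows; note that ArduPilot's gate parameters use scaled units, so this illustrates the concept only, not the actual implementation:

```python
def passes_innovation_gate(innovation, innovation_variance, gate_sigmas):
    """Accept a measurement only if its squared innovation is within
    gate_sigmas standard deviations of the predicted value.
    Illustrative sketch, not ArduPilot's actual code."""
    return innovation ** 2 < (gate_sigmas ** 2) * innovation_variance
```

Widening the gate (as with EK2_POS_I_GATE=1000 above) makes the filter accept measurements it would otherwise reject, while lowering EK2_POSNE_M_NSE tells it to weight position measurements more heavily.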

Old parameters (from the very first message): ext_nav_messages_working.param (16.0 KB)
New parameters: ext_nav_position_hold.param (16.1 KB)

Log: 2012-31-1999.bin
Rosbag: 2018-06-07-17-57-35.bag

Hi @vkurtz,
I navigate in an indoor environment using an HTC VIVE and get an accuracy of ~5 cm.
I simply override the position output from the EKF with the position I get from the tracker.

These are the functions I override:

But you should be aware that these changes let only the tracker decide the position, without fusion from other sensors.



@JustFineD - thanks for the information! I’d definitely prefer to fuse information from other sensors, but I’m glad to know that overriding the EKF position is a viable option if needed.

For anyone interested in testing the external navigation messages in simulation, I set up a demonstration using SITL, Gazebo, and ROS here. The results match what I’ve observed in real life fairly well.



@chobitsfan found and fixed a small bug in our mocap consumption code. This is in master now, so if you’re using Mission Planner you can load the “latest” binary by opening the Initial Setup >> Install Firmware screen, pressing Ctrl-Q (the label should change to “ArduCopter V3.6-dev”), and then clicking on the Copter icon; it should install as per usual. Alternatively, you can download and install the firmware from the central server (use the -v3.px4 binary).

I’ve also created a binary which includes chobitsfan’s change and also adds a VISION_LAG parameter which allows setting the lag in milliseconds. This value should never be set to a number > 2000 (i.e. 2 seconds). Maybe you could try this and see if changing the lag affects performance? E.g. 200 for a 0.2 s lag might be a reasonable number.
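Assuming the test binary exposes VISION_LAG as an ordinary parameter, it could be set from a MAVProxy console like any other parameter (the value here is just the 200 ms example above):

```
param set VISION_LAG 200
param show VISION_LAG
```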

Just FYI, the changes I made to add this lag parameter are here.

@rmackay9 This PR adds the lag parameter, but I cannot see where it is processed as a delayed state within the EKF. Am I missing something here?

Hi Patrick,

It passes the lag into the EKF by adjusting the timestamp: we subtract the lag from the timestamp and pass it in. This is the commit that adjusts the timestamp.
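The approach described above can be sketched as follows; the names and millisecond units are illustrative, not ArduPilot's actual symbols, and the 2-second cap mirrors the limit mentioned earlier in the thread:

```python
MAX_VISION_LAG_MS = 2000  # per the earlier note: never use a lag > 2 s

def lag_adjusted_timestamp(sample_time_ms, vision_lag_ms):
    """Back-date the external-nav sample by the configured lag so the
    EKF fuses it at the time the measurement was actually taken."""
    lag = max(0, min(vision_lag_ms, MAX_VISION_LAG_MS))
    return sample_time_ms - lag
```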


@rmackay9 Ok, I see. Other question: is Paul’s WIP on EKF3 complete, or do we still have to use EKF2?

Copter flying in Circle mode using the current master mentioned by @rmackay9: a Parrot Bebop running ArduPilot with an OptiTrack motion capture system, using ATT_POS_MOCAP to send external nav data to the copter over WiFi. COMPASS_USE=0, COMPASS_USE2=0, COMPASS_USE3=0, EK2_GPS_TYPE=3.


This is fantastic! Can we blog this? It’d be best if it was you, but I’ll do it if you’re too busy.

The EKF3 changes still need testing and peer review before going into master. I’m not sure of the timing, but I think @priseborough has it in hand.


Hi @rmackay9 Thank you. It is done here. I am not a native speaker of English, so I am sorry if my article is difficult to understand.


Really great Chobitsfan, thanks!

@vkurtz I’m interested in testing external nav messages in simulation. I cloned the ardupilot_gazebo package from your link into the src dir of an existing catkin workspace and ran catkin_make, but I am unable to do “roslaunch ardupilot_gazebo mavros_opitrack.launch”. It looks like ROS won’t recognize the package (it won’t tab-complete, etc.). Any ideas?

Have you run the following (distribution setup first, then the workspace overlay)?

source /opt/ros/kinetic/setup.bash
source [catkin_ws]/devel/setup.bash

I believe that’s necessary for ROS to pick up a new package.

@vkurtz Yes, I ran those and still encountered the issue. However, it’s resolved now: I ran rospack list, and that seemed to get ROS to update its package list. Thanks!

Here is a video showing my results in ArduCopter SITL, flying in Guided mode using the VISION_LAG parameter binary. 250 ms seems to be the sweet spot for me. I am also noticing some deviation from the set position, mostly on the roll axis. I have followed @vkurtz’s parameter recommendations. Posting this in case anyone has additional recommendations to get the quad to track the external localization input more tightly.
