VISION_POSITION_ESTIMATE not appearing in QGroundControl

It is continuous, from what I see.

For now only Loiter; when I have the time I will try with /mavros/setpoint_position/local


@anbello what is the size of the ArUco marker you are using?

16 cm. I also tried smaller sizes (12 cm) but the estimation is worse. I think it could be better using a board of markers:
http://docs.ros.org/kinetic/api/asr_aruco_marker_recognition/html/classaruco_1_1BoardDetector.html
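To make the idea concrete, here is a minimal sketch of board-based pose estimation with OpenCV's aruco module (the camera intrinsics, board layout and image below are placeholders, not my actual setup):

```python
import cv2
import numpy as np

# Placeholder intrinsics -- replace with your camera calibration.
camera_matrix = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

aruco = cv2.aruco
dictionary = aruco.getPredefinedDictionary(aruco.DICT_4X4_50)
# Example layout: 5x7 board of 4 cm markers with 1 cm gaps.
board = aruco.GridBoard_create(5, 7, 0.04, 0.01, dictionary)

frame = cv2.imread("frame.png")  # placeholder image
corners, ids, _ = aruco.detectMarkers(frame, dictionary)
if ids is not None:
    # One pose for the whole board, estimated from all visible markers
    # at once -- this is why it is less noisy than a single marker.
    n_used, rvec, tvec = aruco.estimatePoseBoard(
        corners, ids, board, camera_matrix, dist_coeffs, None, None)
    if n_used > 0:
        print("board pose (m):", tvec.ravel())
```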

Thanks for your response. I am using markers sized 14.1 cm and the estimation is very noisy at heights greater than 60 cm. I am going to try markers with 30 cm sides and see if that works.

@anbello, I tried bigger markers with 45 cm sides and the results look a little better, but there is a considerable toilet bowling effect, as mentioned by @chobitsfan.

Here is a video: https://youtu.be/D0vMxI3H5BY

There is some description in the video but I can write the same thing here too:

The pilot (me) takes off in ALT_HOLD mode (using the remote control) to a height at which the ArUco markers are clearly visible and the external navigation starts, then switches to LOITER mode. There is considerable toilet bowling, and things go bad when the WiFi transmission of images is momentarily lost, probably due to the poor quality of the WiFi module on the Sky Viper. When that happens the pilot switches back to ALT_HOLD mode to stabilize the drone, brings it over the ArUco markers, and switches to LOITER mode again once the WiFi connection is re-established.

The topics mavros/vision_pose/pose (RED) and mavros/local_position/pose (GREEN) follow each other closely, most of all in the Z direction, but the trends in X and Y are also similar. I believe the controller should now be tuned so that the toilet bowling is reduced. I am using ArduCopter-3.7-dev. Any help with that will be deeply appreciated. I am adding some graphs here:

The graph below is for X; there is toilet bowling and also a very noisy estimation from the ArUco markers. I don't know if the toilet bowling happens because of the noisy estimation or something else.

Here is Y and it looks more or less the same as X:

The Z looks OK:

The roll and pitch are noisy but centred around zero (of course).

I don't understand why the yaw output from the EKF (NKF) and from vision (VISP) is off by 90 degrees. Any insight on that would be much appreciated.

I also attach the associated dataflash log here: https://drive.google.com/file/d/1eA1Bt_E1m6UZMvTBh7ziBCwM_GESkCPn/view?usp=sharing

Requesting @rmackay9, @chobitsfan and @vkurtz to kindly take a look and give me their invaluable feedback.

@Subodh_Mishra,

I suspect that the instability is due primarily to poor tuning of the inner loop (rate control, attitude control) PID controllers.

I ran into similar problems while using a similar setup with a mocap system for localization. Here is what things looked like on our 3DR Iris using the default parameters. You can see a similar toilet bowling effect in the position (e.g. POS.TPX vs POS.PX). While the attitude tracking (ATT.DesRoll vs ATT.Roll) is okay, there is a lot of noise.

Here is a log from the same system using these tuned parameters. As you can see in the log, the results are much better and the quad felt very stable in the air. Note that the attitude tracking (ATT.DesRoll vs ATT.Roll) is much tighter than in the previous log or the log that you shared.

Unfortunately I don't have much experience tuning these inner-loop controllers, though there are some instructions here. In fact, we have another, older custom quadrotor in the lab that I've been trying to get this system to work on, but I haven't been able to get the same good results. If you are able to tune your system and get good results, I would love to hear how you went about the tuning process!
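For reference, the inner loop being discussed is governed by ArduPilot's ATC_RAT_* (body rate) and ATC_ANG_* (attitude angle) parameters. The listing below just shows the stock Copter defaults so you can see which knobs are involved; the right values for any particular airframe have to be found by tuning:

```
# Roll rate PID (pitch has matching ATC_RAT_PIT_* parameters)
ATC_RAT_RLL_P,0.135
ATC_RAT_RLL_I,0.135
ATC_RAT_RLL_D,0.0036
# Outer attitude-angle P gains
ATC_ANG_RLL_P,4.5
ATC_ANG_PIT_P,4.5
```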


As far as the strange yaw offset goes, I'm not sure why it's there, but I have similar results. I don't think that's the problem that leads to toilet bowling, though, since I would expect much worse behavior if there really were a 90-degree shift.
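One thing that might explain the offset: ROS/mavros works in ENU while the ArduPilot EKF works in NED, and the ENU-to-NED change of frame carries exactly a 90-degree yaw term, so comparing a yaw logged in one convention against a yaw logged in the other would show this shift. A minimal sketch of the conversion, just to show where the 90 degrees comes from (not a claim that this is the bug):

```python
import numpy as np

def enu_to_ned_position(p_enu):
    """ROS ENU (x east, y north, z up) -> ArduPilot NED (x north, y east, z down)."""
    e, n, u = p_enu
    return np.array([n, e, -u])

def enu_to_ned_yaw(yaw_enu):
    """ENU yaw is CCW from east; NED yaw is CW from north.
    The pi/2 term is the 90-degree offset between the two conventions."""
    return np.pi / 2.0 - yaw_enu

print(np.degrees(enu_to_ned_yaw(0.0)))  # 90.0: same heading, 90 degrees apart
```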


Thanks @vkurtz for your response. I will see if I can tune the inner loop and fix this.

Hello,
I'm new here, so let me quickly introduce myself. I'm interested in vision methods for localization (e.g. SLAM). My project is indoor navigation for drones, and as I suppose you know, that is a big challenge!

I've tested AprilTag (visual markers similar to ArUco) as the only source for the positioning system, and I had the same situation of large noise on those axes.
From what I understand, this method has a major problem: when the camera is coplanar with the tag (especially when the camera is directly facing the tag), there is a singularity that generates this noise.

The ArUco v3 library has an "estimation" function which tries to reduce this phenomenon: it remembers the last pose and uses it for the next one in order to avoid the noise (I can provide the reference if you need it). It is better, but the noise is still there (even after adding filters).
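The idea, as I understand it, can be illustrated with OpenCV's solvePnP (this is not the actual ArUco v3 code, just a sketch of seeding the solver with the previous pose so the planar ambiguity tends to resolve to the same solution frame after frame):

```python
import cv2

# Keep the last accepted pose and use it as the initial guess for the
# next frame, so the iterative solver converges to the consistent one
# of the two ambiguous planar solutions.
prev_rvec, prev_tvec = None, None

def estimate_pose(obj_pts, img_pts, camera_matrix, dist_coeffs):
    global prev_rvec, prev_tvec
    ok, rvec, tvec = cv2.solvePnP(
        obj_pts, img_pts, camera_matrix, dist_coeffs,
        rvec=prev_rvec, tvec=prev_tvec,
        useExtrinsicGuess=prev_rvec is not None,
        flags=cv2.SOLVEPNP_ITERATIVE)
    if ok:
        prev_rvec, prev_tvec = rvec, tvec
    return ok, rvec, tvec
```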

For my part, I'm moving to another solution, keeping in mind that AprilTag/ArUco are a great and cheap method but should not be the only source of positioning in the outer loop.

I hope that you have found a way to fix it.

Hi
With a board of markers instead of a single marker there is far less noise.
http://ardupilot.org/dev/docs/ros-aruco-detection.html
Anyway, aruco_gridboard uses the aruco lib from OpenCV 3, which is ArUco v2; from what I read on the ArUco site, v3 is better and faster.
Also, UcoSLAM is really interesting: a SLAM library that uses both keypoints and ArUco markers.

I have seen the amazing UcoSLAM project, and it seems to be a nice way to merge ArUco and SLAM (including monocular and stereo), improving on ORB-SLAM2 at the same time.
The only bad point is that the UcoSLAM CMake build is not designed for ARM architectures.
Specifically, Fbow, the library designed by the team of rmsalinas, is only for Intel processors (x86).

I've been in contact with him, and if I have time, I will try to make it compatible with ARM. And, as the cherry on the cake, see if it can be optimized for the GPU (as ORB-SLAM can with CUDA).

It would be great if you succeed with this port; please keep us informed.
Thanks

Yes, I will let the community follow the progress.
For now, I'm trying to use the SLAM from the ZED stereo camera and use the /local topics from mavros (instead of the /global ones, which need GPS) as the primary source of positioning. Then, if it works, I'll move to UcoSLAM.

I'm facing some issues, but I will create a specific thread about them if I don't find a solution, because I think you have more experience than me with this integration.

In any case, I have just joined this forum and I'm now following the gitter https://gitter.im/ArduPilot/VisionProjects, so I will be more active in this research field :slight_smile: This forum seems to be dynamic, and I do appreciate it!


Cool, that looks a little like aruco_mapping:
http://wiki.ros.org/aruco_mapping

I tried it statically and it works OK; I cannot experiment in flight as it requires quite some space (volume).

@anbello I think you had that one in your test suite a few months ago:

With aruco_mapping each marker in the map is detected as a single marker and suffers from noise, so I moved to aruco_gridboard, which has better pose estimation with less noise.

Agreed, but this is just a non-filtered localization system, as it is missing an appropriate estimator like a particle filter. The system above is based on ORB-SLAM, making it a good candidate for implementation.

If you are talking about UcoSLAM, I agree: it uses both feature points and "markers to enhance initialization, tracking, and long-term relocalization".

I would like to use it, but if we want to use it in ROS we have to write a wrapper that does the image transport stuff, calls functions from the library, and subscribes to and publishes topics.

It could be done, but I have very little spare time for this kind of thing.
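For what it's worth, the ROS plumbing side of such a wrapper is not much code. Below is a rough sketch of the node structure; the ucoslam module and its SLAM/process calls are hypothetical stand-ins (UcoSLAM is C++, so a real wrapper would more likely be a C++ node), but the subscribe-and-publish skeleton would look the same:

```python
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import Image
from geometry_msgs.msg import PoseStamped
from cv_bridge import CvBridge

import ucoslam  # hypothetical Python binding, for illustration only

class UcoSlamNode(object):
    def __init__(self):
        self.bridge = CvBridge()
        # Hypothetical constructor: camera calibration + vocabulary.
        self.slam = ucoslam.SLAM("camera.yml", "vocabulary.fbow")
        self.pose_pub = rospy.Publisher("/mavros/vision_pose/pose",
                                        PoseStamped, queue_size=1)
        rospy.Subscriber("camera/image_raw", Image, self.on_image,
                         queue_size=1, buff_size=2**24)

    def on_image(self, msg):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        # Hypothetical call returning (x, y, z, qx, qy, qz, qw) or None.
        pose = self.slam.process(frame, msg.header.stamp.to_sec())
        if pose is None:
            return  # tracking lost
        out = PoseStamped()
        out.header.stamp = msg.header.stamp
        out.header.frame_id = "map"
        (out.pose.position.x, out.pose.position.y, out.pose.position.z,
         out.pose.orientation.x, out.pose.orientation.y,
         out.pose.orientation.z, out.pose.orientation.w) = pose
        self.pose_pub.publish(out)

if __name__ == "__main__":
    rospy.init_node("ucoslam_wrapper")
    UcoSlamNode()
    rospy.spin()
```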

Yep… So many projects and not enough time…
I guess just the publisher is required, as the video could come from a GStreamer pipeline.
And the original wrapper is here: https://github.com/ethz-asl/orb_slam_2_ros

@anbello: In order to arm the copter, as I understood it, you use ALT_HOLD mode and then go back to GUIDED.

For now, I'm trying to do the same (in a Gazebo simulation, with all the GPS removed). I can arm in ALT_HOLD, but then it is impossible to change the flight mode to GUIDED:
"APM: Flight mode change failed"

And, of course, if I try to arm in GUIDED, I get the 3D fix error.

Did you have the same problem? If so, how did you do it?

When I control the copter with a joystick I arm in Loiter; when I control it with a Python script I arm in Guided, but I can do so only after I begin sending the vision pose estimation messages to ArduCopter, so it is as if it has a 3D fix.
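For anyone who wants to reproduce this from a script, the sequence looks roughly like the sketch below (assuming mavros is running; the fixed pose here is just a stand-in for the real estimator output):

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import PoseStamped
from mavros_msgs.srv import SetMode, CommandBool

rospy.init_node("arm_guided_with_vision")
pub = rospy.Publisher("/mavros/vision_pose/pose", PoseStamped, queue_size=1)

# Stream vision poses for a few seconds first so the EKF starts using them.
pose = PoseStamped()
pose.header.frame_id = "map"
pose.pose.orientation.w = 1.0
rate = rospy.Rate(30)
t0 = rospy.Time.now()
while (rospy.Time.now() - t0).to_sec() < 5.0 and not rospy.is_shutdown():
    pose.header.stamp = rospy.Time.now()
    pub.publish(pose)
    rate.sleep()

# Only now switch to GUIDED and arm.
rospy.wait_for_service("/mavros/set_mode")
rospy.wait_for_service("/mavros/cmd/arming")
set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)
arming = rospy.ServiceProxy("/mavros/cmd/arming", CommandBool)
set_mode(custom_mode="GUIDED")
arming(True)
```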