VISION_POSITION_ESTIMATE not appearing in QGroundControl

Andrea,

I’ve also seen that message on a colleague’s PC when using ROS, and I think it should go away if the TIMESYNC and/or SYSTEM_TIME messages are being exchanged between ArduPilot and ROS.

I don’t know for sure, and I think it should work even with that message appearing, but I hope we can make it go away eventually. Maybe @peterbarker has some more input.

In the mavros sys_time plugin there is a handler for the MAVLink TIMESYNC message:

On the related wiki page they say that this is only for PX4:
http://wiki.ros.org/mavros#mavros.2BAC8-Plugins.sys_time

But I see that ArduPilot also has a handler for the MAVLink TIMESYNC message:

Anyway, I am not able to go further on this front, so for now I will continue with my ArUco vision_position_estimate tests, ignoring the warning “TM: Wrong FCU time”.

Thanks
Andrea


Hey guys… I am using the following param set:

AHRS_EKF_TYPE 2
EKF2_ENABLE 1
EKF3_ENABLE 0
GPS_TYPE 0
AHRS_GPS_USE 0
VISO_TYPE 1

to do vision-based state estimation. The pipeline is complete, i.e. I am sending data to /mavros/vision_pose/pose and getting an output on /mavros/local_position/pose, but these topics do not match at all, so I think there is an issue with the frames of reference of the two topics. In my setup, /mavros/vision_pose/pose comes from tracking a single ArUco marker, and the result in the topic is the position of the drone’s camera in the marker’s frame. How can I get similar results on both of the aforementioned topics; in other words, how can I get input topic = output topic?


When I say the topics aren’t the same, I mean that the data they contain is not the same.

From what I have understood, VISO_TYPE 1 is only necessary if you use the VISION_POSITION_DELTA message with EKF3. To use the VISION_POSITION_ESTIMATE message (with EKF2) you should need EK2_GPS_TYPE 3 and COMPASS_USE 0, in addition to the parameters that select EKF2 and GPS_TYPE 0.


Hey @anbello thanks for your response. But can you be a little more clear and specific? Can you please tell me the parameters you have set?

Hi @Subodh_Mishra
I have the following parameters on my quad:
AHRS_EKF_TYPE 2
EKF2_ENABLE 1
EKF3_ENABLE 0
GPS_TYPE 0
EK2_GPS_TYPE 3
COMPASS_USE 0
VISO_TYPE 0

my quad streams video from a camera via WiFi like yours, only it is DIY:

  • little quad (130 mm) with a RevoMini FC running ArduCopter 3.6-rc7
  • on the quad there is a Raspberry Pi Zero with a Raspberry Pi Camera streaming video to a desktop PC
  • on the desktop PC there is ROS Kinetic with mavros and aruco_mapping
  • the video is streamed with GStreamer; on the PC the gscam node receives this stream and publishes the camera/image_raw and camera/camera_info topics
  • on the PC, aruco_mapping (slightly modified by me) subscribes to the above topics and publishes a camera pose message to the mavros/vision_pose/pose topic.

[Edit]
I forgot to say that I send SET_GPS_GLOBAL_ORIGIN and SET_HOME_POSITION messages (similar to this) before sending the vision position estimate.
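
For anyone reproducing this step: SET_GPS_GLOBAL_ORIGIN carries latitude/longitude as integers in degrees × 1e7 and altitude in millimetres, so the floating-point values have to be scaled before sending. A minimal sketch of that packing (the coordinates here are placeholders; the pymavlink send call in the comment is the usual way to transmit it):

```python
def to_gps_origin_fields(lat_deg, lon_deg, alt_m):
    """Scale a WGS84 position to the integer fields expected by
    SET_GPS_GLOBAL_ORIGIN: lat/lon in degE7, altitude in millimetres."""
    return (int(round(lat_deg * 1e7)),
            int(round(lon_deg * 1e7)),
            int(round(alt_m * 1000)))

# Placeholder coordinates, not the ones used in this thread:
lat, lon, alt = to_gps_origin_fields(45.4642, 9.1900, 120.0)

# With pymavlink this would then be sent roughly as:
#   master.mav.set_gps_global_origin_send(target_system, lat, lon, alt)
print(lat, lon, alt)  # 454642000 91900000 120000
```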

When all this is running and I hold the quad in my hand (disarmed) with the camera pointing at a marker, I can see that /mavros/vision_pose/pose and /mavros/local_position/pose move in a coherent manner.

When I try to arm and fly over the marker, if I switch to Loiter mode the quad drifts and is not stable.

One problem I see is the not-so-good video quality: when I fly over the marker, the marker itself is detected intermittently and not in the smooth, continuous manner that is needed. I have to work on improving the quality of the streamed video and perhaps redo the camera calibration procedure.

Another thing I have not understood well is whether I’m using the correct transformation when I publish to /mavros/vision_pose/pose; I am talking about the ENU/NED stuff as explained here under "Asserting on Reference Frames".
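
On the ENU/NED point: mavros expects poses on /mavros/vision_pose/pose in ROS’s ENU convention and performs the conversion to the autopilot’s NED frame internally, so the message you publish should already be ENU. For the position part the mapping is just an axis swap and a sign flip; a minimal sketch:

```python
def enu_to_ned(p):
    """Convert a position from ROS ENU (x East, y North, z Up)
    to ArduPilot NED (x North, y East, z Down). mavros applies this
    internally, so /mavros/vision_pose/pose itself must be ENU."""
    x, y, z = p
    return (y, x, -z)

print(enu_to_ned((1.0, 2.0, 3.0)))  # (2.0, 1.0, -3.0)
```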

Thanks @anbello, I am going to try this and get back to you in a day or two.

Hey @anbello, do you arm your quad using the controller/transmitter, or using ROS commands through mavros?

Don’t you think this should be set to 3, which means no GPS?

Yes, it’s a typo, I meant 3.
I have now edited the previous post with the right value, thanks.


I arm with the controller


What mode are you in when you arm your quadrotor?

I arm the quadrotor in AltHold mode


Thanks for the response @anbello. I also arm it in AltHold mode, but I use QGroundControl to do this. After arming (actually you can do this without arming) I give the takeoff command from QGroundControl, and then the motors start spinning (I have removed the propellers for safety) as if it is trying to take off; at the same time my mode automatically switches to Guided. Does this happen for you?

I had one more doubt: what did you mean when you said "with camera pointing to a marker I can see /mavros/vision_pose/pose and /mavros/local_position/pose that moves in a coherent manner"? Do these two topics publish exactly the same values?

Not exactly the same values, but close.
The values coming from /mavros/vision_pose/pose vary quickly (small errors in the estimation from the ArUco marker), while those coming from /mavros/local_position/pose are smoothed by the EKF.
When I have the time I will post a video to clarify this better.


Thanks @anbello. Please do! For me these values are different, and I don’t understand why.

Subodh,

It’s normal for the vehicle to switch to Guided when the takeoff command is issued from QGC. It’s actually QGC that commands the vehicle into Guided mode.


Here is a video of a test of my system:

After starting the video stream on the Raspberry Pi Zero on my quadcopter (connected to the desktop PC via WiFi), I started ROS, aruco_mapping, and mavros on the PC, and moved the quadcopter over the marker by hand.
The two axis systems represent the /mavros/vision_pose/pose and /mavros/local_position/pose topics; the same topics are echoed in the terminals.
/mavros/vision_pose/pose moves quickly and with more disturbance than /mavros/local_position/pose, which is smoothed by the EKF.
The two poses have values that are close enough; only when I yaw left with respect to the initial heading do I see similar angles in terms of RPY but different values for the quaternions (there are two quaternion values that give the same RPY angles).
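
That last observation is expected behaviour rather than a bug: q and −q encode the same rotation (the unit quaternions double-cover the rotation group), so two topics can legitimately report different quaternions with identical RPY angles. A quick check, assuming the (w, x, y, z) ordering:

```python
import math

def quat_to_yaw(w, x, y, z):
    """Yaw (rotation about Z) extracted from a unit quaternion."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# A 90-degree yaw rotation and its negated quaternion:
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
q_neg = tuple(-c for c in q)

# Both give the same yaw angle, despite being different quaternions.
print(round(math.degrees(quat_to_yaw(*q)), 6))      # 90.0
print(round(math.degrees(quat_to_yaw(*q_neg)), 6))  # 90.0
```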


Really interesting!! I am planning to achieve something like this. I think I am close, but not quite there. I think I need to use the aruco_mapping software instead of my own.

@anbello I am going to post my video too; it looks somewhat similar. One thing I wanted to know is how and where you feed in the extrinsic calibration between the camera center and the FCU center (or IMU center), so that the two aforementioned topics can coincide. My result shows some offset between them. I also direct this question to @rmackay9. Thanks!
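
One common way to handle that offset is to compose the camera pose with the fixed camera-to-FCU lever arm before publishing, so the published pose refers to the FCU/IMU origin rather than the lens. A position-only sketch under stated assumptions (the 10 cm lever arm is a made-up example, and only yaw is considered for brevity; a full solution rotates the lever arm by the whole attitude quaternion):

```python
import math

def camera_pose_to_body(cam_pos, yaw, lever_arm_body):
    """Shift a camera position estimate to the vehicle body origin by
    rotating the fixed camera->FCU lever arm (expressed in the body
    frame) into the world frame and adding it. Yaw-only simplification."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    dx, dy, dz = lever_arm_body
    # Rotate the lever arm by the current yaw, then translate.
    return (cam_pos[0] + cy * dx - sy * dy,
            cam_pos[1] + sy * dx + cy * dy,
            cam_pos[2] + dz)

# Hypothetical setup: camera 10 cm ahead of the FCU, vehicle yawed 90 deg.
print(camera_pose_to_body((1.0, 2.0, 0.5), math.pi / 2, (-0.10, 0.0, 0.0)))
```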