Need 3D Fix in NON-GPS Guided SITL

Hi,
I’m new here and I hope I’m doing things correctly. I’m trying to run a copter SITL in Guided mode under GPS-denied conditions, but in the end I always get the error FCU: PreArm: Need 3D Fix and it doesn’t allow me to arm the motors.
Just one consideration: I know it is unsafe to rely only on the IMU for the state of a drone, but this has to be seen only in the context of a simulation; later more sensors will be integrated for robustness.

I’ve set the following parameters to disable the GPS and the pre-arm checks (a MAVProxy snippet to apply them follows the list):

## FRAME
FRAME_CLASS	1
FRAME_TYPE	1
## EKF
AHRS_EKF_TYPE	2
EK2_ENABLE	1
EK3_ENABLE	0
## GPS
GPS_TYPE	0
EK2_GPS_TYPE	3
VISO_TYPE	0
COMPASS_USE	1
AHRS_GPS_USE	0
## Arming
ARMING_CHECK 	0
ARMING_REQUIRE 	0
## SIM
SIM_GPS_DISABLE 1
SIM_GPS_TYPE 	0
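
For reference, the same values can be applied directly at the MAVProxy console once SITL is up (I leave out ARMING_REQUIRE, about which I have a doubt below; most of these need a SITL restart to take effect):

param set FRAME_CLASS 1
param set FRAME_TYPE 1
param set AHRS_EKF_TYPE 2
param set EK2_ENABLE 1
param set EK3_ENABLE 0
param set GPS_TYPE 0
param set EK2_GPS_TYPE 3
param set VISO_TYPE 0
param set COMPASS_USE 1
param set AHRS_GPS_USE 0
param set ARMING_CHECK 0
param set SIM_GPS_DISABLE 1
param set SIM_GPS_TYPE 0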

About these parameters I have two doubts. The first is that ARMING_REQUIRE does not seem to exist (even though it is present in the full parameter list). The second is that the difference between SIM_GPS_DISABLE and GPS_TYPE is not clear to me.

Anyway, after starting SITL (sim_vehicle.py -v ArduCopter --console --map) and mavros, I switched to Guided mode and finally set Home (with height) and Origin (with height) on the MAVProxy map.
I’m posting the mavros terminal log here: mavros_log.txt (14.6 KB)

I guess I’m missing something fundamental, but I’m not sure what.
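
In case it helps: instead of clicking on the MAVProxy map, the EKF origin can also be set programmatically with a SET_GPS_GLOBAL_ORIGIN message. This is only a sketch with pymavlink (not what I actually used, I clicked on the map); the connection string and coordinates are placeholders:

from pymavlink import mavutil

# connect to the SITL MAVLink stream (endpoint is just an example, adjust to your setup)
master = mavutil.mavlink_connection('udp:127.0.0.1:14550')
master.wait_heartbeat()

# arbitrary origin: lat/lon in degE7, altitude in mm, as required by SET_GPS_GLOBAL_ORIGIN
lat = int(45.0 * 1e7)
lon = int(9.0 * 1e7)
alt = int(100 * 1000)
master.mav.set_gps_global_origin_send(master.target_system, lat, lon, alt)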

Try this under MAVProxy:

arm safetyoff
arm uncheck all
arm throttle

This should disable all the safety checks and arm the throttle.

I’ve tried, but unfortunately it still doesn’t work; it always shows the same error about the 3D fix.

Hi,

Can you fly in Stabilize mode using the simulated RC? If not, then there are more issues than just the 3D fix.

It might be that ArduPilot is clever enough not to let you fly in Guided mode on IMU data only. If your “robust position estimate” scheme is not ready yet, you can try the SITL optical flow, which is quite quick to test. Note that you also need a range sensor.
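
If it helps, roughly this should enable the simulated optical flow and rangefinder in SITL (parameter names from memory, so double-check them against the wiki; SITL needs a restart afterwards):

param set SIM_FLOW_ENABLE 1
param set FLOW_TYPE 10
param set RNGFND1_TYPE 1
param set RNGFND1_MIN_CM 0
param set RNGFND1_MAX_CM 4000
param set RNGFND1_PIN 0
param set RNGFND1_SCALING 12.12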

Yes, I’ve only checked arming, and in Stabilize mode it works.

It arms correctly with a virtual optical flow and range finder, as you suggested, in Guided mode without GPS. So I’m starting to think that ArduPilot won’t let me fly on IMU data only, as you said.
The fundamental problem is that I’d like ArduPilot to receive command inputs (processed sensor data) from an external companion computer such as a Jetson Nano; for that reason I’d like to disable all the ArduPilot features that could conflict with it. The purpose is a Gazebo simulation of an autonomous drone fed by external navigation data. Would this be possible in practice?

NEVER disable your arming safety checks, they are there to prevent accidents.

For non-GPS flight you can use several solutions, such as optical flow or an indoor positioning system like the @Marvelmind sensors.

You can use ArduPilot for GPS-free navigation, but then you need to send your position estimates to the flight controller. That is, ArduPilot needs to know your position even if you calculate it on the companion computer and want to send motion commands that account for that known position. Without knowing your use case, I guess the feasible approach is to leverage the ArduPilot features (e.g. the position controller, the velocity controller) instead of disabling them, and then ArduPilot needs your position estimate. If you want to make your own high-level controllers and send attitude targets, then you can have a look at GUIDED_NoGPS.
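
For example, leveraging the built-in velocity controller from the companion computer would look roughly like this through mavros in Guided mode (only a sketch, assuming ROS1 and the default mavros topic names; I have not tested it):

#!/usr/bin/env python
import rospy
from geometry_msgs.msg import TwistStamped

# stream a velocity setpoint to ArduPilot through mavros while in Guided mode
rospy.init_node('velocity_setpoint_example')
pub = rospy.Publisher('/mavros/setpoint_velocity/cmd_vel', TwistStamped, queue_size=10)

rate = rospy.Rate(10)  # setpoints have to be streamed continuously
while not rospy.is_shutdown():
    msg = TwistStamped()
    msg.header.stamp = rospy.Time.now()
    msg.twist.linear.x = 0.5  # m/s in the local frame (placeholder value)
    pub.publish(msg)
    rate.sleep()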

I have not found a general description of how to send the position data you calculate to ArduPilot, but the docs contain some examples (ex1, ex2) where they send updates with the VISION_POSITION_ESTIMATE MAVLink message. There are also some blog posts on this forum addressing this issue (ex1, ex2).
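
The core of those examples is basically streaming a message like this at a steady rate (a minimal pymavlink sketch with made-up pose values; the frame conventions from the linked posts still apply, and again I have not tested it):

import time
from pymavlink import mavutil

# connect to the flight controller or SITL (endpoint is just an example)
master = mavutil.mavlink_connection('udp:127.0.0.1:14550')
master.wait_heartbeat()

while True:
    # x, y, z in metres, roll/pitch/yaw in radians, timestamp in microseconds
    master.mav.vision_position_estimate_send(
        int(time.time() * 1e6),   # usec
        1.0, 2.0, -0.5,           # x, y, z (placeholder values)
        0.0, 0.0, 0.0)            # roll, pitch, yaw
    time.sleep(0.1)               # roughly 10 Hz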

I have not tried any of these myself, but it would make a very cool project.

Thank you so much, the last two links are suited to my case. I’ll try to follow them and update here if it works.

Latest updates. I managed to build a Gazebo simulation in GPS-denied conditions, using the material in the blog Integration of ArduPilot and VIO tracking camera as a reference. As the external pose source I used an SDF model of the T265, to which I attached the p3d plugin to generate the pose message as an Odometry, since I did not find anything else similar. Then I passed that message to a simple node that converts it to a PoseStamped and broadcasts the T265 pose. After configuring the parameter list mentioned above, everything works properly and it is possible to command the drone through mavros. The last point I left uncovered is the alignment of the camera body to the Y axis of the ROS ENU world frame.
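
The conversion node is basically trivial, something along these lines (just a sketch of what I mean; the topic names depend on the p3d plugin configuration and the mavros namespace):

#!/usr/bin/env python
import rospy
from nav_msgs.msg import Odometry
from geometry_msgs.msg import PoseStamped

# re-publish the p3d Odometry as PoseStamped for the mavros vision_pose plugin
def odom_cb(odom):
    pose = PoseStamped()
    pose.header = odom.header
    pose.pose = odom.pose.pose
    pub.publish(pose)

rospy.init_node('t265_pose_bridge')
pub = rospy.Publisher('/mavros/vision_pose/pose', PoseStamped, queue_size=10)
rospy.Subscriber('/t265/odom', Odometry, odom_cb)  # input topic name is an example
rospy.spin()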

Cool! Awesome that it works!

Will you try it on a real drone too?

Yeah, we are doing that right now. We are using a slightly more complex setup than the original one: a T265, a D435, ultrasonic sensors and of course an IMU. The simulation was, of course, for integration purposes, to test all the developed code (including SLAM, path planning and a bit of machine learning) safely before deploying it on the real drone. In the end we will make the material open source in some way, including a mod of the liftdrag plugin for Gazebo and a script for keyboard teleop with mavros.

Do you have any updates? Thanks