Integration of ArduPilot and VIO tracking camera (Part 3): Indoor autonomous flights and performance tests

Hello @LuckyBird
Great project!
I am doing almost the same thing (indoor flight), but in my case I am using PX4 and QGroundControl. I have seen the code for flying the drone and have simulated it in Gazebo, but it is not working: it only arms the vehicle and does not take off. Any suggestions would be appreciated.

Hi @DiegoHerrera1890,

It’s hard to say what the problem is given the scarce information. My suggestion is to verify that everything is done correctly according to the PX4 wiki. Following a hands-on guide, for example, can be a good starting point.

I have uploaded the issue I am facing in detail here. Could you please help me?

You are running PX4 firmware, so there is not much we can do unless you switch to ArduPilot.

We have been successful in setting this up and getting it running OK, including right-clicking in Mission Planner and selecting “Fly to Here”. Our problem is how to generate and run an indoor flight plan. We tried using the traditional MP flight plan that defines GPS coordinates, assuming ArduPilot would treat these as vector references from the EKF home point. But the copter just wanted to auto-land every time we switched to Auto mode on the Tx.
Any input is appreciated.

@LuckyBird any comment on this question? Can I use GUIDED mode and tell ArduPilot to use VIO while I send it movement commands via MAVLink messages? Do I need to choose GUIDED_NOGPS? Will it still use the T265 pose data?

I’m trying to program an indoor autonomous drone with a setup similar to what you described here.


@Raz_St you can use either GUIDED or GUIDED_NOGPS mode to command the vehicle. As long as the data from the T265 is ready and reliable, setting up an indoor autonomous mission should work.

But keep in mind that the T265 might not always work reliably (sometimes the data doesn’t make sense, but there is still data) so having fallback measures (optical flow + rangefinder for indoor flight) is always recommended.
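To illustrate what “command the vehicle over MAVLink” looks like in GUIDED mode, here is a minimal sketch using pymavlink’s `SET_POSITION_TARGET_LOCAL_NED` message. The connection itself is out of scope (`master` is assumed to be an open `mavutil` connection), and the helper names are my own; the type-mask bit values come from the MAVLink `POSITION_TARGET_TYPEMASK` definition.

```python
# Sketch: send a local-NED position setpoint to ArduPilot in GUIDED mode.
# `master` is assumed to be an open pymavlink connection; names here are
# illustrative, not from the blog's script.

# POSITION_TARGET_TYPEMASK bits (MAVLink common message set):
IGNORE_VX, IGNORE_VY, IGNORE_VZ = 8, 16, 32          # ignore velocity
IGNORE_AFX, IGNORE_AFY, IGNORE_AFZ = 64, 128, 256    # ignore acceleration
IGNORE_YAW, IGNORE_YAW_RATE = 1024, 2048             # ignore yaw / yaw rate

def position_only_mask(keep_yaw=False):
    """Type mask that tells the autopilot to use only x/y/z (and optionally yaw)."""
    mask = (IGNORE_VX | IGNORE_VY | IGNORE_VZ
            | IGNORE_AFX | IGNORE_AFY | IGNORE_AFZ
            | IGNORE_YAW | IGNORE_YAW_RATE)
    if keep_yaw:
        mask &= ~IGNORE_YAW  # clear the ignore-yaw bit so yaw is commanded too
    return mask

def goto_local_ned(master, x, y, z, yaw=None):
    """Command the vehicle to fly to (x, y, z) metres in the local NED frame
    (note: z is negative for altitude above the origin)."""
    from pymavlink import mavutil
    master.mav.set_position_target_local_ned_send(
        0,                                       # time_boot_ms (ignored)
        master.target_system, master.target_component,
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        position_only_mask(keep_yaw=yaw is not None),
        x, y, z,                                 # position setpoint
        0, 0, 0, 0, 0, 0,                        # velocity / accel (ignored)
        yaw or 0.0, 0.0)                         # yaw, yaw_rate
```

For example, `goto_local_ned(master, 2, 0, -1.5)` would request a point 2 m north of the EKF origin at 1.5 m altitude, assuming the T265 data is feeding the EKF.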

Thanks for the reply @LuckyBird. We will try to use it in GUIDED mode.

You are correct that it is not always reliable. We tried a few indoor test flights today with the T265 + LiDAR rangefinder and no GPS connected. Flying in AltHold mode was fine for a few minutes, but at some point we lost control of the drone (good that we have a safety net for these cases, so nothing was damaged).

What did you mean when you wrote “fallback measures”? Is there built-in support for this in ArduPilot, so that if there is no (reliable) data from ExternalNav (T265) it uses optical flow instead? Or do I need to write this into the Python script and send data in the MAVLink message from different sources?

I have been trying to develop an indoor autonomous drone with my high school students for a long time now, with no success.



What did you mean when you wrote “fallback measures”? Is there built-in support for this in ArduPilot?

This is an active topic for further development regarding the T265. When there is a human pilot ready to take back control, “fallback measures” can be an optical flow sensor + rangefinder that allow precise indoor flight in Loiter / AltHold mode, given that the pilot can react before a crash.

Obviously, we should aim for a more autonomous solution. Currently the T265 gives feedback on how reliable its data is in terms of covariance (see this GitHub issue for some basic info) and confidence level (HIGH/MEDIUM/LOW/FAIL), both of which are used by the Python script and ArduPilot (currently supported in EKF3, see here) to handle such failure cases better. For more details, I suggest looking into the EKF wiki page.

Nonetheless, the feedback data from the T265 is discrete (4 predetermined values instead of a range of values), and support for extending it to the more general case is unlikely (see the end of the aforementioned GitHub issue), so there is a limit to what extent we can cover the failure cases. For now, a safety pilot in the loop is still my recommendation.
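One common way to use that discrete confidence level is to inflate the covariance reported to the EKF as confidence drops, so the filter weights the vision data less. The sketch below is illustrative only: the baseline variance and the scale table are assumptions, not the values used by any official script, and `tracker_confidence` follows the librealsense convention (0 = Failed … 3 = High).

```python
# Sketch: scale the variance reported to ArduPilot by T265 tracker confidence.
# The baseline and the scale table below are illustrative assumptions.

COVARIANCE_BASE = 0.01  # assumed baseline variance (m^2) at HIGH confidence

# librealsense pose_data.tracker_confidence: 0=Failed, 1=Low, 2=Medium, 3=High
CONFIDENCE_SCALE = {3: 1.0, 2: 10.0, 1: 100.0, 0: 10000.0}

def vision_variance(tracker_confidence):
    """Diagonal variance to put in the VISION_POSITION_ESTIMATE covariance
    field; unknown confidence values are treated as a failure."""
    scale = CONFIDENCE_SCALE.get(tracker_confidence, CONFIDENCE_SCALE[0])
    return COVARIANCE_BASE * scale
```

With EKF3's handling of the reported covariance, a LOW or FAIL reading then effectively tells the filter to distrust the camera, rather than feeding it bad data at full weight.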


@LuckyBird Thank you for this amazing blog! I really appreciate it!
I have some questions:

  1. Why does the autopilot raise the message BAD COMPASS HEALTH even though the yaw angle is provided by the tracking camera?
  2. Did you perform a loiter flight test without any other altitude sensor, relying only on the T265?
  3. Did you test EKF3?


@LuckyBird’s blogs are the essence of this wiki.

And as you can read:

For ArduPilot-4.1 (and later):

If you wish to use the camera’s heading:

If you wish to use the autopilot’s compass for heading:

  • COMPASS_USE = 1 (the default)
  • EK3_SRC1_YAW = 1 (Compass)
  • RC7_OPTION = 80 (Viso Align) to allow the pilot to re-align the camera’s yaw with the AHRS/EKF yaw before flight using auxiliary switch 7. Re-aligning yaw before takeoff is a good idea, or loss of position control (aka “toilet bowling”) may occur.
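The parameters quoted above can also be set from a companion computer instead of a GCS. A minimal sketch with pymavlink follows; `master` is assumed to be an open `mavutil` connection, and only the three parameter names/values listed above are taken from the quote.

```python
# Sketch: apply the compass-for-heading parameters from a companion computer.
# `master` is assumed to be an open pymavlink connection.

COMPASS_HEADING_PARAMS = {
    "COMPASS_USE": 1,    # use the autopilot's compass (the default)
    "EK3_SRC1_YAW": 1,   # EKF3 yaw source 1: compass
    "RC7_OPTION": 80,    # aux switch 7: Viso Align
}

def apply_params(master, params=COMPASS_HEADING_PARAMS):
    """Send each parameter via PARAM_SET."""
    from pymavlink import mavutil
    for name, value in params.items():
        master.param_set_send(name, float(value),
                              parm_type=mavutil.mavlink.MAV_PARAM_TYPE_REAL32)
```

In practice you would also read back each parameter (PARAM_VALUE) to confirm the autopilot accepted it before flying.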

Hi Thien sir,
I’m at the last step to go autonomous with the T265, but when we run t265_tf_to_mavros.launch we get the error
“camera_odom_frame” passed to lookupTransform argument target_frame does not exist.
and it keeps echoing.
Also, when trying to connect to MP, the MP glitch didn’t happen, and when we set home the location didn’t change. The set-home and EKF didn’t respond (the quad symbol didn’t appear on the map), but in the MAVLink inspector and on the mini PC the confidence level keeps showing movement and the data keeps streaming.

That means the frame “camera_odom_frame” does not exist in the tf tree. Assuming you are using the realsense-ros driver, please check on that front to make sure the data is being published to tf and all the frame names are correct.

Without the data being published and received by ArduPilot, the later steps will not work. Hence, the first problem needs to be resolved before this one.

Hi all, I have some questions about the Intel T265 camera.

  1. What are the conditions and limitations of this camera in an outdoor environment (temperature, humidity, reliability)?
  2. What is the difference between using ROS and non-ROS?
  3. Does the image stream have a specific use (on a Raspberry Pi 3 or 4)?
  1. You can probably find that info in the T265 datasheet. If the detail you seek is not available, you can post your query on the librealsense repo for Intel support.
  2. Mainly depends on your preference. If you just want to have a drone up and flying, non-ROS is my recommendation, and you can also ask for support more easily on this forum. ROS support is handy when your application already involves ROS.
  3. The image stream can be useful for other applications such as precision landing, target detection and tracking, etc. The RPi3 does not have USB 3.0, so obtaining the image stream is not very stable and might cause the pipeline to fail sometimes.

Thanks @LuckyBird,
Yes, I checked its datasheet already. Could you please share your experience with outdoor tests (day or night)?
Another question: what happens if an obstacle appears in front of the vehicle?

My experiments so far have been in brightly lit environments (indoor and outdoor, but mostly indoor). Either extreme of illumination (direct sunlight or complete darkness) would likely be impossible for the T265, since it relies heavily on vision (the V in VI-SLAM) for accurate localization. In between, you can test the device in your target environment and see how it goes (i.e., just fly and record the data, without integrating with ArduPilot).

If the obstacle is too close and covers up the cameras’ field of view, the T265 may be able to relocalize itself once the FOV is clear (with default configuration). From the FAQ:

Can T265 re-localize after being kidnapped?
Intel® RealSense™ T265 can re-localize after kidnapping, providing there are some features in view which are in its internal map. In the case of a completely new environment, it will continue to provide relative pose data until absolute position can be re-established.

I found this thread, which might give you some info or a point of discussion with Intel.


Hi sir,
I have successfully installed this project and done a ground test with no problems; all connections run well, even over long durations. But when a manual flight test was done, a Bad Vision Position message appeared on Mission Planner within 5 minutes. Any suggestions would be appreciated. Thanks.