I have seen some implementations of the T265 on rovers, so it’s definitely possible. Keep in mind that this is a vision-based device. Here are my 2 cents:
Make sure that the T265 is not mounted too close to the ground, so that its field of view covers as much of the environment as possible.
Tracking quality and the USB connection are heavily affected by vibration, so damping might be necessary to limit the vibration exerted on the camera. The less vibration, the better.
Under rapid movement, tracking can be lost temporarily. However, the SLAM algorithm on the T265 can relocalize using a previously built map of the environment, so tracking can recover once the movement slows down. If you are running on the same track/environment, you can build a detailed map during testing, save (export) it, and reuse (import) it later on.
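For reference, the export/import workflow can be sketched with the pyrealsense2 wrapper. This is a hedged sketch adapted from the librealsense examples, not verified here: it assumes pyrealsense2 is installed and a T265 is attached, and the file name `map.raw` is my own choice. Note the map must be imported before the pipeline starts streaming.

```python
# Sketch: save the T265's internal localization map after a run,
# and re-load it later so the device can relocalize against it.
# Requires pyrealsense2 and attached T265 hardware to actually run.
import pyrealsense2 as rs

def save_map(path="map.raw"):
    pipe = rs.pipeline()
    cfg = rs.config()
    cfg.enable_stream(rs.stream.pose)
    profile = pipe.start(cfg)
    try:
        # ... drive around here to build a detailed map ...
        sensor = profile.get_device().first_pose_sensor()
        data = sensor.export_localization_map()  # raw map bytes
        with open(path, "wb") as f:
            f.write(bytearray(data))
    finally:
        pipe.stop()

def load_map(path="map.raw"):
    pipe = rs.pipeline()
    cfg = rs.config()
    cfg.enable_stream(rs.stream.pose)
    # Import must happen BEFORE the pipeline starts streaming.
    device = cfg.resolve(rs.pipeline_wrapper(pipe)).get_device()
    with open(path, "rb") as f:
        device.first_pose_sensor().import_localization_map(
            bytearray(f.read()))
    pipe.start(cfg)
    return pipe
```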
Another potential problem is the acceleration range. The datasheet for the Bosch BMI055 IMU used in the D435i and T265 says the component supports ±2g, ±4g, ±8g or ±16g, but according to the device datasheet the T265’s range is ±4g. My guess is that if you accelerate/decelerate beyond this range, the IMU will saturate and the SLAM might fail.
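To make the saturation concern concrete, here is a minimal Python illustration. The function names and the 0.99 margin are my own, not from any Intel API; it just shows why a spike beyond the configured range is invisible to the SLAM pipeline:

```python
# Hypothetical illustration of IMU saturation: accelerometer samples
# beyond the configured +/-4 g range get clipped at the range limit,
# so the SLAM front-end never sees the true acceleration.
G = 9.80665          # standard gravity, m/s^2
ACCEL_LIMIT = 4 * G  # assumed +/-4 g range of the T265's IMU

def saturate(accel_mps2):
    """Clamp a raw acceleration sample to the sensor's range."""
    return max(-ACCEL_LIMIT, min(ACCEL_LIMIT, accel_mps2))

def is_saturated(accel_mps2, margin=0.99):
    """Flag samples at (or suspiciously near) the range limit."""
    return abs(accel_mps2) >= margin * ACCEL_LIMIT

# A 6 g spike (e.g. a hard bump) is reported as only ~4 g:
print(round(saturate(6 * G), 2))      # 39.23 (= 4 g in m/s^2)
print(is_saturated(saturate(6 * G)))  # True
```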
The T265 requires a feature-rich visual environment nearby with low reflection. Water is a big challenge for visual odometry: it has few distinctive features and produces lots of sporadic light flashes from sun reflecting off the waves.
I appreciate your advice and @LuckyBird’s suggestion. Maybe it is not suitable for use on a boat, just as you say: 1. the lack of distinctive feature points, and the light flashes; 2. the vibration, and the need to keep some distance from the surface.
Thank you again.
Working on a long-range visual triangulation system, just like the early sailors did, could be an interesting project. You could use a long-range camera mounted on a precise azimuth control system to identify and triangulate from different shore features; this way you could localize and map (SLAM).
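The geometry behind that idea is classic two-landmark bearing triangulation. Here is a minimal, self-contained sketch (function name and conventions are my own, not from any library); with more than two landmarks you would do a least-squares fit instead:

```python
import math

def triangulate(l1, l2, b1, b2):
    """
    Estimate the observer's 2-D position from bearings to two known
    landmarks. Bearings are in radians from the +x axis (math
    convention), measured from the observer toward each landmark.
    """
    u1 = (math.cos(b1), math.sin(b1))
    u2 = (math.cos(b2), math.sin(b2))
    # Observer P satisfies P = L1 - t1*u1 = L2 - t2*u2.
    # Rearranged: t1*u1 - t2*u2 = L1 - L2, a 2x2 linear system.
    a, b = u1[0], -u2[0]
    c, d = u1[1], -u2[1]
    det = a * d - b * c
    if abs(det) < 1e-9:
        raise ValueError("bearings are parallel; no unique fix")
    rx, ry = l1[0] - l2[0], l1[1] - l2[1]
    t1 = (rx * d - b * ry) / det
    return (l1[0] - t1 * u1[0], l1[1] - t1 * u1[1])

# A lighthouse at (0, 10) seen at 90 deg and a tower at (10, 0)
# seen at 0 deg place the observer at the origin:
x, y = triangulate((0, 10), (10, 0), math.pi / 2, 0.0)
print(x, y)
```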
Thank you! It worked well using the USB method. However, I also tried with the serial pins, but there is no connection. I’m using the GND, TX and RX ports, but I’m not sure about the tty port assigned on the Up board.
Using USB, I can establish the connection and start sending the pose estimates from the T265 camera, but in the QGroundControl interface on Ubuntu there is no widget button to open the MAVLink Inspector and check whether ArduPilot is receiving position data by viewing the VISION_POSITION_ESTIMATE topic. In Mission Planner, Ctrl+F is not working either. I downloaded both from the official sites. On Windows I am able to open the MAVLink Inspector, but since I’m using USB I cannot connect both the companion computer and my laptop. Any advice?
For QGroundControl you can use MAVLink Analyzer widget, see here.
@LuckyBird I believe widgets were removed in the latest versions (I’m on master)
@LuisVale Thanks for the info. Do you know of any alternative functions in the latest version for it?
I just downloaded the executable file from here and the widgets are still there, so @Eduardo92 you can still give it a try.
Thank you! It is working now! I was also able to configure the Bluetooth HC-05 module! The pose estimate is being received by ArduPilot, so I’m ready to fly! Just one more thing: I’m starting the modules over ssh from my laptop. Is there a way to get the rviz window through ssh, similar to the Python windows?
I am trying to use the T265 camera with a Raspberry Pi 3 for a project similar to the one described in this post, but I get the error below after some minutes:
19:57:13.679  [E] Device-63B0: FW crashed - got error in interrupt endpoint thread function: status = -1 (LIBUSB_ERROR_IO), actual = 0
19:57:13.932  [E] Device-63B0: State [ACTIVE_STATE] got event [ON_ERROR] ==> [ERROR_STATE]
19:57:13.932  [E] Device-63B0: Entered state [ERROR_STATE]
I tried with Ubuntu MATE and Ubuntu Server, as well as updating the RealSense library (I used v2.23, v2.24 and v2.27), but got the same error.
Did anybody face the same issue?
Hi @jcgarciaca, this might be a USB cable issue.
- Are you using the original USB cable that came with the T265?
- Are you using a USB hub? Can you try connecting directly to the computer’s USB3.0 port and see how it goes?
Hi @LuckyBird, thank you for your answer.
- Yes, I am using the original cable
- I am not using any hub; I’m connecting directly to the Raspberry Pi’s USB port. I tried with my PC (USB 3.0 port) and it doesn’t show this error. However, the Raspberry Pi 3 only has USB 2.0 ports.
@jcgarciaca are you running any applications that require images? If so, USB 2.0 is not enough, since USB 3.0 is mandatory to access the camera’s image streams, and the RPi will crash. For the use case described in this blog we are not using images, only the pose stream, so the RPi is sufficient.
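For what it’s worth, you can make sure no image stream is ever requested by configuring the pipeline for pose only. This is a hedged sketch: it assumes the pyrealsense2 package is installed and a T265 is attached, so it will not run without the hardware:

```python
# Sketch: request only the pose stream from the T265, so no fisheye
# image data has to cross the (USB 2.0) bus.
# Requires pyrealsense2 and attached T265 hardware to actually run.
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)  # pose only; no image streams

pipe.start(cfg)
try:
    frames = pipe.wait_for_frames()
    pose = frames.get_pose_frame()
    if pose:
        data = pose.get_pose_data()
        print("position:", data.translation)
finally:
    pipe.stop()
```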
Hi @LuckyBird and @Matt_C,
I am not using images either, only the pose stream. But I can’t get it working on my RPi 3.
I just upgraded to an RPi 4 and it seems the error disappeared.
How are you powering the companion computer?
I used this power module to convert from a 3S battery to 5V for the RPi.
Thank you for your answer, I just ordered a couple of converters.
I have another question: which telemetry module are you using? I’m trying to use the 3DR radios at 915 MHz, but VISION_POSITION_ESTIMATE drops to 2-5 Hz.
I am using this CUAV 915 MHz radio and can get up to 20 Hz, but occasionally the rate drops below 5 Hz as well.
Those are pretty similar to the ones I have. Which parameters are you using?
Only Bluetooth gives me a stable rate.
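To compare links objectively, you can measure the arrival rate of VISION_POSITION_ESTIMATE messages yourself. Here is a minimal, library-agnostic Python sketch (the class name and window size are my own); you would call `tick()` once per received message, whatever MAVLink library you use:

```python
from collections import deque

class RateMeter:
    """Estimate message rate (Hz) over a sliding window of timestamps."""

    def __init__(self, window=50):
        self.stamps = deque(maxlen=window)

    def tick(self, t):
        """Record one message arrival at time t (seconds)."""
        self.stamps.append(t)

    def hz(self):
        """Average rate over the current window, 0.0 if too few samples."""
        if len(self.stamps) < 2:
            return 0.0
        span = self.stamps[-1] - self.stamps[0]
        return (len(self.stamps) - 1) / span if span > 0 else 0.0

# Feed it a synthetic 20 Hz stream (one message every 50 ms):
meter = RateMeter()
for i in range(50):
    meter.tick(i * 0.05)
print(round(meter.hz(), 1))  # 20.0
```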
To access the Raspberry Pi remotely, are you using VNC?
I’m not using VNC; I ssh into the RPi over WiFi and run the commands in the terminal.
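On the earlier question about getting the rviz window through ssh: X11 forwarding usually works for this, assuming an X server is running on your laptop. A generic sketch (the hostname is a placeholder):

```shell
# Forward X11 so GUI apps started on the RPi display on your laptop.
# -X enables X11 forwarding; -C adds compression, which helps over WiFi.
ssh -X -C pi@raspberrypi.local

# Then, in the remote shell, GUI tools open in local windows:
rviz
```

Be aware that rviz renders with OpenGL, which can be very slow over forwarded X11; VNC or running rviz locally against a networked ROS master are common alternatives.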