Thanks for the confirmations and sorry for the late reply.
The representation of the data plays a big role here. In my opinion, treating the translation and rotation separately is not the least troublesome solution.
OK, I think I understand. I guess another problem is that I can’t recover the correct rotation order between M and Q. I think it makes a difference which angle is applied in which order to arrive at the correct result.
What I would recommend is to use the 4x4 homogeneous transformation matrix, so that the translation and the rotation are dealt with at the same time.
To be honest: I already tried that, to no avail. As I said: I must have made the same mistake again and again.
Following your notation, we will need to multiply the inverse of the matrix at M with the matrix at Q to find Q’, the relative translation and rotation of Q in the frame defined at M. The order of multiplication is important.
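Expressed as a quick NumPy sketch (the matrices below are made-up examples to illustrate the operation, not your actual data):

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Example poses: M is rotated 90 degrees about Z and shifted; Q is a pure translation.
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
T_M = make_pose(Rz90, [1.0, 2.0, 0.0])
T_Q = make_pose(np.eye(3), [2.0, 2.0, 0.0])

# Relative pose of Q expressed in the frame defined at M.
# Order matters: inv(M) @ Q, not Q @ inv(M).
T_rel = np.linalg.inv(T_M) @ T_Q

rel_translation = T_rel[:3, 3]   # Q's position as seen from M
rel_rotation = T_rel[:3, :3]     # Q's orientation as seen from M
```

A useful sanity check is that composing M with the relative pose must give back Q, i.e. `T_M @ T_rel == T_Q`.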
OK, I have read something like this elsewhere already, and I’m sure I tried that too.
I will try again and report. Thank you so much for your help.
Yep, that works perfectly. You are a genius! Many, many thanks for your help. I was close, but not close enough, and even now I lost the overview several times.
One observation, though: the Euler angles seem to have a different orientation than a mathematical rotation. A left turn makes the angle negative, a right turn positive. Same for pitch: up positive, down negative. And roll: rotation over the left wing negative, over the right wing positive. I think that is OK.
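For reference, here is a common way to recover yaw/pitch/roll from the 3x3 rotation block of the transform. The Z-Y-X (yaw-pitch-roll) convention used here is an assumption; whether a physical left turn comes out positive or negative depends on whether the frame is Z-up or Z-down (aeronautical/NED frames are Z-down, which would explain the signs you describe):

```python
import math
import numpy as np

def matrix_to_yaw_pitch_roll(R):
    """Extract intrinsic Z-Y-X (yaw-pitch-roll) Euler angles, in radians,
    from a 3x3 rotation matrix. Assumes |pitch| < 90 deg (no gimbal lock)."""
    yaw = math.atan2(R[1, 0], R[0, 0])
    pitch = math.asin(-R[2, 0])
    roll = math.atan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll

# Example: a pure 30-degree rotation about Z.
a = math.radians(30.0)
R = np.array([[math.cos(a), -math.sin(a), 0.0],
              [math.sin(a),  math.cos(a), 0.0],
              [0.0,          0.0,         1.0]])
yaw, pitch, roll = matrix_to_yaw_pitch_roll(R)
```

In this Z-up right-handed frame the 30-degree turn comes out positive; in a Z-down frame the same physical turn would flip sign.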
Then: for a 2D representation, I think it would be best to provide just the coordinates in the Euler XY plane. I guess calculating the Euclidean distance is not helpful. What do you think?
I observed a strange up-and-down movement in the Euler Z coordinate. Just a couple of cm, but it changes while the originating Y coordinate remains stable… Strange.
Needs more tests.
EDIT:
Yes, this is really strange: while x, y, pitch, roll and yaw of the result are absolutely plausible, I’m having a little problem with the Euler Z coordinate. It increases with a negative sign if I move away from M forward along the X axis, and returns to 0 if I return to M. If I pass M in the opposite direction, it decreases with a positive sign… This is additionally strange, since I can see that the pose’s Y coordinate (which would be the Euler Z * -1) is not changing nearly as significantly. Can I reach you somehow privately by mail, so as not to pollute this thread any longer?
Great to see your drone with the Jetson Nano. I have currently built a drone with a Jetson Nano as well, but I have an issue. When I power the Jetson Nano from a 3S LiPo battery, it shuts down as soon as I run all the ROS nodes with the neural networks and provide a video stream from the camera. It works fine when powered from the wall socket. How are you converting power from the LiPo for your drone?
Hello,
If you look at the picture, you can see the dedicated 5A UBEC on the left side of the Cube. It powers the Nano through the barrel jack.
Hi @Alexandr_Skachkov, check this web site for more info, and do not forget to add a jumper on the J48 connector. If you want more power for the AI, this helped me a lot: sudo nvpmodel -m 0 to get the 10 W mode. After the above setting, it runs smoothly with YOLOv3 and a Raspberry Pi camera.
Hi,
This is excellent for indoor navigation. I am trying to implement VIO with an Intel RealSense. My hardware consists of an Omnibus F4 V6 Nano running the latest Copter version 4.0.3 (I tried previous versions as well), an RPi 3B, and a RealSense T265 (facing forward, USB port to the right) mounted on an F450. The RealSense tracks perfectly when walking the drone in a grid pattern manually. But as soon as I take off and fly the drone in Stabilize, ArduPilot raises an EKF Pos Vert Variance error. I have tried adding a lidar to the system, but with no success. I have no dataflash logs from the FC, but I do have telemetry logs. I have tried the ROS implementation on the Pi as well as on a Jetson Nano. I have also tried a new Intel RealSense, to no success. Any help would be greatly appreciated.
Link to tlog: https://drive.google.com/open?id=1_WqqTnQoG2rWYPqUbvrwGt4c4WG3PMJ_
Should I open a new issue?
Hi @syedM, from the tlog it looks like the pose data from the T265 diverged right after you took off, so the issue is the localization data. The position rapidly increased to very large values, which of course did not match the IMU and got rejected by the EKF (hence the error for both vert and horiz).
The problem might come from the environment (too few features / too challenging for VISLAM) or from the vehicle (too much vibration, or something blocking the camera’s view, for example); more likely the latter, since you had good handheld tests before the flight.
Thank you for the prompt reply. I will put the system on a different frame with possible vibration damping and try again. The vibrations are a valid culprit. As for the concern about the environment, the RealSense reports high confidence after walking the grid. I will report back as soon as I change the frame.
Sorry for this long message but I would like to share my experience and would appreciate any feedback on this.
I followed the VIO tracking camera for Non-GPS Navigation article on the ArduPilot wiki (https://ardupilot.org/copter/docs/common-vio-tracking-camera.html). I am using a Pixhawk Cube, a Raspberry Pi, an Intel T265 and a TeraRanger Tower Evo lidar on an X hexarotor airframe. I did an outdoor test flight with VIO where the Intel T265 was facing down with the USB port to the right. While arming in Loiter mode, QGC was giving me a HIGH GPS HDOP prearm failure, so I switched into Alt Hold, armed, and then switched back to Loiter. Initially the vehicle was hovering and maintaining its position perfectly. After I did some roll, pitch and yaw, the vehicle lost its position and started drifting, so I switched back to Alt Hold and tried to get it back. Before landing, the vehicle completely lost control and started flying away from me. Eventually it crashed into a tree.
I did some more research on the crash, and I believe my mistake could be that the camera was facing straight down and not tilted. Was ignoring the GPS HDOP error a bad idea? Also, ArduPilot switched from EKF2 to EKF1 during the flight; could that be the reason my vehicle flew away?
About the HIGH GPS HDOP prearm failure: do you have a GPS onboard?
If you look at the parameters, we disable the GPS for this type of application (as the title says: GPS-denied), so make sure that is the case and you should not get this type of message.
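From memory, the GPS-related parameters from that wiki page include the following (verify against the linked article before flying, since the full list there is longer):

```
GPS_TYPE      = 0   # no GPS driver
EK2_GPS_TYPE  = 3   # EKF2: never use GPS
AHRS_EKF_TYPE = 2   # use EKF2
EK2_ENABLE    = 1
EK3_ENABLE    = 0
```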
Taking off with a prearm error is a recipe for a catastrophic flight. When the EKF is diverging (and will ultimately crash), you should revert to very basic manual control and land ASAP.
It’s very interesting that you supplied the script log (@LuckyBird I guess that should be a requirement for support). Looking at it, it seems that the RPi might be “skipping a beat”, making the EKF get lost. Generally it would get into EKF failsafe mode and land, but sometimes, as you have experienced, it gets into limbo and tries to fly to the moon… or to the center of the earth ;-(
Recommendation: make sure you have all greens before takeoff; you may “walk the bird” while looking at the Mission Planner screen to make sure the system works OK. Fly over well-textured terrain with good features on the ground, and beware of direct sun exposure.
Lastly, try to reduce the T265 rate to 15-20 Hz (INFO: Using default vision_msg_hz 30) to give your RPi a chance to process all the poses and send them correctly over MAVLink.
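To illustrate why the lower rate helps, here is a minimal throttling sketch (the `send_pose` callback is a stand-in for whatever actually pushes the vision message out over MAVLink; it is not code from the real script):

```python
import time

VISION_MSG_HZ = 15.0            # reduced from the default 30 Hz
MIN_INTERVAL = 1.0 / VISION_MSG_HZ

_last_sent = float("-inf")      # timestamp of the last forwarded pose

def maybe_send_pose(pose, send_pose):
    """Forward a pose only if enough time has passed since the last send.
    Dropping intermediate poses is fine here: only the latest pose matters
    to the EKF, and a companion computer that can't keep up at 30 Hz will
    otherwise queue stale messages."""
    global _last_sent
    now = time.monotonic()
    if now - _last_sent >= MIN_INTERVAL:
        send_pose(pose)
        _last_sent = now
        return True
    return False

# Usage sketch: feed poses as fast as the camera produces them;
# only ~15 per second actually go out.
sent = [maybe_send_pose(p, lambda pose: None) for p in range(5)]
```

Five back-to-back calls arrive well inside one 1/15 s interval, so only the first one is forwarded and the rest are dropped.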
Thank you so much for your feedback. Yes, I have disabled the GPS in the settings. I remember reading somewhere in this forum that to stop getting the HDOP error you can manually enter the EKF origin value; that is what I was planning to do next if I saw that error. I agree on reducing vision_msg_hz. What do you think of the camera orientation? It was facing down. I read in GitHub topics that the camera should be slightly tilted to get a correct yaw angle. Also, did my obstacle avoidance fail because the EKF switched from 2 to 1?
Having the T265 tilted upwards during initialization helps. If you look at the pictures in the latest blogs, you can see the quad’s front legs resting on a piece of wood for that purpose.
To set the EKF home, you can send it through the script, or right-click on the Mission Planner map and hit EKF home.
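For the “through the script” route, setting the EKF origin amounts to sending a SET_GPS_GLOBAL_ORIGIN MAVLink message, which takes latitude/longitude as integers in degrees * 1e7 and altitude in millimetres. A sketch of the unit conversion (the coordinates are placeholders, and the pymavlink send call is shown as a comment since it needs a live connection):

```python
def encode_origin(lat_deg, lon_deg, alt_m):
    """Scale an origin to the integer units SET_GPS_GLOBAL_ORIGIN expects:
    latitude/longitude in degE7, altitude in millimetres."""
    return (int(round(lat_deg * 1e7)),
            int(round(lon_deg * 1e7)),
            int(round(alt_m * 1000)))

# Placeholder coordinates, not a real takeoff point.
lat_e7, lon_e7, alt_mm = encode_origin(45.4642, 9.1900, 120.0)

# With a live pymavlink connection this would be sent roughly as (sketch):
#   master.mav.set_gps_global_origin_send(master.target_system,
#                                         lat_e7, lon_e7, alt_mm)
```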
@LuckyBird @ppoirier I did some tests to compare the T265 ROS vs non-ROS. It seemed to me that with ROS I see a more stable Loiter and a smoother circle than with non-ROS. This impression is confirmed by the plot you can see below.
Using the non-ROS bridge (t265_to_mavlink.py) I see some “holes” in the VISP messages, which match what I see in the real copter trajectory: at some points the copter slows down and then starts again.
Using ROS (realsense2_camera, mavros, vision_to_mavros) I see a smooth trajectory when doing circles.
I am using an RPi 3 with Ubuntu 16.04, and I updated all the software (librealsense, realsense-ros, vision_to_mavros) and dependencies about one week ago.
Hello @anbello, it seems like the system has an intermittent communication issue while running the script. What transmission rate have you selected? I suggest you reduce it to 10-15 Hz, as the RPi may be skipping messages.
Would you have the script’s message output available to share?