I am trying to autonomously take off, fly to a particular waypoint, read a QR code from video, and fly back and land with an RC heli inside a field house. The problem is that there is no GPS signal inside the field house: the metal roof and walls effectively form a Faraday cage.
I was wondering whether this autonomous flight mission is still possible with reasonable accuracy with ArduPilot and no GPS (can it still resolve attitude and heading using the on-board sensors and filters alone)?
Maybe something like sonar-aided navigation would be preferable in this situation?
I am fairly new to this kind of stuff, so any insight would be appreciated.
In a nutshell: No. With the existing code, you absolutely cannot do navigation indoors without GPS. What you want to do could be done with a number of methods (SLAM, for example), but it would be a huge research project by itself.
Can anyone point me to some good references that explain the ArduPilot controller architecture and how it uses GPS to track the current position and the position of the next waypoint to feed as a reference input to the trajectory control loop?
I also want to know whether it is feasible to modify the controller/software so that it replaces the continuous GPS position input with a relative position estimate calculated from the 9-DoF sensor suite, with Kalman-filter error correction, if I set the take-off position as the (0,0) reference and the waypoint as a fixed position (x,y) relative to it. There will be error, since there is no GPS position feedback while the heli is flying to the waypoint, but does this seem possible, or will the errors be too large for the autopilot to guide the helicopter to the relative waypoint starting from take-off?
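To make the idea concrete, a minimal sketch of one dead-reckoning step is below: rotate the body-frame acceleration into the world frame using the current yaw estimate, then integrate twice. This is the "relative position from the 9-DoF suite" idea in its simplest 2D form; the function name and interface are illustrative, not ArduPilot code, and it ignores gravity compensation, tilt, and sensor bias, which are exactly where the trouble starts.

```python
import math

def dead_reckon_step(pos, vel, accel_body, yaw, dt):
    """Advance a 2D position/velocity estimate by one IMU sample.

    pos, vel        -- current (x, y) estimates in the world frame
    accel_body      -- (forward, left) acceleration in the body frame
    yaw             -- current heading estimate in radians
    dt              -- sample period in seconds
    """
    c, s = math.cos(yaw), math.sin(yaw)
    # Rotate body-frame acceleration into the world frame.
    ax = c * accel_body[0] - s * accel_body[1]
    ay = s * accel_body[0] + c * accel_body[1]
    # Integrate acceleration -> velocity -> position.
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel
```

Every error in `accel_body` or `yaw` is integrated and never corrected, which is why some external position reference (GPS outdoors, vision or a beacon indoors) is normally needed to bound the drift.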
I know I got a "No" answer for this, but I have heard from someone else that it may be possible (even with a system built from the ground up using an Arduino and an integrated sensor suite).
I would like to get some input from experienced users of the platform.
So first off, I don't know how the ArduPilot code does its waypoint tracking. There are numerous ways to do it, the simplest being to just point at the target and fly forward. When you are trying to track a path, you can basically draw a circle around yourself (for outdoor UAVs, on the order of maybe 5 m in diameter), and where that circle intersects the path is your heading.
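The circle-intersects-the-path idea is essentially pure pursuit. Here is a short sketch of it, assuming a straight path segment and a fixed lookahead radius; this is my own illustration, not ArduPilot's actual tracking code.

```python
import math

def pure_pursuit_heading(pos, a, b, lookahead):
    """Heading (radians) toward the point where a circle of radius
    `lookahead` around `pos` intersects the path line a -> b.
    Returns None if the path is farther away than the lookahead radius."""
    ax, ay = a
    bx, by = b
    px, py = pos
    dx, dy = bx - ax, by - ay          # path direction
    fx, fy = ax - px, ay - py          # vector from pos to path start
    # Solve |f + t*d|^2 = lookahead^2 for t along the path.
    A = dx * dx + dy * dy
    B = 2 * (fx * dx + fy * dy)
    C = fx * fx + fy * fy - lookahead ** 2
    disc = B * B - 4 * A * C
    if disc < 0:
        return None                    # circle does not reach the path
    t = (-B + math.sqrt(disc)) / (2 * A)   # forward-most intersection
    gx, gy = ax + t * dx, ay + t * dy      # goal point on the path
    return math.atan2(gy - py, gx - px)
```

For example, a vehicle sitting 1 m to the side of the path with a 2 m lookahead gets a heading that angles it back onto the path rather than straight along it, which is what gives pure pursuit its smooth convergence.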
There's nothing wrong with telling your heli to fly from (0,0) to (x,y). It's just that what the heli thinks is (x,y) is much different from what you think it is.
Technically, yes, you could turn the sensor suite into an INS navigation system, use a Kalman filter, and so on. The US military can put an ICBM warhead up the tailgate of a pickup truck using just INS, so I am sure something would work. The problem is still noise and vibration. If you don't believe me, play with an Arduino and an MPU6000. Your error just drifts off and off until it's basically useless. Vibration makes it worse.
I work in the MIT ACL, where we have a Vicon motion capture system to fly quads around indoors. The Vicon gives us position and velocity data, while the onboard controller handles attitude and rates. It's crazy accurate and crazy expensive. The next best thing is arguably vision-based navigation, and there is an army of PhD students working on that…
I have no idea. I'm here because I want to put my PixHawk on a Burgen Heli.
I'm just guessing here, but could you do a follow-me from phone to phone? And then place your phones in the area where you want to fly autonomously?
I don’t quite understand what you are saying.