Pixhawk Rover follow indoors

Hi all,

I’m a STEM educator working in a technical school, using Pixhawk as a jumping-off point for students to develop a product or concept.

One of my teams wants to make a rover (we use DAGU Wild Thumpers) that would follow a student around the hallways, toting their books or lunch, or similar.

Is there a solution using RF or Bluetooth to simply track a student carrying an enabled cellphone or an RF tracker like a Tile?

We can’t use GPS, as this is an indoor device. The concept is to keep a 2-meter buffer between the rover and the student, with the vehicle shadowing their movement to follow them to their classes.

Thanks for any suggestions

Could you use a beacon system for general positional information? Check this out:
http://ardupilot.org/rover/docs/common-marvelmind.html

And then run Follow mode over WiFi telemetry radios.

There are two parts to this, I guess: the position estimation, so the vehicle knows where it is, and then the follow part.

It will be tricky to get it working well without a position estimate. The best indoor position-estimation setup we have uses ROS with Cartographer to send local position estimates to ArduPilot’s EKF. I suspect this is more than you’re looking for, though.
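For completeness, the “send local estimates to the EKF” part is just a MAVLink message. In practice you’d probably bridge Cartographer through mavros, but the raw message looks roughly like this (the connection string is just an example, and the pose values would come from Cartographer):

```python
import time
from pymavlink import mavutil

# Example connection string - adjust for your telemetry link
master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
master.wait_heartbeat()

def send_pose(x, y, z, roll, pitch, yaw):
    """Feed an external local-position estimate (e.g. from Cartographer)
    into ArduPilot's EKF via VISION_POSITION_ESTIMATE."""
    master.mav.vision_position_estimate_send(
        int(time.time() * 1e6),  # timestamp in microseconds
        x, y, z,                 # position in metres (local NED frame)
        roll, pitch, yaw)        # attitude in radians
```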

A simpler (but less accurate) estimate could be done with wheel encoders. Because you don’t really care about the vehicle’s absolute position, this might be enough.
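To illustrate the idea (ArduPilot can consume wheel-encoder data directly, so you wouldn’t write this yourself): dead reckoning on a differential-drive rover is just integrating the two wheel distances. A minimal sketch, with made-up geometry numbers:

```python
import math

# Hypothetical rover geometry - substitute your own measurements
WHEEL_RADIUS = 0.06    # metres
TRACK_WIDTH = 0.30     # metres between left and right wheels
TICKS_PER_REV = 360    # encoder resolution

x, y, heading = 0.0, 0.0, 0.0  # pose estimate in a local frame

def update_pose(left_ticks, right_ticks):
    """Integrate one encoder sample into the pose estimate."""
    global x, y, heading
    # Convert tick counts to distance travelled by each wheel
    left = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    right = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    dist = (left + right) / 2.0            # forward motion of the body
    dtheta = (right - left) / TRACK_WIDTH  # change in heading
    heading += dtheta
    x += dist * math.cos(heading)
    y += dist * math.sin(heading)
```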

For the follow part I’d probably use an OpenMV camera. If either of the two estimation methods is used, the camera could recognise the person (perhaps put an AprilTag on their back) and send SET_POSITION_TARGET_LOCAL_NED messages to the vehicle in Guided mode, and it might follow OK.
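To give an idea of what the camera (or a small companion computer) would send, here’s a minimal pymavlink sketch. The connection string and the offset calculation are assumptions, and on the actual OpenMV you’d use its MicroPython environment instead:

```python
from pymavlink import mavutil

# Example connection string - adjust for your serial port or telemetry link
master = mavutil.mavlink_connection('/dev/ttyACM0', baud=115200)
master.wait_heartbeat()

def send_target(forward_m, right_m):
    """Ask Guided mode to drive to a point offset from the vehicle.

    forward_m/right_m would come from the camera's estimate of where
    the tag is, minus the desired 2 m buffer.
    """
    master.mav.set_position_target_local_ned_send(
        0,                        # time_boot_ms (not used)
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,  # offsets relative to the vehicle
        0b110111111000,           # type_mask: use position only
        forward_m, right_m, 0,    # x, y, z offsets in metres
        0, 0, 0,                  # velocity (ignored)
        0, 0, 0,                  # acceleration (ignored)
        0, 0)                     # yaw, yaw_rate (ignored)
```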

If neither Cartographer nor wheel encoders are used, a not-very-good solution could perhaps be made by having the OpenMV camera send RC_CHANNELS_OVERRIDE MAVLink messages to the vehicle to essentially control the sticks. I really must say, though, that the final result will be rough.
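For reference, the override approach is only a few lines with pymavlink. The channel mapping and PWM values here are assumptions (check your RCMAP_* parameters), and the overrides must be re-sent regularly or the RC failsafe will trigger:

```python
from pymavlink import mavutil

master = mavutil.mavlink_connection('/dev/ttyACM0', baud=115200)
master.wait_heartbeat()

IGNORE = 65535  # UINT16_MAX leaves a channel untouched

def drive(steering_pwm, throttle_pwm):
    """Override steering (ch1) and throttle (ch3) - the typical Rover
    mapping, but check your RCMAP_* parameters."""
    master.mav.rc_channels_override_send(
        master.target_system, master.target_component,
        steering_pwm,   # ch1: ~1000-2000 us, 1500 = centred
        IGNORE,         # ch2
        throttle_pwm,   # ch3
        IGNORE, IGNORE, IGNORE, IGNORE, IGNORE)  # ch4-ch8
```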

Thanks for the response, Dave. I looked into the Marvelmind system briefly, but it requires a kind of closed environment, if I understand it correctly. The students were thinking it should be a virtual tether of sorts: keep a fixed distance from an RF tag or a visual element, like a neon-colored shape that the camera would track.

Thank you for your help. I will share this with my students. I wonder if this could be achieved with a computer-vision approach, since the rover doesn’t really need to know where it is as long as it is tracking a unique color/shape.
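For example, would something as simple as this OpenCV color-blob sketch be a reasonable starting point? (The HSV range is just a guess for a neon-green marker and would need tuning.)

```python
import cv2
import numpy as np

# Guessed HSV range for a neon-green marker - tune for the actual
# color and lighting conditions
LOWER = np.array([40, 120, 120])
UPPER = np.array([80, 255, 255])

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        blob = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(blob)
        # Horizontal offset from image centre -> steering error;
        # blob area is a crude stand-in for distance -> throttle error
        steer_err = (x + w / 2) - frame.shape[1] / 2
        print(steer_err, w * h)
```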

Hi,

Your question got me interested in trying something like this myself. I did a quick and basic test and was able to get my rover to follow a glyph/barcode, using a USB webcam connected to my rover’s onboard PC.

Lighting was one big issue I encountered: going from a room with windows and sunlight to a hallway with artificial lighting cut the recognition distance in half. Another issue was the FOV of the camera; once the glyph left the camera view, the rover was lost. A solution to both could be a camera gimbal that keeps the image feature centered while the rover turns/moves to keep the gimbal centered. This could also maintain a certain distance to the tracked feature: if the feature is carried by a person at waist height, for example, the rover could move to hold the gimbal at a certain tilt angle.

I just coded the control I described above, and it works really well for a first try. I simply took the gimbal angles and used them as throttle and steering inputs. I set zero throttle at a 45° up tilt: anything above that and the rover moved back, anything below and it moved forward. Pan controlled steering, of course. I had to be careful to keep the glyph in the camera view, because I did not implement a failsafe/timeout for the gimbal control, so the rover would just keep going in its last direction. I will try to optimize the system and test it outside.
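In code, the mapping (plus the timeout I still need to add) boils down to something like this; the gains and neutral angle here are illustrative placeholders rather than my tuned values:

```python
import time

NEUTRAL_TILT = 45.0   # degrees up: rover holds position
TILT_GAIN = 10.0      # PWM us per degree of tilt error (placeholder)
PAN_GAIN = 10.0       # PWM us per degree of pan error (placeholder)
TIMEOUT = 0.5         # seconds without a detection -> stop

last_seen = 0.0

def gimbal_to_rc(tilt_deg, pan_deg, detected):
    """Map gimbal angles to (steering, throttle) PWM, with the
    timeout failsafe my first test was missing."""
    global last_seen
    now = time.time()
    if detected:
        last_seen = now
    if now - last_seen > TIMEOUT:
        return 1500, 1500  # target lost: centre the sticks
    # Tilt below neutral means the target is far -> drive forward;
    # above neutral means too close -> back up
    throttle = 1500 + TILT_GAIN * (NEUTRAL_TILT - tilt_deg)
    steering = 1500 + PAN_GAIN * pan_deg
    # Clamp to the usual 1000-2000 us RC range
    throttle = max(1000, min(2000, int(throttle)))
    steering = max(1000, min(2000, int(steering)))
    return steering, throttle
```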
