I am looking into the feasibility of an assistive device for a blind kayaker. They would like to set waypoints down a river and receive instructions (sound/voice/haptic) to help them steer a safe course, avoiding obstructions and other craft. They will be with other people, so there would be a safety net (so to speak). At first glance it seems this system could provide the necessary backbone. The only difference is that rather than controlling motors (like on a boat) we would need to generate instructions. I am a competent Python programmer (with some C/C++) and have been using Raspberry Pis and BBBs for years, so I am not afraid of custom electronics and programming. Do you guys think this would be possible, or am I wasting my time?
This is a cool idea, but I do not think it will work on a river. I have run boats everywhere, and on rivers it is a whole different ballgame. A kayak in rapids seems impossible.
We’re certainly not talking about white-water stuff, just plodding along on a fairly large navigable river and maybe the odd lake. Not sure about the sea; that seems just too risky for a blind person… however, we will need to walk before we run.
What sort of accuracy do you realistically get with this system? Can it use a secondary GPS, for instance?
Realistically you can get a few cm in open areas with RTK, so if you could map a course on open water you would have a good position. Yes, two GPS units can be used for redundancy.
Please let me rephrase: nothing is impossible. In my opinion, “ArduBoat is not far enough along right now to navigate rivers” is a better way of saying this. Sorry.
Because collision avoidance is still being worked on for rovers.
Well, he doesn’t need it to control the kayak, only to provide feedback on how far off track the kayak is.
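To illustrate what that feedback boils down to, here’s a minimal sketch in plain Python (no ArduPilot code involved) of the cross-track distance from a leg between two waypoints. The coordinates are made up and the flat-earth projection is only an assumption that holds for short river legs:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in metres

def to_local_xy(lat, lon, ref_lat, ref_lon):
    """Project a lat/lon onto a flat plane centred on ref (fine for short river legs)."""
    x = math.radians(lon - ref_lon) * math.cos(math.radians(ref_lat)) * EARTH_RADIUS_M
    y = math.radians(lat - ref_lat) * EARTH_RADIUS_M
    return x, y

def cross_track_m(pos, wp_a, wp_b):
    """Signed distance (metres) of 'pos' from the leg wp_a -> wp_b.
    Positive means right of track, negative means left (looking along the leg)."""
    bx, by = to_local_xy(*wp_b, *wp_a)
    px, py = to_local_xy(*pos, *wp_a)
    leg_len = math.hypot(bx, by)
    if leg_len < 1e-6:
        return math.hypot(px, py)
    # 2D cross product of leg vector and position vector gives a signed area;
    # dividing by the leg length gives the perpendicular (cross-track) distance.
    return (px * by - py * bx) / leg_len

# Example: kayak position vs. a leg down the river (coordinates are made up)
wp_a = (51.5000, -0.1200)
wp_b = (51.5050, -0.1180)
kayak = (51.5020, -0.1195)
print(f"off track by {cross_track_m(kayak, wp_a, wp_b):+.1f} m")
```

The sign of that number is what you would hand to a left/right buzzer or a spoken “left/right” cue.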
Does ArduBoat use the L1 navigation controller?
You could potentially set it up as rudder-only, take that steering output, and convert it into haptic/audio feedback. The trick would be to make it smooth enough for human use.
In addition you would need to find out how far off track you are. I’m sure the info is available somewhere, but I’m not sure how to get it out of the controller at run time (a rough sketch of one possible route is below).
If I had more time, I’d be curious to set up a Pixhawk with a GPS/servo and see if you could get anything sensible out of it.
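For anyone who does try that Pixhawk/GPS/servo bench test, this is roughly the kind of bridge I have in mind: read the autopilot’s NAV_CONTROLLER_OUTPUT message over MAVLink (it carries xtrack_error and wp_dist) and turn it into cues instead of driving a rudder. The serial port, the thresholds and the sign convention below are untested assumptions:

```python
# Rough sketch of a ground-side bridge: read the autopilot's navigation output
# over MAVLink and turn it into spoken/haptic cues instead of driving a rudder.
# The connection string and thresholds are assumptions, not a tested setup.
from pymavlink import mavutil

master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
master.wait_heartbeat()

def cue_from_xtrack(xtrack_m):
    """Map cross-track error in metres to a simple cue.
    Sign convention assumed here: positive = right of track, so steer left."""
    if abs(xtrack_m) < 2.0:
        return "on track"
    side = "left" if xtrack_m > 0 else "right"
    strength = "hard " if abs(xtrack_m) > 8.0 else ""
    return f"{strength}{side}"

while True:
    msg = master.recv_match(type='NAV_CONTROLLER_OUTPUT', blocking=True, timeout=5)
    if msg is None:
        continue
    # xtrack_error and wp_dist are standard fields of NAV_CONTROLLER_OUTPUT
    print(cue_from_xtrack(msg.xtrack_error), f"- next waypoint {msg.wp_dist} m")
    # here you would hand the string to espeak / a vibration motor instead of print()
```

If you would rather feed back the rudder demand itself, the SERVO_OUTPUT_RAW message could be read the same way.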
If you are only looking for positional feedback, it seems to me a flight controller running ArduRover will do nothing for you. What about one of the navigational applications for the visually impaired?
Very interesting idea, and I think this is quite possible. ArduPilot could provide a lot of the infrastructure you’d want, like the EKF, the distance calculations, the navigation…
It’s a bit like the human is the “motor driver” (which is in the AP_MotorsUGV.h/.cpp files), but instead of sending PWM outputs you send voice commands… or maybe it needs to be one level higher, up at the navigation level, which outputs a desired turn rate and speed.
The kayaker would probably want regular updates of the relative heading and distance…
… right now we don’t have a “Notify” library that can verbalise words, although it’s been requested a few times.
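As a very rough illustration of that “human as motor driver” idea, here is a sketch (ordinary Python, not ArduPilot code) that turns a target bearing and distance into the kind of regular spoken update the kayaker would want. The 10-degree dead-band and the wording are arbitrary assumptions:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees 0-360."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def relative_heading(current_heading_deg, target_bearing_deg):
    """Turn needed, -180..+180 degrees; positive means turn right."""
    return (target_bearing_deg - current_heading_deg + 180.0) % 360.0 - 180.0

def spoken_update(turn_deg, distance_m):
    """Build a short phrase to hand to a text-to-speech engine."""
    if abs(turn_deg) < 10:
        steer = "straight on"
    else:
        steer = f"{'right' if turn_deg > 0 else 'left'} {round(abs(turn_deg) / 10) * 10} degrees"
    return f"{steer}, {int(distance_m)} metres to the waypoint"

# Example with made-up numbers: heading 85 deg, waypoint bears 120 deg, 140 m away
print(spoken_update(relative_heading(85.0, 120.0), 140.0))
# -> "right 40 degrees, 140 metres to the waypoint"
```

The distance could come from the same sort of spherical maths as bearing_deg, or from the wp_dist field in the MAVLink sketch earlier in the thread.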