The project goal is cooperative human/machine navigation and user interface development.
Plan a route along a paved nature trail; AI behaviors can interrupt or modify the mission plan, but any input from the onboard joystick overrides everything else and is obeyed immediately. Eventually I want the AI to recognize common navigation situations and generate a suggested mission plan. Examples: entering through a doorway, following the edge of a road, remaining centered in a hallway, reversing into an empty elevator.
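To make the override rule concrete, here is a minimal sketch of the command-arbitration idea, in the same Python the project already uses elsewhere. All names (`DriveCommand`, `arbitrate`, the source labels) are hypothetical, not part of ArduPilot or can2RNET:

```python
from dataclasses import dataclass
from typing import Optional

# Command sources in strictly increasing priority; the onboard joystick
# always wins, matching the "obeyed immediately" rule above.
PRIORITY = {"mission_plan": 0, "ai_behavior": 1, "joystick": 2}

@dataclass
class DriveCommand:
    source: str   # one of PRIORITY's keys
    speed: float  # normalized -1.0 (reverse) .. 1.0 (forward)
    turn: float   # normalized -1.0 (left) .. 1.0 (right)

def arbitrate(commands: list[DriveCommand]) -> Optional[DriveCommand]:
    """Return the single command to execute this control cycle:
    the one from the highest-priority source that produced input."""
    if not commands:
        return None
    return max(commands, key=lambda c: PRIORITY[c.source])
```

So if the mission plan and the joystick both produce input in the same cycle, `arbitrate` returns the joystick command; the AI and the planner only drive when the joystick is silent.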
I use a wheelchair to limit weight bearing and shearing of my injured lumbar spine. This requires being reclined so far that I have difficulty seeing forward to navigate. An overhead touchscreen, forward camera, joystick, and seating buttons are my interface to the world.
Seeking overall project guidance to remain compliant with the developer code of conduct while employing ArduRover on a human-occupied wheelchair.
- ArduPilot is NOT certified for use in applications where ArduPilot is effectively in control of human lives. Members of the development team must not knowingly assist in projects where ArduPilot will be in control of human lives. “In control of human lives” includes but isn’t limited to manned aircraft.
At no point will ArduPilot be "in control of human lives". I will either be onboard, ready to take over manual control in an instant, or the chair will be unoccupied (verified by a seat sensor).
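That safety rule is simple enough to state as code. A minimal sketch of the interlock, assuming a boolean seat sensor and some "operator ready" signal such as a deadman button or heartbeat (both names hypothetical):

```python
def autonomy_permitted(seat_occupied: bool, operator_ready: bool) -> bool:
    """Gate autonomous control so ArduPilot is never 'in control of
    human lives': autonomy is allowed only when the chair is empty
    (seat sensor reads unoccupied) or an occupant has confirmed,
    e.g. via a deadman button, that they can take over instantly."""
    if not seat_occupied:
        return True          # unoccupied chair: autonomy is OK
    return operator_ready    # occupied: require a ready operator
```

The point of writing it this way is that the interlock sits above ArduPilot: if the check fails, autonomous commands are simply never forwarded to the chair.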
I am experienced building robots and wheelchairs, but new to ArduPilot. I explicitly hold harmless ArduPilot, its developers, and the forum members who assist in this noncommercial personal project.
A BeagleBone Blue is on its way to serve as the FC; let me know if I should be using something else. The CC is a 4 GB Jetson Nano with an Intel RealSense D455 depth camera and ultrasonic distance sensors for obstacle identification and avoidance. The CC is attached to a portable touchscreen, which will be the main user interface besides the wheelchair joystick.
The wheelchair is a 2020 Permobil M3, which uses the R-Net (CAN) control system and a 24 V battery. It has already been successfully remote-controlled over WiFi using the Python software from GitHub - redragonx/can2RNET (code and documentation to control power wheelchairs with R-Net electronics).
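For anyone wondering what the bridge to the chair looks like at the byte level: on Linux, can2RNET talks to the R-Net bus through raw SocketCAN, where every classic CAN frame is a fixed 16-byte `can_frame` struct (u32 ID, u8 DLC, 3 pad bytes, 8 data bytes). A sketch of packing such a frame is below; the CAN ID and payload in the example are placeholders only, since the real R-Net joystick encoding must come from can2RNET, not from this sketch:

```python
import struct

CAN_EFF_FLAG = 0x80000000  # extended (29-bit ID) flag, per Linux SocketCAN

def pack_can_frame(can_id: int, data: bytes, extended: bool = True) -> bytes:
    """Pack a Linux SocketCAN can_frame: u32 id, u8 dlc, 3 pad bytes,
    8 data bytes. The result can be sent with socket.send() on a raw
    AF_CAN socket bound to the chair's CAN interface."""
    if len(data) > 8:
        raise ValueError("classic CAN payload is at most 8 bytes")
    if extended:
        can_id |= CAN_EFF_FLAG
    return struct.pack("=IB3x8s", can_id, len(data), data.ljust(8, b"\x00"))

# HYPOTHETICAL example frame: the ID and two-byte payload below are
# placeholders, not a real R-Net joystick message.
frame = pack_can_frame(0x02000000, bytes([0x10, 0xF0]))
```

My plan is to have the CC translate arbitrated drive commands into frames like this, so from the chair's point of view nothing has changed since the WiFi remote-control experiment.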
When a GCS is needed, it will be either a Raspberry Pi 400 with a touchscreen or a Samsung S9 running in DeX mode on a touchscreen.
Community input is appreciated to help select the most appropriate GCS, configure the onboard driving interface, and write the necessary code to interface with the can2RNET library.
I’m so new at this that I don’t even have firm questions yet, but I know from searching “wheelchair” on the forum that the ethical issue would come up. While waiting for my FC to arrive, I hope we can tackle that and agree that this is an OK project. If anyone thinks this type of project would violate the developer code of conduct, let’s please talk about that.