I am building some kind of agricultural vehicle.
After creating an extended Mission (zig-zag with 1.20 m row width over more than 2 hectares) to put the seeds/plants in, I thought of running that mission again to remove weeds. The plants look like this, and as you can imagine, being 2 cm off to the left or right with the tools in the ground would damage everything.
That's why I would have a companion computer visually calculating how far off the tool is. So my question is:
How could I influence the autopilot?
Would it react to a PWM signal on a specified port?
Should I feed the info in by “faking” a 360° lidar?
An onboard camera, RTK GPS, and an array of sensors allow it to identify crops and chart its own course of travel, as well as detect the presence of weeds among the crops. Once it recognizes a weed, two arms under the contraption lower themselves and apply a micro-dose of herbicide at its exact location, systematically targeting the detected weed without wasting any chemicals.
So using an RTK-GPS-based mission, you can use a Companion Computer with a trained weed-detection neural network to guide a 3-axis robot arm to inject herbicide at a very specific location and keep a map of it.
Thank you for your input. I have seen several such vehicles on the internet already, but that was not the topic of my question.
Spraying poison on the weeds is NOT an option.
My question was: how (technically, without going into detail) would my software on the companion computer exert its influence on the autopilot?
Please note that many of these systems use the 3-axis arm to remove weeds mechanically.
As I wrote before, the CC does not directly control the Rover while the FC is already running a Mission. So you have two options: either you break the mission and take control using MAVLink (more specifically pymavlink running on the CC), or you break the mission using a Lua script once the CC detects a ROI (Region of Interest) and run a specific mission over this area.
In both cases you can issue a command to restart the mission from the last break point, as in the sketch below.
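Here is a minimal pymavlink sketch of the first option. It assumes a serial link on /dev/ttyACM0 and leaves the actual GUIDED-mode control step as a placeholder; adapt the connection string, baud rate, and control messages to your setup:

```python
# Sketch: pause a running AUTO mission, take control in GUIDED, then resume.
from pymavlink import mavutil

# Connection string is an assumption; adapt to your CC-to-FC link.
master = mavutil.mavlink_connection('/dev/ttyACM0', baud=115200)
master.wait_heartbeat()

# Remember the current mission item so we can resume from it later
# (MISSION_CURRENT is part of ArduPilot's default telemetry streams).
msg = master.recv_match(type='MISSION_CURRENT', blocking=True)
resume_wp = msg.seq

# Break the mission: switching to GUIDED stops AUTO execution.
master.set_mode('GUIDED')

# ... here the vision code would steer the rover, e.g. with
# SET_POSITION_TARGET_LOCAL_NED velocity setpoints ...

# Resume the mission from the stored break point.
master.mav.mission_set_current_send(master.target_system,
                                    master.target_component,
                                    resume_wp)
master.set_mode('AUTO')
```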
Another option would be to use the AVOIDANCE controller, so the detection could be used in a repulsive mode (instead of attractive) to guide the vehicle; it is just a logic state you apply to the detection system. This mode has the advantage of complementing the existing avoidance sensors you may implement on the vehicle… especially if it is a 5-ton tractor. A rough sketch of this idea follows below.
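As a rough illustration, the CC could report the measured row offset as a fake obstacle through the MAVLink DISTANCE_SENSOR message, assuming the proximity driver is configured for MAVLink input (PRX1_TYPE = 2 on recent firmware). The distances, link address, and orientations here are placeholder values:

```python
# Sketch: present a lateral offset to ArduPilot's avoidance as a nearby
# obstacle, so the controller "repels" the rover back onto the row.
import time
from pymavlink import mavutil

# Link address is an assumption; adapt to your setup.
master = mavutil.mavlink_connection('udpout:127.0.0.1:14550')

def send_fake_obstacle(distance_cm, orientation):
    # orientation per MAV_SENSOR_ORIENTATION: 0 = forward,
    # 2 = 90 deg (right), 6 = 270 deg (left).
    master.mav.distance_sensor_send(
        int(time.time() * 1000) & 0xFFFFFFFF,       # time_boot_ms
        10,                                         # min_distance (cm)
        1000,                                       # max_distance (cm)
        distance_cm,                                # current_distance (cm)
        mavutil.mavlink.MAV_DISTANCE_SENSOR_LASER,  # sensor type
        1,                                          # sensor id
        orientation,
        0)                                          # covariance (unknown)

# Example: tool is drifting right, so report a close obstacle on the right.
send_fake_obstacle(50, 2)
```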
In case it helps, we've recently improved our MAVLink Interface wiki pages, which provide info on which messages ArduPilot supports and how it interprets them. The pages also include some examples that can be copy-pasted into the SITL simulator (via MAVProxy).
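For what it's worth, the snippets above can also be exercised against SITL before touching the vehicle; a minimal connection, assuming SITL's default TCP port, looks like this:

```python
from pymavlink import mavutil

# SITL listens on TCP port 5760 by default; the address is an assumption.
master = mavutil.mavlink_connection('tcp:127.0.0.1:5760')
master.wait_heartbeat()
print("Heartbeat from system %u component %u"
      % (master.target_system, master.target_component))
```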
OK, so after a night of thinking and not much sleep, I have to thank @ppoirier for his input on the Lua-script-driven “mission” in GUIDED mode. That's probably the coolest idea so far. Thank you!