Weed recognition and sub-waypoint planning in a mission

Hi guys,

we have an interesting question that some of you might want to reflect on. We have a drone with a multispectral camera that can produce maps with dots, hot spots, identified as weeds (using machine learning). These dots can be localized as GPS points, since we know the drone's GPS position, the drone's height, and the camera specs. Question: which software (ROS?) can be used to assign GPS coordinates to these hot spots in the image? If the image covered 10 by 14 m and the picture were roughly 1000 x 1400 pixels, each pixel would be 1 x 1 cm. Accuracy with RTK is about 2 cm.
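To give an idea of the geometry we have in mind, here is a rough sketch in plain Python (assuming a nadir-pointing camera aligned with the drone's heading and a flat-earth approximation; all names are purely illustrative) of how a hot-spot pixel could be turned into a lat/lon:

```python
import math

def pixel_to_gps(drone_lat, drone_lon, drone_yaw_deg,
                 px, py, img_w=1400, img_h=1000,
                 footprint_w_m=14.0, footprint_h_m=10.0):
    """Convert an image pixel (px, py) to an approximate WGS84 lat/lon.

    Assumes a nadir-pointing camera whose image top edge points along the
    drone's heading, and a footprint small enough that a flat-earth
    approximation stays well below the ~2 cm RTK accuracy.
    """
    # Metres per pixel (ground sample distance) in each axis, ~0.01 m/px here
    gsd_x = footprint_w_m / img_w
    gsd_y = footprint_h_m / img_h

    # Offset from the image centre, in metres, in the camera frame
    # (dx = right of flight direction, dy = forward along flight direction)
    dx = (px - img_w / 2.0) * gsd_x
    dy = (img_h / 2.0 - py) * gsd_y        # image rows grow downwards

    # Rotate the camera-frame offset by the drone's yaw into north/east
    yaw = math.radians(drone_yaw_deg)
    north = dy * math.cos(yaw) - dx * math.sin(yaw)
    east = dy * math.sin(yaw) + dx * math.cos(yaw)

    # Convert the metre offsets to degrees of latitude/longitude
    dlat = north / 111_320.0
    dlon = east / (111_320.0 * math.cos(math.radians(drone_lat)))
    return drone_lat + dlat, drone_lon + dlon
```

Calling pixel_to_gps with the centre pixel (700, 500) simply returns the drone's own position, which is a handy sanity check.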

In the ideal scenario the drone flies an automatic mission over the pasture land, stopping every 10 meters (using waypoints spaced 10 m apart). At each waypoint it shoots a multispectral picture, which is analyzed with machine learning within seconds and, as mentioned above, turned into GPS coordinates. So the next question: can there be a sub-mission routine at that waypoint position in which the drone visits all of these hot spots in that 10 by 14 m area and carries out a task (e.g. spray, drill, or perforate the weed)? The sub-waypoints should also be put in a least-travel sequence (using a travelling-salesman-type algorithm; see the sketch below). We are not clear whether such a possibility exists in ArduCopter, or whether people are already working on it.
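For the sub-waypoint ordering, a full travelling-salesman solver is probably overkill for the handful of spots in one image; we imagine something as simple as a greedy nearest-neighbour pass (a rough sketch, names purely illustrative):

```python
import math

def order_hotspots(start, hotspots):
    """Greedy nearest-neighbour ordering of (lat, lon) hot spots.

    Not an optimal travelling-salesman solution, but for the handful of
    points expected in a 10 x 14 m image it is fast and usually close enough.
    """
    def dist(a, b):
        # Flat-earth distance in metres; fine over a few metres
        dlat = (a[0] - b[0]) * 111_320.0
        dlon = (a[1] - b[1]) * 111_320.0 * math.cos(math.radians(a[0]))
        return math.hypot(dlat, dlon)

    remaining = list(hotspots)
    route, current = [], start
    while remaining:
        nxt = min(remaining, key=lambda p: dist(current, p))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route
```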

An alternative is of course that all hot spots from each image are collected and only later, i.e. in a second flight, an automatic waypoint-generation mission takes place to visit all these hot spots and treat them (also using the same efficiency algorithm). This approach requires two flights, which could have some advantages, as it allows two different drones: one with only the multispectral camera and one with the treatment equipment.
In this scenario the question is: if we have a list of coordinates, how can we import those into Mission Planner to make a waypoint mission? Is there an easy way (a small app)?
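For reference, Mission Planner reads plain-text waypoint files in the QGC WPL 110 format, so perhaps something as simple as the sketch below would already do (the home line and the 2 m altitude are just placeholders; frame 3 is relative altitude and command 16 is NAV_WAYPOINT):

```python
def write_mission(filename, home, hotspots, alt=2.0):
    """Write a Mission Planner-compatible .waypoints file (QGC WPL 110).

    home     -- (lat, lon) used for the mandatory home line (index 0)
    hotspots -- list of (lat, lon) treatment points
    alt      -- relative altitude in metres for each waypoint (placeholder)
    """
    with open(filename, "w") as f:
        f.write("QGC WPL 110\n")
        # Index 0 is the home position (frame 0 = absolute altitude)
        f.write(f"0\t1\t0\t16\t0\t0\t0\t0\t{home[0]}\t{home[1]}\t0\t1\n")
        for i, (lat, lon) in enumerate(hotspots, start=1):
            # frame 3 = MAV_FRAME_GLOBAL_RELATIVE_ALT, command 16 = NAV_WAYPOINT
            f.write(f"{i}\t0\t3\t16\t0\t0\t0\t0\t{lat}\t{lon}\t{alt}\t1\n")
```

The resulting .waypoints file should load via the Load WP File button on Mission Planner's Flight Plan screen.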

Possibly we are being too ambitious, but let us know if you want to share any thoughts on this.

Best, Winfried

This can be done with a companion computer. It shouldn't be too hard with some pymavlink skills to process the points that come out of the machine learning analysis.
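As a rough idea (not tested; the connection string, altitude and coordinates are placeholders), the companion computer could switch the copter to GUIDED and push each detected point like this:

```python
from pymavlink import mavutil

# Connect to the flight controller over the companion computer's serial link
# (adjust the device and baud rate for your wiring)
master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=921600)
master.wait_heartbeat()

# Switch to GUIDED so the companion computer can command positions
mode_id = master.mode_mapping()['GUIDED']
master.mav.set_mode_send(
    master.target_system,
    mavutil.mavlink.MAV_MODE_FLAG_CUSTOM_MODE_ENABLED,
    mode_id)

# Fly to one hot spot at 2 m above home (lat/lon are example values)
lat, lon, alt = 52.000123, 5.000456, 2.0
master.mav.set_position_target_global_int_send(
    0,                       # time_boot_ms (not used)
    master.target_system,
    master.target_component,
    mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT_INT,
    0b0000111111111000,      # type_mask: use position only
    int(lat * 1e7), int(lon * 1e7), alt,
    0, 0, 0, 0, 0, 0, 0, 0)  # velocities, accelerations, yaw ignored
```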

regards,

Corrado

Dear Corrado, thanks for the swift reply! Interesting comment; now we should elaborate on it. What level of companion computer do we need: Raspberry Pi, ODROID, or higher? Is this Python/MAVLink combination easy to master? It would be interesting to know what kind of effort it would take to build such a system, with the machine learning data as the basis. Best, Winfried drone4agro.com

Pymavlink is the Python library you will need to communicate with and interface to the flight controller from any companion computer. You could also look at using MAVROS (I haven't used it yet), which uses pymavlink/the MAVLink protocol underneath and provides another layer of abstraction for ease of use; it can save a lot of time when it comes to implementing ML algorithms and interconnecting different ROS modules on the companion computer.

Thanks, Notorious7! I guess it is now time to get a programmer to start working on it!

Cheers, Winfried