I need to build a custom visual tracking script for DroneKit. It will incorporate two drones, one on each side of the subject. I figure this job could be an adaptation of the 'red balloon' script that is on the ArduPilot board.
If anybody is interested, I can give a few more details on the job, including build-out, equipment, desired result, and pay. I am in Southern California, but I am obviously willing to work with someone from anywhere.
I have decided to give a more in-depth description of what I am trying to achieve.
I want to have two drones, one on each side of a subject, independently track and follow the target based on a visual lock. I would like these drones to be communicating with QGroundControl for general tracking and safety.
The subject will be a cyclist. As the cyclist rides around the track, I want the drones to hold a 90-degree profile on either side of the cyclist. The best option I have found so far is to use DroneKit with an OpenCV script, along the lines of the sketch below. If anybody has a better option, I would be very open to that.
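To make the idea concrete, here is a minimal sketch of the kind of loop I have in mind, loosely modelled on the red balloon approach: OpenCV finds a coloured marker in the frame, and DroneKit sends GUIDED-mode velocity setpoints to keep it centred and at range. The connection string, HSV colour range, gains, and target blob size are placeholder assumptions, not tested values.

```python
# Minimal sketch: colour-blob tracking with OpenCV driving GUIDED-mode velocity
# commands through DroneKit. Connection string, HSV range and gains are
# placeholder assumptions, not tuned values.
import cv2
from dronekit import connect, VehicleMode
from pymavlink import mavutil

vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)  # companion-computer link (assumed)
cap = cv2.VideoCapture(0)                                   # camera on the companion computer

LOWER_HSV = (0, 120, 120)     # marker colour range -- tune for the actual marker
UPPER_HSV = (10, 255, 255)
K_LATERAL = 2.0               # m/s per unit of horizontal image offset
TARGET_AREA = 15000           # blob area (pixels) that corresponds to the desired standoff

def send_body_velocity(vx, vy, vz):
    """Send a body-frame velocity setpoint (only valid in GUIDED mode)."""
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,
        mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,
        0b0000111111000111,           # type_mask: use only the velocity fields
        0, 0, 0,                      # position (ignored)
        vx, vy, vz,                   # velocity in m/s
        0, 0, 0, 0, 0)                # acceleration / yaw (ignored)
    vehicle.send_mavlink(msg)

vehicle.mode = VehicleMode('GUIDED')
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    m = cv2.moments(mask, binaryImage=True)
    if m['m00'] == 0:
        send_body_velocity(0, 0, 0)   # lost the marker: stop and hold
        continue
    cx = m['m10'] / m['m00']                                      # blob centroid (x)
    offset_x = (cx - frame.shape[1] / 2) / (frame.shape[1] / 2)   # -1 .. 1
    vy = K_LATERAL * offset_x                                     # slide sideways to re-centre
    vx = 0.5 if m['m00'] < TARGET_AREA else -0.5                  # crude range keeping
    send_body_velocity(vx, vy, 0)
```

This only handles centring and a rough standoff distance; holding a true 90-degree profile as the cyclist turns would need heading information on top of this, which is part of what I'm hoping to hire out.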
I have some options for a system running in parallel that will film and transmit the video footage over the 802.11 wireless protocol to a base station computer. This way, you can see the live feeds from both drones.
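As a rough illustration of that base-station link (not a recommendation over proper FPV or H.264 streaming gear), one quick way to push a preview feed over the Wi-Fi link is to JPEG-encode frames on the companion computer and send them over UDP. The IP address, port, and frame size below are assumptions.

```python
# Rough sketch: JPEG-over-UDP preview feed from the companion computer to the
# base station. BASE_STATION_IP, PORT and the frame size are assumptions.
import socket
import cv2

BASE_STATION_IP = '192.168.1.10'   # assumed base-station address on the 802.11 link
PORT = 5600

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (640, 360))                  # keep each datagram small
    ok, jpg = cv2.imencode('.jpg', frame, [cv2.IMWRITE_JPEG_QUALITY, 60])
    if ok and len(jpg) < 65000:                            # stay under the UDP datagram limit
        sock.sendto(jpg.tobytes(), (BASE_STATION_IP, PORT))
```

On the base station the frames can be decoded with cv2.imdecode; for anything beyond a preview, a proper H.264 stream (e.g. via GStreamer) would hold up much better over 802.11 than raw JPEG frames.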
I'm currently working with an Odroid XU4 as the companion computer for each of the Pixhawk 2s that are driving the drones.
One drone is a 7-inch quad, the second is a 15-inch quad. I have flown these on all sorts of manually piloted and semi-autonomous missions, but doing it with a visual lock has been outside my expertise.
Is there any chance someone might be interested in taking this up? I am open to suggestions on budgets.
Why not put a flight controller on the cyclist with GPS and telemetry? Then you can just use Follow mode, probably with the current stable release. Object recognition and tracking is very tricky, even if you put loads of QR codes on the subject or, as with the balloon example, have a very distinctive shape and colour to look for.
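If the built-in Follow mode (the FOLL_* parameters on recent Copter firmware) doesn't give you the 90-degree side profiles you want, a small DroneKit script can do much the same thing from the cyclist's telemetry. Here is a rough sketch of the idea; the connection strings, offsets and update rate are assumptions, and it does not handle pointing the camera at the rider.

```python
# Rough sketch: GPS-based follow using DroneKit instead of visual tracking.
# Connection strings and the side offset are assumptions.
import math
import time
from dronekit import connect, LocationGlobalRelative, VehicleMode

chase = connect('udp:127.0.0.1:14550', wait_ready=True)   # the camera drone (assumed link)
rider = connect('udp:127.0.0.1:14560', wait_ready=True)   # FC + GPS strapped to the cyclist (assumed link)

SIDE_OFFSET_M = 8.0   # hold position 8 m off the rider's right side
ALT_M = 5.0           # altitude above home

def offset_location(loc, east_m, north_m):
    """Shift a lat/lon by metres using a flat-earth approximation."""
    dlat = north_m / 111111.0
    dlon = east_m / (111111.0 * math.cos(math.radians(loc.lat)))
    return LocationGlobalRelative(loc.lat + dlat, loc.lon + dlon, ALT_M)

chase.mode = VehicleMode('GUIDED')
while True:
    rider_pos = rider.location.global_relative_frame
    heading = math.radians(rider.heading or 0)            # rider's heading, degrees from north
    # 90 degrees to the rider's right: rotate the offset with the heading
    east = SIDE_OFFSET_M * math.cos(heading)
    north = -SIDE_OFFSET_M * math.sin(heading)
    chase.simple_goto(offset_location(rider_pos, east, north))
    time.sleep(1)
```

The second drone would simply flip the sign of SIDE_OFFSET_M, and yawing the airframe or gimbal toward the rider would still need to be added on top of this, which is something Follow mode's yaw-behaviour settings already handle for you.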