GSoC24: Visual Follow-me using AI

I think SIYI QGC has this function, but I’m NOT sure if the code is on GitHub.

Please evaluate those Pi boards if you have the time. I hope a Pi 3B can do the job.

Yes, please give an update.

And thanks for your quick response. Hope to see the Visual Follow-me!


I have been using an AMB82-mini for object tracking on a rover; it has a built-in NPU, so it doesn’t need a companion computer.


Where can I find this?

Hi @geofrancis,

Sorry for the slow reply, I didn’t see your question.

We’ve got the calculation in two places


I have installed QGC 4.4.0, and I didn’t see any icons related to object selection. Any idea?

I think AI tracking is kind of different from FollowMe mode. It might be something like the sequence below (I’m NOT an expert on this, I’m just learning; if anything is wrong, please let me know):

  1. GCS selects the object detection area // MAV_CMD_CAMERA_TRACK_POINT / MAV_CMD_CAMERA_TRACK_RECTANGLE
  2. GCS sends the selected area through MAVLink to the FC
  3. FC forwards the packet to the AI module
  4. AI module detects the object and gives feedback
    4.1 AI module reports the result to the FC
    4.2 FC forwards the packet to the GCS
  5. AI module adds an OSD box indicating the object is detected, which is shown on the GCS
  6. AI module runs its algorithm to pitch/roll/yaw, tracking the moving object
  7. If there is an abnormal situation, such as the object going missing, stop tracking // MAV_CMD_CAMERA_STOP_TRACKING
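Steps 1–2 above can be sketched in code. The helper below is my own illustration (not from any existing GCS codebase): it converts a pixel-space selection box into the normalized 0..1 corner coordinates that MAV_CMD_CAMERA_TRACK_RECTANGLE (command 2005 in the common MAVLink set) takes as its first four parameters. Please verify the parameter order against the MAVLink spec.

```python
# Illustrative sketch (my assumption, not code from the thread):
# turn a GCS selection box in pixels into the normalized corner
# coordinates expected by MAV_CMD_CAMERA_TRACK_RECTANGLE.

MAV_CMD_CAMERA_TRACK_RECTANGLE = 2005  # common MAVLink command id

def rect_to_track_params(x, y, w, h, img_w, img_h):
    """Return (top_left_x, top_left_y, bottom_right_x, bottom_right_y),
    each normalized into 0..1 relative to the image size."""
    return (x / img_w, y / img_h, (x + w) / img_w, (y + h) / img_h)

# Example: a 200x100 px box at pixel (640, 360) in a 1280x720 frame.
params = rect_to_track_params(640, 360, 200, 100, 1280, 720)
print(params)  # corners normalized into 0..1

# With pymavlink, the GCS side would then send something like
# (hypothetical call, target ids depend on your setup):
# master.mav.command_long_send(target_sys, target_comp,
#                              MAV_CMD_CAMERA_TRACK_RECTANGLE, 0,
#                              *params, 0, 0, 0)
```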

OK, the Orin has 40 TOPS, which is better than the SIYI AI module’s 10 TOPS.

Any updates? I hope to set up an environment as well, but … … If you have this setup, please let me know. Thanks.

Hello! Great article!
I would like to contact you; we have a project for autonomous fire extinguishing from drones.
You could help the project; the problems you describe in your article are exactly the ones I want to solve.
You can write to me by email: kavaneural@gmail.com

This is a public forum; if you want help, then ask publicly. No one is going to work just for you for free.


Of course not for free; what are we talking about? I didn’t even think about that.
The task is the same as described by the topic starter,
but you need to automate the drone when extinguishing a fire. For example, it may lose contact when heated and must climb automatically or move away.

@Kot_Lessay I think the “autonomous fire extinguishing” project you are doing now is more complex than the GSoC24 Visual Follow-me using AI. Is it possible to create a new topic for further discussion if you need those experts to help?

As there are quite a lot of scenarios that need to be considered:

  1. What type of fire extinguishing equipment should be carried?
  2. In what scenarios (oil, gas, etc.) is it suitable for fire extinguishing?
  3. How efficient is the fire extinguishing?
  4. What is the heat resistance capacity of the entire equipment?
  5. During flight, when there is smoke, what is the effectiveness of visual cameras, infrared cameras, and radar/lidar?
  6. How long can the equipment operate while carrying it?
  7. What about fire rescue in environments like narrow spaces inside buildings, and the risk of dynamic collisions with escaping personnel?
  8. Automatic safe flight route planning?

These are just very preliminary considerations. It is necessary to systematically evaluate and analyze all possible scenarios and parameters before making decisions on product specifications. Extensive evaluation experiments may also be needed (for example, the accuracy of visual, infrared, and radar non-GPS positioning in fire/smoke, and dynamic obstacle avoidance assessment).

@rmackay9 @khanasif786

I have seen the GSoC Visual Follow-me funding request.

Does this mean it has to be a smart controller (Herelink, SIYI MK15, etc.)? And that QGC from GitHub - mavlink/qgroundcontrol: Cross-platform ground control station for drones (Android, iOS, Mac OS, Linux, Windows) doesn’t support object selection?

Yes, you are right; my approach is quite similar to yours.

Yeah, it will be a smart controller, but eventually any QGC patch should work; I have to make some additions on the QGC side. For testing, the Gazebo part is ready; I will post the version and how to set it up here.

Right now, I don’t have a smart controller (Herelink, SIYI MK15, etc.). I just installed mavlink/qgc on a laptop or an Android device. I didn’t find the menu to send those commands.

mavlink/qgroundcontrol should support this; the code (MAV_CMD_CAMERA_TRACK_POINT, MAV_CMD_CAMERA_TRACK_RECTANGLE) is there. I’m NOT sure what I have got wrong; hope to see your version of QGC and setup:
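One possible reason the tracking icons don’t appear (this is an assumption on my part, worth verifying against the MAVLink spec and QGC source): QGC typically only enables its tracking UI when the camera’s CAMERA_INFORMATION message advertises the tracking capability flags from the CAMERA_CAP_FLAGS enum. A minimal sketch of that capability check:

```python
# Sketch of a CAMERA_CAP_FLAGS capability check (my illustration).
# Flag values are from the common MAVLink CAMERA_CAP_FLAGS enum
# (512 and 1024; please double-check against the current spec).

CAMERA_CAP_FLAGS_HAS_TRACKING_POINT = 512
CAMERA_CAP_FLAGS_HAS_TRACKING_RECTANGLE = 1024

def supports_tracking(cap_flags):
    """True if the camera advertises point or rectangle tracking
    in its CAMERA_INFORMATION capability bitmask."""
    return bool(cap_flags & (CAMERA_CAP_FLAGS_HAS_TRACKING_POINT |
                             CAMERA_CAP_FLAGS_HAS_TRACKING_RECTANGLE))

print(supports_tracking(512 | 256))  # True: point tracking advertised
print(supports_tracking(256))        # False: only video stream flag set
```

If the camera (or the AI module emulating a MAVLink camera) never sends these flags, the GCS has no reason to show selection icons, regardless of the QGC version.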

Yeah, the thing is, some specific PX4 messages enable that, if I’m not wrong. We need the same functionality running for ArduPilot.

I might be new to this, but has there been any consideration of using YOLO? This does not mean I do not trust ArduPilot’s built-in ability to calculate latitudes and determine the geolocation, though.

YOLOv8 is mentioned in the OP.
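To connect the detector to the tracking commands discussed earlier: a YOLO-style detector (e.g. YOLOv8) returns pixel boxes as (x1, y1, x2, y2), while MAV_CMD_CAMERA_TRACK_POINT wants a normalized centre point plus a radius. The bridging helper below is my own illustration, not code from the project; the radius convention in particular is an assumption to verify.

```python
# Illustrative bridge (my assumption) from a YOLO-style pixel box to the
# normalized (x, y, radius) parameters of MAV_CMD_CAMERA_TRACK_POINT.

def box_to_track_point(x1, y1, x2, y2, img_w, img_h):
    """Return (cx, cy, radius), each normalized into 0..1."""
    cx = (x1 + x2) / 2 / img_w
    cy = (y1 + y2) / 2 / img_h
    # Half the larger box side, normalized by the larger image dimension
    # (one plausible convention; the spec leaves radius loosely defined).
    radius = max(x2 - x1, y2 - y1) / 2 / max(img_w, img_h)
    return cx, cy, radius

# Example: a detection box (600, 300, 680, 420) in a 1280x720 frame.
print(box_to_track_point(600, 300, 680, 420, 1280, 720))
```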


Maybe. I’m NOT sure if the current QGC version supports this AI target tracking command.

But it seems it’s supported by SIYI QGC, which has been confirmed by SIYI support(SIYI MK32 Enterprise Ground Station - 7" Display, 15KM Range, 4G+64G Android, Dual Operator, Abundant Interface - #177 by SIYI).

@khanasif786 are you going to use the SIYI MK15 as the GCS?

@lida2003 I will be using Herelink. However, we must not be limited to SIYI’s custom version.

That’s great!

Right now, I don’t have either of the smart controllers (Herelink, SIYI MK15). Hope to see this patch for QGC soon.

PS: Herelink does NOT support the AI object tracking commands. I didn’t find this AI object tracking feature in Mission Planner either.

Let me take a deeper look at the documentation and hopefully pen my contributions to the project.