Video Games In Real Life: Duck Hunt + Drones

I wanted to share this with the ArduPilot community. It’s been a fun project to work on!

We used a 3D-printed drone with a Pixhawk running a modified version of ArduPilot, combined with ROS (Robot Operating System) and MAVROS running in Docker containers.

For the gun, we modified a Big Buck Hunter gun, hooked it up to a Raspberry Pi, and placed IR LEDs in it. We then focused the LEDs through a lens and transmit an IR pulse (similar to how your TV remote works) to determine when a drone gets hit! If the drone gets hit, it does a flip!
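Since the gun works like a TV remote, here’s a minimal sketch of how such a code could be encoded as IR mark/space timings. This assumes an NEC-style scheme; the timings, bit count, and function names are illustrative assumptions, not the actual gun firmware.

```python
# Sketch: NEC-style IR encoding. Timings (microseconds) are illustrative
# assumptions, not the actual gun firmware values.

HEADER = [(9000, 4500)]   # long header mark/space
BIT0 = (560, 560)         # short mark, short space -> 0
BIT1 = (560, 1690)        # short mark, long space  -> 1
STOP = [(560, 0)]         # trailing mark

def encode(code: int, bits: int = 8) -> list:
    """Return a list of (mark_us, space_us) pairs for one IR burst."""
    pulses = list(HEADER)
    for i in range(bits - 1, -1, -1):          # MSB first
        pulses.append(BIT1 if (code >> i) & 1 else BIT0)
    pulses += STOP
    return pulses

def decode(pulses: list, bits: int = 8) -> int:
    """Invert encode(): read the bit pulses between header and stop."""
    code = 0
    for mark, space in pulses[1:1 + bits]:
        code = (code << 1) | (1 if space > 1000 else 0)
    return code
```

An 8-bit code leaves plenty of room for distinct gun IDs later, and the receiver can reject anything that doesn’t decode to a known code.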

Next up: multiple drones + swarm mode?


I doubt that I’m alone in saying: MORE!

EDIT: And as I think about it a little, I can envision a means by which you could accomplish the IR detection and flip solely within ArduPilot via Lua scripting. But swarming still requires a little more fancy footwork on the GCS side.

I could definitely use some guidance on the best way to do the IR/flip with the least amount of latency.

It would be cool if I could figure out a way to have two guns with different IR codes. There are speakers, so it could have an audio (voice-based) scoreboard!

We can help with that for sure. I’m confident neither of those problems are particularly hard to solve.

How are you presently detecting a “hit?”

I can give a deeper breakdown tomorrow if it’s helpful.

And I know there’s a better way of doing this (please help!).


  • Drone is in GUIDED mode
  • On trigger (if loaded), the gun shoots an IR code (similar to a remote control code)
  • There’s an IR receiver on the drone, which I have wired into the ADC 3.3V port on a Pixhawk 2.4.6
  • I then hijack the Battery2 message, populate it with the IR receiver reading, and publish it back to ROS (Robot Operating System)
  • ROS then evaluates if the IR sensor was hit by IR light (just a simple voltage threshold currently)
  • If hit - the drone enters FLIP mode followed by GUIDED mode once the flip is complete
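
The threshold step above can be sketched as a small detector with a lockout so the flip isn’t re-triggered mid-maneuver. The threshold voltage, lockout time, and class name here are assumptions, not the actual ROS node:

```python
# Sketch of the hit-evaluation step described above: the IR receiver is read
# through the Pixhawk ADC and surfaced as a voltage; a simple threshold plus
# a short lockout decides "hit". All names and constants are assumptions.

HIT_THRESHOLD_V = 1.5   # assumed ADC voltage indicating IR illumination
LOCKOUT_S = 3.0         # ignore re-hits while the flip is in progress

class HitDetector:
    def __init__(self, threshold=HIT_THRESHOLD_V, lockout=LOCKOUT_S):
        self.threshold = threshold
        self.lockout = lockout
        self.last_hit_t = float("-inf")

    def update(self, voltage: float, t: float) -> bool:
        """Return True exactly once per hit; suppress repeats during lockout."""
        if voltage >= self.threshold and (t - self.last_hit_t) >= self.lockout:
            self.last_hit_t = t
            return True
        return False
```

On a True result, the controller would command FLIP mode and return to GUIDED once the flip completes.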

Thanks for any help/guidance you are able to provide!!

Guided mode is likely the right choice for the entire flight regime.

We can hijack the same battery message using ArduPilot Lua and provide guided mode aerobatic commands onboard.

Even better, it’d be good to craft a simple microcontroller-based IR sensing device that would connect to the autopilot’s I2C, serial, or CAN bus. A Seeeduino XIAO + some simple IR detection circuitry and code would be an excellent, lightweight, fully featured candidate.

With a microcontroller onboard, we can implement IR detection algorithms in Lua (or on the microcontroller itself) to differentiate players (not unlike television remotes). The onboard script could keep the drone’s individual scores from each player and downlink them as gcs:send_named_float() values. The GCS would then total the scores from each drone to determine a win condition, and the GCS could send a game reset command to wipe the scores and start again.
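The GCS-side bookkeeping described above could look something like this sketch. The message plumbing (named floats in, reset command out) is omitted; the class and method names are assumptions:

```python
# Sketch of GCS-side scoring: each drone downlinks a per-player score,
# the GCS totals them across drones and checks a win condition.
# All names and the win threshold are assumptions.

from collections import defaultdict

class Scoreboard:
    def __init__(self, win_score: int = 10):
        self.win_score = win_score
        # scores[player][drone] = hits that player landed on that drone
        self.scores = defaultdict(lambda: defaultdict(int))

    def record_hit(self, player: str, drone: str) -> None:
        self.scores[player][drone] += 1

    def total(self, player: str) -> int:
        return sum(self.scores[player].values())

    def winner(self):
        """Return the first player at or past the win score, else None."""
        for player in self.scores:
            if self.total(player) >= self.win_score:
                return player
        return None

    def reset(self) -> None:
        """Game-reset command: wipe all scores and start again."""
        self.scores.clear()
```

A text-to-speech announcer could simply watch `winner()` and `total()` to call out the score.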

Using ROS provides a powerful way to craft trajectories that would challenge the players while ensuring mid-air avoidance. I have less experience with ROS than with the onboard Lua side of things, but I know its potential for sure.

In fact, it would be interesting to have a near continuous IR code sent from each gun such that a drone would “know” that it’s being targeted and perhaps initiate some sort of avoidance maneuver that isn’t impossible to defeat but would provide further challenge…
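One hedged sketch of that “being targeted” idea: if the gun streams its code continuously, the drone can consider itself targeted whenever a valid code was seen within a short window, and kick off an avoidance maneuver. The window length and names are assumptions:

```python
# Sketch of the continuous-IR targeting idea: a player is "targeting" the
# drone if their code arrived within the last TARGET_WINDOW_S seconds.
# The window value and class/method names are assumptions.

TARGET_WINDOW_S = 0.5  # assumed: codes stream faster than this while aimed

class TargetMonitor:
    def __init__(self, window=TARGET_WINDOW_S):
        self.window = window
        self.last_seen = {}  # player code -> time a valid code last arrived

    def code_received(self, player: str, t: float) -> None:
        self.last_seen[player] = t

    def targeted_by(self, t: float):
        """Players currently holding the drone in their sights."""
        return [p for p, ts in self.last_seen.items()
                if t - ts <= self.window]
```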


Continuous IR code is a great idea!

I wonder if we could use something like this for the IR sensors. I only have one right now, so the drone has to be at the right heading for a hit to register.

A microcontroller makes it easy to solve the heading problem. Evaluate an array of IR detectors for a valid code. Emit a player code to the autopilot over the bus of choice when detected.

Said sensor array can be evaluated as the sum of its parts, if that helps to disambiguate.
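One way to sketch “the sum of its parts”: several detectors face different directions, any one of them seeing a valid code counts as a hit, and an intensity-weighted circular mean gives a rough bearing to the shooter. The detector layout and threshold here are assumptions:

```python
# Sketch of evaluating an IR detector array as the sum of its parts.
# Assumed layout: 4 detectors at 90-degree spacing around the airframe.

import math

DETECTOR_BEARINGS_DEG = [0, 90, 180, 270]

def evaluate_array(intensities, threshold=0.5):
    """Return (hit, bearing_deg or None) from per-detector IR intensities."""
    active = [(b, i) for b, i in zip(DETECTOR_BEARINGS_DEG, intensities)
              if i >= threshold]
    if not active:
        return False, None
    # circular mean weighted by intensity, so adjacent detectors blend
    x = sum(i * math.cos(math.radians(b)) for b, i in active)
    y = sum(i * math.sin(math.radians(b)) for b, i in active)
    return True, math.degrees(math.atan2(y, x)) % 360
```

With a bearing estimate, the heading problem goes away entirely, and the avoidance maneuver could even dodge away from the shooter.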

The XIAO board that I linked is pretty powerful and has plenty of GPIO pins for such a task. I just weighed one on a calibrated scale, and it clocks in at just over 2 grams.

Just ordered some!

I need to dig deeper into which type of sensor array to use.

I’ve been extremely impressed with the XIAO and have gravitated toward it for any non-networked microcontroller project of late. It’s easy to integrate into the Arduino IDE (ptooey!) or PlatformIO (much better!).

You should probably include LIDAR altitude sensing in your drones if you haven’t already. An absolute, accurate altitude reference will likely prove invaluable in swarm operations. You could determine threshold altitudes for flip/avoidance maneuvers for self-preservation.
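The altitude-gating idea above could be sketched like this; the minimum altitudes are placeholder values, not tested numbers:

```python
# Sketch of altitude-gated maneuvers: only allow a flip (or evasive move)
# when the rangefinder reports enough height to recover. The minimum
# altitudes are placeholder assumptions.

MIN_FLIP_ALT_M = 10.0   # assumed height needed to complete a flip safely
MIN_EVADE_ALT_M = 3.0   # assumed height needed for a gentler dodge

def allowed_maneuver(lidar_alt_m: float) -> str:
    """Return the most aggressive maneuver safe at this altitude."""
    if lidar_alt_m >= MIN_FLIP_ALT_M:
        return "flip"
    if lidar_alt_m >= MIN_EVADE_ALT_M:
        return "evade"
    return "none"
```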

Which bus would be easiest/best to integrate into?

Serial is probably the easiest, but I2C isn’t far behind. CAN is likely most robust but requires more hardware for microcontroller integration. Pick one and run with it.

This recent addition makes I2C very attractive:
AP_Scripting: i2c multi byte read by IamPete1 · Pull Request #21686 · ArduPilot/ardupilot

However, if you have a serial port to spare (after LIDAR integration), it’s still the easiest one to code.
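For the serial route, a minimal frame from the XIAO to the autopilot might carry a start byte, player ID, hit count, and a checksum so corrupted bytes are dropped. The frame layout below is an assumption for illustration:

```python
# Sketch of a minimal serial frame the microcontroller could emit when a
# valid player code is detected. The layout (start byte, 2-byte payload,
# 1-byte additive checksum) is an assumption, not an existing protocol.

START = 0xFE

def pack_hit(player_id: int, count: int) -> bytes:
    payload = bytes([player_id & 0xFF, count & 0xFF])
    checksum = sum(payload) & 0xFF
    return bytes([START]) + payload + bytes([checksum])

def unpack_hit(frame: bytes):
    """Return (player_id, count) or None if the frame fails validation."""
    if len(frame) != 4 or frame[0] != START:
        return None
    if (sum(frame[1:3]) & 0xFF) != frame[3]:
        return None
    return frame[1], frame[2]
```

On the autopilot side, a Lua script reading the serial port would apply the same validation before crediting a hit.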

As another thought - the guns could have GPS tracking as well, and ROS could ensure that the drones always “face” the average of the headings toward each player, such that the sensor arrays are always oriented correctly (unless an avoidance maneuver is initiated). Alternatively, the GCS could emit its location, and the players would stand near it, to the same effect.

ROS seems like overkill for now; a simple pymavlink watch for the BATTERY2 message would be enough (unless you have something else running)!

You could put a fence on the drone, read it from your commander script, and then use GUIDED to move the drone randomly within the fence area. That is pretty easy to do.
I don’t know if it is enabled, but I think we have a DO_FLIP message, which could be simpler than doing a mode switch.
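The random-movement part could be as simple as sampling GUIDED targets inside the fence rectangle with a safety margin. This is a sketch with placeholder fence values, not the commander script itself:

```python
# Sketch: pick random GUIDED targets inside a rectangular fence, shrunk by a
# safety margin so the drone never aims for the boundary itself.
# Fence corners and margin are placeholder assumptions.

import random

def random_target(lat_min, lat_max, lon_min, lon_max, margin=0.1):
    """Uniform random point inside the fence, inset by `margin` (0..0.5)."""
    dlat = (lat_max - lat_min) * margin
    dlon = (lon_max - lon_min) * margin
    return (random.uniform(lat_min + dlat, lat_max - dlat),
            random.uniform(lon_min + dlon, lon_max - dlon))
```

The commander script would send each sampled point as a GUIDED setpoint, picking a new one on arrival or on a timer.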

I appreciate all your help!! I ordered some stuff to play around with. Will report back!

A few other thoughts:

Right now the GCS is the Raspberry Pi, which is mounted to the gun, so I definitely might need to rethink that if using multiple guns.

I have a lot of experience with ROS and microservices I have built with it, which is why I chose it. It gave me an easy way to put text-to-speech (voice) into the game. I figured it would help with multiple-drone control as well.

I’ll have to do a more technical walkthrough video in the future. Again, I really appreciate both of your suggestions on how we can take this to the next level! Keep them coming!

Do you have any experience/thoughts on what kind of LIDAR to include?

Anything on this page will work:
Rangefinders (landing page) — Copter documentation

I think the TeraRanger series represents good value for money. Avoid the extremely cheap TFMini offerings from Benewake.

Thanks again for all your help!

So in the current configuration, we run the gun, with the Raspberry Pi mounted to it, off of a drone LiPo.

One current issue I would like to address: if the battery for the Raspberry Pi dies, the telemetry connection is broken.

If the Raspberry Pi battery dies, I would like the drone to land, so I figured I might be able to use the GCS failsafe to do this, but I haven’t had much luck.

I was trying to get time_sync working right (which I think is required to use this) but haven’t had much luck.

Getting errors around the “TM : RTT too high for timesync: 430.15 ms.”

I have played with the BRD_RTC_TYPES setting and SCHED_LOOP_RATE.

Any idea if I am going the right direction here?

I think you’re simultaneously going in right and wrong directions.

For your present hardware config, it seems you’re on the right track for time syncing. If you’re messing with SCHED_LOOP_RATE, be sure to reboot between value changes. Seems a bit odd that the default loop rate isn’t fast enough on Copter firmware, and an RTT of nearly 1/2 a second may be indicative of another problem.
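For reference on where that 430.15 ms figure comes from: MAVLink’s TIMESYNC message carries two nanosecond timestamps (tc1, ts1); the GCS sends its own time in ts1, the autopilot echoes it back with its time in tc1, and round-trip time is just “now minus the echoed timestamp.” A minimal sketch of the arithmetic:

```python
# Sketch of TIMESYNC round-trip and offset arithmetic. MAVLink TIMESYNC
# carries tc1 and ts1 as int64 nanoseconds; function names are assumptions.

def rtt_ms(ts1_ns: int, now_ns: int) -> float:
    """Round-trip time in milliseconds from an echoed TIMESYNC timestamp."""
    return (now_ns - ts1_ns) / 1e6

def clock_offset_ns(tc1_ns: int, ts1_ns: int, now_ns: int) -> float:
    """Estimated remote-minus-local clock offset, assuming symmetric delay."""
    return tc1_ns - (ts1_ns + now_ns) / 2
```

An RTT that large points at buffering or scheduling delay somewhere in the link or on the Pi, not at the autopilot’s loop rate.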

However, the GCS should likely be independent from the gun. I think the gun should just be a simple IR emitter, and the GCS should be contained on a dedicated computer on a reliable power supply.

Something feels “off” to me as well. Do you have any hunches as to why the loop would be so slow? I am pulling all the telemetry data over a 57600 baud link, so maybe that’s an issue?

The baud rate wouldn’t account for that amount of time lost.

I wonder if it’s something on the Pi side of things that’s slowing things down. To be perfectly honest, I’ve only barely scratched the surface of ROS, and I’ve never actually used the time sync feature.