Running a drone light show with ArduCopter and Skybrush

This writeup demonstrates that it is now possible to run a drone light show using open-source components only: a modified version of ArduCopter as the flight controller and Skybrush as the ground control station. In the course of this blog post, we will compile ArduCopter’s SITL simulator, place 20 simulated quadcopters on the CMAC airfield, upload a drone light show (trajectories and light program) to them and launch a simulation. Of course, if you have 20 or more real quadcopters with the required communication infrastructure, you can try the same thing in reality using the same set of software tools.

Preliminaries

The instructions below were tested on macOS Monterey, but they most likely work the same way on Linux as well. We assume that all the necessary developer tools (compiler toolchain, Python, waf and so on) are installed on your machine; if you can compile the stock ArduPilot firmware, chances are that you already have everything you will need.

First things first: install Poetry

We are going to work with multiple Python projects: the build-time dependencies of ArduCopter itself, a Python-based launcher for SITL swarms, and the server component of Skybrush that manages communication with the drones. To keep things nice and tidy, we will use a separate Python virtualenv for each project instead of messing up our system Python with their dependencies. Internally, Skybrush components use Poetry for managing their dependencies, so follow the installation instructions of Poetry before proceeding.

Compiling the ArduCopter SITL

Next, we need to compile the software-in-the-loop (SITL) simulator version of ArduCopter, using the Skybrush fork that adds support for drone light shows. Check out the source code from GitHub first:

$ git clone https://github.com/skybrush-io/ardupilot
$ cd ardupilot

We need to switch to the CMCopter-4.2 branch as the master branch simply tracks upstream:

$ git checkout CMCopter-4.2
$ git submodule update --init --recursive

We will create a Python virtualenv for installing empy, pexpect and the other build-time Python dependencies to keep the system Python nice and tidy. ArduPilot does not contain a pyproject.toml file yet to specify the build-time dependencies, so we just install them manually:

$ python3 -m venv .venv
$ source .venv/bin/activate
$ pip install -U pip wheel
$ pip install future empy intelhex pexpect

Now it’s time to compile the SITL:

$ ./waf configure --debug --board sitl
$ ./waf copter

If everything went well, you should now have a working SITL executable in build/sitl/bin/arducopter:

$ build/sitl/bin/arducopter
You must specify a vehicle model.  Options are:
  quadplane
  xplane
  [...]

You can now deactivate the virtualenv and step out of the ardupilot folder:

$ deactivate
$ cd ..

Installing an ArduCopter SITL swarm launcher tool

We will need to launch multiple simulated drones on the same machine, configured so that the drones are laid out in a grid, just like they would be placed on the ground before the takeoff sequence of a drone show. The Skybrush repositories contain a helper tool written in Python, so let’s check it out. Since we are still being nice and tidy, we will install it in another Python virtualenv. Luckily, the ap-swarm-launcher repository provides a pyproject.toml file that lists all the dependencies, so you can just run poetry install to install them, along with the launcher itself, into a separate virtualenv:

$ git clone https://github.com/skybrush-io/ap-swarm-launcher
$ cd ap-swarm-launcher
$ poetry install

Now you can launch a swarm from the virtualenv using Poetry; point it to the SITL executable we have just compiled to launch a small test swarm:

$ poetry run ap-sitl-swarm -n 9 ../ardupilot/build/sitl/bin/arducopter

If everything worked well, you should see 9 instances of the ArduCopter SITL starting up. These SITL instances are configured similarly to real show drones, broadcasting heartbeat packets into the void until a ground station connects to them (a quick way to verify this is sketched below). Press Ctrl-C to stop the simulated swarm, as we will now install Skybrush itself. Do not forget to step out of the ap-swarm-launcher folder:

$ cd ..
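
As promised, here is a quick way to convince yourself that the simulated drones are really transmitting while the swarm is running: listen for their heartbeats with pymavlink (installable with pip) from another terminal. This is only an illustrative sketch, and it assumes that the MAVLink traffic of the swarm is reachable on the local UDP port 14550, which may not match your setup:

from pymavlink import mavutil

# Listen for MAVLink traffic on a local UDP port (14550 assumed here)
conn = mavutil.mavlink_connection("udpin:0.0.0.0:14550")

seen = set()
while len(seen) < 9:
    msg = conn.recv_match(type="HEARTBEAT", blocking=True)
    sysid = msg.get_srcSystem()
    if sysid not in seen:
        seen.add(sysid)
        print(f"Heartbeat from MAVLink system ID {sysid}")
print("All 9 simulated drones are alive")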

Installing Skybrush Server

Skybrush is based on a client-server architecture: the server runs in the background, managing the communication channels to drones, RTK base stations, weather providers and so on, while a frontend application called Skybrush Live provides a nice graphical user interface that you can interact with. Skybrush Live itself does not need to know what sort of drones it is communicating with – the server provides a module for handling MAVLink-based outdoor drones and takes care of translating MAVLink-specific commands to a set of messages that Skybrush Live understands. The advantage is that if you are working with other types of drones that are not based on MAVLink messages, you can still keep on using the same GUI frontend.

The Skybrush homepage provides pre-compiled executables for the server that you can just download, install and run, but these executables are currently limited to at most 10 drones, and we will need more. However, since Skybrush Server is entirely open-source, you can check out the source code from GitHub and assemble it yourself, and this version is not limited in any way. So, let’s get started – yes, you guessed it, with yet another Python virtualenv:

$ git clone https://github.com/skybrush-io/skybrush-server
$ cd skybrush-server
$ poetry install

Once all the dependencies are installed, we can launch the server with the default configuration file for outdoor MAVLink-based drones:

$ poetry run skybrushd -c etc/conf/skybrush-outdoor.jsonc

You should now see the startup log of the server in the terminal.

Keep the server running in that terminal, as now it’s time to install Skybrush Live, which needs a server to connect to.

Installing Skybrush Live

Unlike the Python projects so far, Skybrush Live is written in JavaScript (even though it is a desktop application). In theory, you could check out the source code and build Skybrush Live yourself (as I said, all components are open-source), but the pre-compiled executables on the Skybrush homepage have no limitations at all, so it is probably easier to grab the installer for your platform and run it. Here is the direct link to the Skybrush Live downloads. Run the installer and start the application; it should find the server running in the background automatically and connect to it. You will know that the connection is successful if you see a small green dot in the upper right corner of the “plug” icon in the header. If the connection is unsuccessful, you can click on the icon and connect to localhost, port 5000 manually.

Now we have all the components – time to run the show itself!

Running a drone light show

By default, the largest area of the Skybrush Live window is occupied by a map view, but the entire workspace is tabbed and can be rearranged. Right above the map there are multiple tabs that let you switch to a list / grid view of the drones connected to the system, or to a 3D view. Drag the tab labeled “UAVs” (this is the list / grid view) and attach it to the top of the map view so the map view gets split in half; the upper half of the window will then be occupied by the list / grid view and the lower half by the map itself. (If you prefer any other layout, that’s fine, too; I just found it easiest for this demo to have the map and the UAVs list visible at the same time.)

The right hand side of the window contains a panel labeled Show control. This is the panel where we will spend most of our time. It is organized as a “check list” that you should go through in order to start a drone light show. Let’s start by clicking on the button that is currently labeled “No show loaded; select or drop a show file here”, and opening the show file named demo-show-20.skyc that is attached to the bottom of this blog post. (The attachment is zipped because Discourse does not allow arbitrary file uploads; extract the ZIP first to get the .skyc file.) The show file contains the trajectories and light programs of all the drones in the show in a compressed format. Opening the same show file in Skybrush Viewer yields a real-time 3D rendering of the show; feel free to play it to get an idea of what you should see in the sky when running the show with real drones. The visualization also reveals that the drones will start from a 4x5 grid with an initial spacing of 5 meters between drones.
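
If you are curious about the internals of the show file: to the best of my knowledge, a .skyc file is itself a ZIP archive containing the show data. Assuming that holds, you can peek inside it with Python’s standard zipfile module (the file name below refers to the attachment of this post):

import zipfile

# A .skyc show file is (to the best of my knowledge) a ZIP archive;
# list its contents to see how the show data is laid out inside
with zipfile.ZipFile("demo-show-20.skyc") as show:
    for name in show.namelist():
        print(name)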

Now that the show file is loaded, let us start a virtual drone swarm using ap-swarm-launcher from another terminal. We will configure the launcher to use 20 drones, arranged in a 4x5 grid with 5m spacing, and we also add a bit of noise to the positions and headings of the drones to simulate how they are likely to be arranged in reality:

$ cd ap-swarm-launcher
$ poetry run ap-sitl-swarm -n 20 --num-drones-per-row 5 --spacing 5 \
      --pos-noise 0.5 --yaw-noise 10 ../ardupilot/build/sitl/bin/arducopter

Switch back to Skybrush Live; if you did everything correctly, the server should already have established a connection with the 20 drones and forwarded their telemetry data to Live. The drones should be visible in the UAVs view, and the drone tally widget in the header of the window should show 20 drones.

At this point you can switch the UAVs panel to a detailed list view with the button at the far right end of its toolbar. If you click on the “Fit all drones” button in the horizontal toolbar overlay of the map view, the map should scroll to the Canberra Model Aircraft Club, which is used as the default location in ap-sitl-swarm, and you should see the 20 drones arranged on the field. Zoom out a bit if you prefer. If you don’t see the drones for any reason, make sure that the server and the SITL simulators are both running in the background, in separate terminals.

The next step in the Show control panel is the Setup environment button, which is used to set the origin and orientation of the show coordinate system that maps the local XYZ coordinates of the show file to GPS coordinates (a sketch of this mapping follows the list below). Click on the “magic wand” icon next to the text fields where the coordinate system is shown, and Live will automatically match the current coordinates of the drones with the takeoff layout of the show and calculate where you should put the origin and what orientation you should set. It also matches each drone to a takeoff position. Before closing the dialog, we need to do two extra things:

  • select “Altitude above mean sea level (AMSL)” to ensure that the drones are controlled based on AMSL. The AMSL reference was already set earlier when we fitted the show coordinate system to the positions of the drones.
  • click on “Copy show origin to map origin” to ensure that the 3D view uses the center of the show as its own origin (otherwise you will not see anything in the 3D view when you switch to it)
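
The promised sketch of the coordinate mapping: conceptually, the local show coordinates are rotated into a north/east frame according to the orientation of the show coordinate system, then offset from the GPS coordinates of the origin. A minimal flat-Earth approximation in Python could look like this (this is only to illustrate the idea, not the actual implementation of Live, and the sign convention of the rotation is my assumption):

import math

EARTH_RADIUS = 6378137.0  # WGS84 equatorial radius, in meters

def show_to_gps(x, y, origin_lat, origin_lon, orientation_deg):
    # Rotate the show frame into a north/east frame; orientation_deg is
    # the compass heading of the X axis of the show coordinate system
    theta = math.radians(orientation_deg)
    north = x * math.cos(theta) - y * math.sin(theta)
    east = x * math.sin(theta) + y * math.cos(theta)
    # Flat-Earth approximation around the show origin
    lat = origin_lat + math.degrees(north / EARTH_RADIUS)
    lon = origin_lon + math.degrees(
        east / (EARTH_RADIUS * math.cos(math.radians(origin_lat)))
    )
    return lat, lon

# A drone 10 m along the X axis of a show whose origin is roughly at CMAC
print(show_to_gps(10, 0, -35.3632, 149.1652, 0))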

Once the show coordinate system is set, a yellow polygon appears on the map; this is the convex hull of the area in which the flight will take place. Within the polygon, “under” the drones, the map also shows small yellow triangles for the designated takeoff positions. (You can see these if you temporarily hide the UAVs layer in the Layers tab, in the upper right corner of the main window.) Next up is the Setup takeoff area step in the Show control panel, but we can skip it for now because the system has already matched the drones to their takeoff positions and we know that all of them are arranged facing the X axis of the show coordinate system. (In reality, you need to return to this step if you swap drones before the show starts due to sensor malfunctions or other issues.)

Now that the drones are arranged, the show coordinate system has been set up and each drone has been matched to its designated takeoff position, we need to set up a geofence around the show area to ensure the safety of the audience. Click on the Setup geofence button in the checklist of the Show control panel and simply press the Apply button at the bottom. Live will automatically expand the convex hull of the show by a few meters in each direction, simplify the new polygon so it has at most 10 points, and then draw it on the map with a red dashed outline. This will be the geofence of the show.
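
If you would like to see how such a geofence could be computed in code, here is a rough equivalent using the shapely library. This is not how Live does it internally, just an illustration of the idea; the sample coordinates, the buffer distance and the simplification tolerance are all made up:

from shapely.geometry import MultiPoint

# Takeoff positions of the drones in a local XY frame (meters); sample data
positions = [(0, 0), (5, 0), (10, 0), (15, 0), (0, 5), (5, 5), (10, 5), (15, 5)]

# Convex hull of the show area, expanded by a 10 m safety margin, then
# simplified so the resulting polygon does not have too many vertices
hull = MultiPoint(positions).convex_hull
fence = hull.buffer(10).simplify(1)
print(list(fence.exterior.coords))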

We can now upload the show trajectories and the geofence to the drones. This is done by clicking on the Upload show data button and then pressing the Start button in the upload dialog box.

Since we are working with simulated drones, the upload is going to be fast and should succeed for all the drones immediately, but in reality we sometimes need to retry the upload multiple times if some of the drones are far from the GCS. The “Retry failed uploads automatically” checkbox comes in handy in such cases, because Live will keep trying to upload the trajectory to the drones even if you close the dialog box and move on to other preparations.

There are two remaining formalities before starting the show: we need to sign off on the Onboard preflight checks and the Manual preflight checks in the appropriate steps of the Show control panel. The “Onboard preflight checks” dialog is a summary of the error codes that the drones are transmitting to us; this is the panel where you can see whether any of the drones are failing their pre-arm checks. The “Manual preflight checks” dialog is a list of items that cannot be automated and have to be checked by the operator before starting the show, like gauging wind speed and weather conditions, or checking whether the batteries of the RC transmitter (if any) are fully charged. You can tweak this list according to your own routines and the requirements of your local CAA in the Settings dialog of the app (or even turn it off completely).

Once the preflight checks have been completed, it is time to set the start time of the show and grant the final authorization for the show to start. Open the Set start time dialog, set Start signal to “Start show automatically”, press the button labeled +30s to set the start time to 30 seconds into the future, click on Set new start time, and then press the AUTHORIZE START OF SHOW button at the bottom of the Show control panel. This last step is very important, as it provides a single-click option for you to grant or revoke authorization at any time.

After authorization, the entire Show control panel disappears and gives way to a block of six large buttons for the most common operations that you might need during a show in case of an emergency – but normally, you can just sit back, relax and watch the show. The show uses a staged takeoff in which the drones take off in pairs, so you will have to wait a few seconds after the countdown hits zero before you start seeing some action in the UAVs list.

When the drones have landed at the end of the show, you can revoke authorization in the Show control panel, and then stop Skybrush Live, the server and the simulator instances.

Closing words

This demonstration used only 20 drones, but Skybrush Live and the server contain no built-in limits on the number of drones they can manage. On my machine (a MacBook Pro with an Apple M1 CPU), I can easily run 100 instances of the SITL simulator and test shows with 100 drones. The server also contains a “virtual drone” extension module that runs a simplified simulation of drones instead of relying on full SITL instances in the background; using this module, I can easily manage 300 drones or more with a single machine. We also routinely use Skybrush to run drone shows with our own fleet of 110 drones (see an example video here).

One potential limitation of the current architecture is that the MAVLink protocol uses only a single byte for the system ID of a drone. If you use MAVLink system IDs to identify drones, you are limited to about 250 drones in a single network (you need to reserve at least one ID for ground stations, and system ID 0 is reserved for broadcasts). An easy workaround is to run multiple independent MAVLink networks with multiple routers, each router managing 250 drones at the same time, or to distinguish the drones by their IP addresses in a larger IP network instead of their MAVLink IDs.
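
As a sketch of the latter idea: assuming the drones talk to the GCS over UDP, the GCS can disambiguate two drones that share a MAVLink system ID by the source address of their packets. With pymavlink, something along these lines could work (I believe the last_address attribute holds the sender of the most recent packet on UDP connections, but treat that as an assumption):

from pymavlink import mavutil

conn = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
drones = {}

while True:
    msg = conn.recv_match(type="HEARTBEAT", blocking=True)
    # Key drones by (source IP, MAVLink system ID) so that two drones with
    # the same system ID on different networks do not collide
    key = (conn.last_address[0], msg.get_srcSystem())
    if key not in drones:
        drones[key] = msg
        print(f"New drone: system ID {key[1]} at {key[0]}")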

If you are interested in learning more about the system, head over to the Skybrush homepage for more information, or join our Discord channel, where we hang out regularly to discuss development plans and drone hardware for light shows, and where we also answer generic support questions on a best-effort basis.

Also, do not hesitate to leave a comment below if the instructions above seem to be wrong, or if you managed to figure out how to do it on Windows.

Attachments

demo-show-20.skyc.zip (60.2 KB)

Awesome work! Would like to try this soon

Amazing! Thanks for sharing!

I think this would work well for static images; however, I’m not sure you could do dynamic shows if you are just setting waypoints for the drones, because each drone would need a different speed.

We are not simply setting waypoints – that’s why we have a patched firmware 🙂 The patched firmware adds a new “drone show mode” that feeds the position controller with position and velocity targets at a high rate. The underlying trajectory is defined as a series of linear segments and Bézier curves. We evaluate the curve and its derivative (for a velocity target) more than 10 times per second and update the targets in the position controller directly.
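
To illustrate, evaluating a cubic Bézier segment and its derivative is straightforward; here is a minimal sketch in Python (the actual firmware code differs, of course, but the math is the same):

def bezier(p0, p1, p2, p3, t):
    # Evaluate one axis of a cubic Bezier curve at parameter t in [0, 1]
    s = 1 - t
    return s**3 * p0 + 3 * s**2 * t * p1 + 3 * s * t**2 * p2 + t**3 * p3

def bezier_derivative(p0, p1, p2, p3, t):
    # Derivative with respect to t; dividing by the duration of the
    # segment yields a velocity target along this axis
    s = 1 - t
    return 3 * (s**2 * (p1 - p0) + 2 * s * t * (p2 - p1) + t**2 * (p3 - p2))

# Position and velocity target halfway through a 10-second segment
duration = 10.0
pos = bezier(0.0, 2.0, 8.0, 10.0, 0.5)
vel = bezier_derivative(0.0, 2.0, 8.0, 10.0, 0.5) / duration
print(pos, vel)  # 5.0 1.2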

I have been trying to control a single drone accurately but have had no luck so far. I suppose you have a very high WPNAV_JERK and WPNAV_ACCEL?
I have found that my control problems might be solved by also specifying acceleration when setting a velocity target.

Great work! When you perform actual flights, do you have communication between the vehicles? Do all vehicles communicate with the ground station? I’ve been looking at different radios and communication setups for swarm flights - do you have any thoughts on radios / communication hardware?

Indeed, they are set to relatively high values (see here), but I think they aren’t excessively high. In our experience, using position control only is not enough for accurate trajectory following, because in this case the controller actually uses zero velocity and acceleration as a setpoint, so it is always lagging behind the true position where we want the drone to be. Using position and velocity control at the same time improves trajectory tracking significantly. Adding acceleration targets did not yield significant improvements in simulation, so although we have the code for it, we don’t use it in practice. (This might change in the future, though.)
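
The effect of the velocity feedforward is easy to reproduce in a toy simulation. The sketch below (made-up gains, idealized 1-D kinematics) tracks a 1 m/s ramp with a pure P position controller and then with the same controller plus a velocity feedforward term; the pure P controller settles into a constant lag of v / kp, while the feedforward variant stays essentially on track:

def worst_lag(use_feedforward, kp=2.0, dt=0.05, duration=5.0):
    # Toy 1-D kinematic tracker following x_ref(t) = t (a 1 m/s ramp)
    x, lag, t = 0.0, 0.0, 0.0
    while t < duration:
        x_ref, v_ref = t, 1.0
        # Velocity command: P term on the position error, plus an
        # optional feedforward term equal to the reference velocity
        v_cmd = kp * (x_ref - x) + (v_ref if use_feedforward else 0.0)
        x += v_cmd * dt
        t += dt
        lag = max(lag, abs(t - x))
    return lag

print("worst lag without feedforward:", worst_lag(False))  # ~0.5 m (= v / kp)
print("worst lag with feedforward:   ", worst_lag(True))   # essentially zero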

When doing shows, the drones do not need to communicate with each other as long as there is no emergency in which you would suddenly need to bring all of them home. However, when we are wearing our research hat (see here), we do have a communication network between the drones, and they can adapt to each other to achieve collective flocking, collision avoidance or dense drone traffic in a shared airspace. For shows, you can usually do just fine without any inter-drone communication as long as everything goes according to plan.

As for the communication between the swarm and the GCS, we use standard 2.4 GHz wifi (basically UDP-to-serial converters attached to a telemetry port of the drones) and a secondary fallback channel built on SiK radios. You could probably substitute the wifi with XBee radios as well (or substitute the SiK radios with XBee, but then you would have two comm channels operating in the same frequency band, which is probably not as robust as using two separate frequency bands).

There’s also the option of using a companion computer with a 4G + VPN connection, but we haven’t tried that, as we’ve found that 4G networks don’t work that well at large public events where lots of people gather in the same area.

That’s interesting that you set both velocity and position; I always thought that setting just velocity would be enough for good control. In my case, I see that setting velocity and acceleration gives some encouraging results, but my testing on this matter is still ongoing.
Anyway, thanks very much for the reply, and I look forward to seeing a massive ArduCopter drone show in the near future! 😃

In theory that could work; we have a swarm used for research purposes, and the companion computer of that swarm sends velocity control commands only to the autopilot in guided mode. The key difference compared to the show mode is that in show mode, we simply send the desired position and the desired velocity and let the controller in ArduPilot work out how to adjust the velocity vector to take into account the difference between the desired and the current position if the drone is lagging behind. In our research swarm, the drones interact with each other, with their target positions and also with (virtual) obstacles, and we do all the summing of the velocity vector components arising from the various interactions in the companion computer before sending the desired velocity vector to ArduPilot.
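
For reference, sending such a velocity-only target to ArduPilot in guided mode looks roughly like this with pymavlink; the type mask tells the autopilot to ignore everything except the velocity fields (the connection string is again just an assumption):

from pymavlink import mavutil

master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
master.wait_heartbeat()

# Ignore the position, acceleration and yaw fields; use only the velocities
mask = (
    mavutil.mavlink.POSITION_TARGET_TYPEMASK_X_IGNORE
    | mavutil.mavlink.POSITION_TARGET_TYPEMASK_Y_IGNORE
    | mavutil.mavlink.POSITION_TARGET_TYPEMASK_Z_IGNORE
    | mavutil.mavlink.POSITION_TARGET_TYPEMASK_AX_IGNORE
    | mavutil.mavlink.POSITION_TARGET_TYPEMASK_AY_IGNORE
    | mavutil.mavlink.POSITION_TARGET_TYPEMASK_AZ_IGNORE
    | mavutil.mavlink.POSITION_TARGET_TYPEMASK_YAW_IGNORE
    | mavutil.mavlink.POSITION_TARGET_TYPEMASK_YAW_RATE_IGNORE
)

# Ask for 1 m/s north in the local NED frame; has to be re-sent regularly
master.mav.set_position_target_local_ned_send(
    0,  # time_boot_ms (not used)
    master.target_system,
    master.target_component,
    mavutil.mavlink.MAV_FRAME_LOCAL_NED,
    mask,
    0, 0, 0,        # position (ignored)
    1.0, 0.0, 0.0,  # velocity (m/s)
    0, 0, 0,        # acceleration (ignored)
    0, 0,           # yaw, yaw rate (ignored)
)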

Awesome! Thank you for sharing. Looking forward to trying this.

Thank you for sharing this info!
I am trying the steps on a Windows PC. I was able to get Skybrush Live to connect to the Skybrush Server (running in PowerShell), and I am running the “poetry run ap-sitl-swarm” SITL with the given parameters (on Linux under WSL), but I am having a hard time getting the server connection established and getting the drones to show up in Live. The message I am getting is:

9 | Setting SIM_SPEEDUP=1.000000
9 | Suggested EK3_BCOEF_* = 16.288, EK3_MCOEF = 0.209
9 | Home: -35.363201 149.165288 alt=584.000000m hdg=353.000000
9 | Starting sketch 'ArduCopter'
9 | Starting SITL input
9 | Using Irlock at port : 9085
9 | UDP connection 127.0.0.1:14xxx
9 | Loaded defaults from /tmp/sitl-swarm-5swm_sb7/drones/009/default.param
9 | Smoothing reset at 0.001
9 | UDP multicast connection 2xx.2xx.xx.xx:14xxx
9 | multicast bind failed on port 14555 - Cannot assign requested address
9 | Subprocess exited with code 1

It seems that the poetry run command works, but there seems to be a problem with the connection. I’ve tried disabling the firewall just for WSL so far, with no luck.
Any ideas on how I could get around this on Windows would be appreciated 🙂

Try changing the value of the multicast_address variable in src/ap_swarm_launcher/cli/main.py to 127.0.0.1:14555; it seems that “real” multicast addresses are not supported in WSL. I’m not sure this change will work, but it’s worth a try. In the worst case, you won’t get broadcast traffic from Skybrush Live on the simulated drones, but I think it shouldn’t prevent you from uploading and starting a show.

Please let us know whether the trick worked.

May I ask how you time-sync the drones?
You have explained that you have a patched firmware that enables the “drone show mode” and that you update the targets more than 10 times a second, but this only ensures that the drones follow the path at the times the path prescribes.
How do you ensure that the internal clocks do not drift, and how do you ensure that all drones start at the same time?
BTW congratulations on the amazing piece of software that you develop.

All drones use GPS receivers for positioning. The GPS signal is a very precise time signal by nature and can be used for synchronization too. More can be found here.

The response of @VRquaeler is correct (and sorry for the late reply, I haven’t been following this thread recently). The GPS receiver gets a very accurate time signal, and these timestamps are sent to the autopilot so that it can sync its internal clock to GPS time. This is good enough for most show purposes, although I have to mention that the timesync is not 100% accurate – if you do fast synchronized flashes, you might notice a bit of a time difference between individual drones. This does not mean that the GPS time signal is inaccurate, but that there is a delay on the serial line between the GPS and the autopilot, and this delay is not the same across drones. For a more accurate timesync, one would need to use the 1PPS signal of the GPS receiver, which provides a pulse once every second, right at the start of the second. One of the projects I’m doing on the side is to figure out how to use the 1PPS signal to synchronize the clocks even more accurately.
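
If you want to get a feel for this on your own drone, the SYSTEM_TIME MAVLink message carries the autopilot’s idea of UNIX time (derived from the GPS once there is a fix), so you can compare it against the clock of your GCS machine. A rough sketch, again assuming a UDP connection on port 14550:

import time
from pymavlink import mavutil

conn = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
conn.wait_heartbeat()

# SYSTEM_TIME.time_unix_usec is the autopilot's UNIX time in microseconds,
# synchronized to GPS time once the receiver has a fix
msg = conn.recv_match(type="SYSTEM_TIME", blocking=True)
offset = msg.time_unix_usec / 1e6 - time.time()
print(f"autopilot clock differs by {offset * 1000:.1f} ms (includes link latency)")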

Thank you so much for the reply. A lot of information.

This is really fascinating! Thanks for sharing the info 😄

I notice that the maintenance of the Skybrush code does not accurately track ArduPilot mainline. For example:

  1. ReleaseNotes.txt in the Skybrush fork says 4.3.6; however, version.h says 4.4.0-dev.
  2. AP_Arming.cpp line 257 is different from mainline 4.3.6.
  3. There are more files that differ from the mainline stable 4.3.6 version.

So, my question is: which branch/modification of Skybrush should I base my setup on to work well with Skybrush Live or Server? The reason I ask is that most non-beta testers use stable versions and configure/tune the parameters to achieve stable, high-performance flight. Is the posted firmware based on the in-development code or on 4.3.6 stable? If I compare based on this, they are identical, not modified; if I compare based on this and this, they are different.

The other confusing part is that Skybrush says you must use their ArduCopter firmware – or can users actually just use the stock ArduPilot .apj file and change a few of the parameters according to the manual?