Topic:
Rapid Response UAV network for Disaster Relief and Survivor Location using Machine Learning and GPS.
Proposal type: Hardware [X], Software [ ], Other: _________________
Description:
Background:
Disasters, whether natural or man-made, can have devastating impacts on communities, resulting in loss of life, injuries, and widespread damage. In the aftermath of such events, rapid response and effective search and rescue operations are crucial to saving lives and providing necessary relief to affected individuals. Traditional methods of locating survivors often involve extensive ground searches, which can be time-consuming and inefficient, particularly in large or difficult-to-access areas.
In recent years, advancements in unmanned aerial vehicle (UAV) technology have opened up new possibilities for enhancing disaster response efforts. UAVs equipped with advanced cameras, sensors, machine learning systems, real-time communication capabilities, and autonomous navigation can significantly improve the speed and accuracy of locating individuals in distress. By leveraging aerial views, these UAVs can cover vast areas quickly, identifying survivors, assessing damage, and guiding emergency response teams to critical locations.
However, the effective deployment of UAVs in disaster scenarios requires careful consideration of various factors, including environmental challenges, regulatory compliance, and integration with existing emergency response frameworks. As such, there is a pressing need for a dedicated project focused on creating a UAV specifically designed to address these challenges.
Purpose:
This project is designed to streamline the process of finding individuals in a disaster-stricken area, allowing responders to administer care faster; the drone will also be able to deliver a small amount of aid. Our project will incorporate machine learning in the forms of audio detection and infrared object detection. Incorporating more advanced technology will help save more lives and will raise awareness of how technology can be used to save lives.
Our drone uses an infrared camera to detect humans at night, allowing detection even in complete darkness. On top of this, our project innovates by running machine learning, specifically object detection, on the camera feed. The feed is sent back to the ground-station computer running the model, allowing for fast and accurate detection. Our project also applies machine learning to audio: the audio feed is transmitted over radio to the computer, where a second model running in tandem detects human voices. This allows for an even tighter sweep for signs of humans. The project uses GPS to accurately pinpoint the location of any detected humans, and will also incorporate a bracelet that broadcasts a GPS signal to help the drones locate survivors.
Thus, our project goal is to create an unmanned aerial vehicle network, manageable by a single person, capable of streamlining the entire disaster relief and survivor location process for hurricanes.
Project Criteria
Drone must be able to reach speeds of 50 km/h
Drone must be able to transmit video and audio
Drone must be able to detect 99% of humans across all tests over the designated test area
Audio and Video ML models must have over 95% accuracy when tested
Machine learning inference time must be under 100 milliseconds for both video and audio (a timing sketch follows this list)
The project must run in real-time
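As a rough illustration of how the 100-millisecond criterion could be checked on the ground-station computer, here is a minimal timing sketch; the weights path is a placeholder for our trained model, and the dummy frame stands in for a real capture:

# Minimal sketch for checking the <100 ms inference criterion, assuming an
# Ultralytics YOLOv8 model; "best.pt" is a placeholder for our trained weights.
import time
import numpy as np
from ultralytics import YOLO

model = YOLO("best.pt")                            # placeholder weights path
frame = np.zeros((640, 640, 3), dtype=np.uint8)    # dummy frame, timing only

model(frame, verbose=False)                        # warm-up (first call is slower)
times = []
for _ in range(50):
    t0 = time.perf_counter()
    model(frame, verbose=False)
    times.append((time.perf_counter() - t0) * 1000)
print(f"median inference: {sorted(times)[25]:.1f} ms")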
Project Constraints
Budget: $1,000 (For affordability)
Project must be finished by January 21, 2025
Abide by Synopsys Safety Protocols as well as ours:
Safety Protocols (IMPORTANT):
Propellers will be OFF the motors while a person is handling the drone or is within 5 meters of it
Lithium-ion batteries will be kept away from any heat above 50 degrees Celsius and at least 5 meters away from any water.
Drone will only be flown in areas permitted by the FAA, specifically Sunnyvale Baylands Park. Areas that cannot be used for drone flying include anywhere near major airways or airports.
VTX power will be kept under 25 milliwatts
An FAA-compliant Remote ID module will ALWAYS be attached any time the drone is flying.
Flights will take place only in the specific area of Baylands Park designated for UAV flying.
Materials
- FC ESC flight stack Link (x1) ($72)
- SourceOne Frame Link (x1) ($30)
- GPS+Compass Module Link (x1) ($20)
- RadioMaster RP4 Receiver Link (x1) ($25)
- RadioMaster Pocket TX ELRS Link (x1) ($65)
- AKK Ultimate Mini VTX Link (x1) (Already owned)
- Foxeer Micro Link (x1) (Already owned)
- Battery Link (x2) ($86)
- RemoteID compliant module Link ($32)
- DJI Integra Link ($350)
- RunCam Night Eagle HD Link ($150)
- Racer Star br2207s Link (x4) (Already owned)
- Props (Already owned)
- Raspberry Pi 4 (Already owned)
Planned amount: $830 (USD)
Build Procedures
Assemble the frame using provided screws and standoffs following the instruction manual in the kit.
Connect standoffs provided to the four underside corner screw holes of the flight controller using screws provided in the kit.
Connect the electronic speed controller (ESC) under the flight controller standoffs using screws provided in the kit (same screw size and shape).
Connect the ESC wire given in the kit to the ESC port on the flight controller as shown in the diagram. Briefly connect the battery to check for smoke (smoke test).
Solder Analog VTX module to FC as shown in the diagram.
Flash ArduPilot Copter firmware onto the FC using the ArduPilot flasher as described in the docs.
Download, install, and run the Mission Planner ground station software as described in the docs.
Configure the VTX on the correct baud rate and channel as described in the docs.
Solder the thermal camera to the FC as described in the diagram above.
Make sure the camera and VTX work by testing OSD on VRX as described in the docs.
Connect the RunCam Link module to the flight controller and attach the camera.
Bind the DJI Integra goggles to the RunCam Link module.
Set up Cosmostreamer on the Raspberry Pi 4.
Connect Cosmostreamer to the computer, allowing for the camera stream.
Solder ELRS receiver to the drone as shown in the diagram above.
Configure ELRS through the ELRS configurator with a binding phrase over Wi-Fi as described in the docs.
Bind the ELRS receiver to the RadioMaster transmitter with the binding phrase and configurator as described in the docs.
Solder GPS module to FC as shown in the diagram above.
Set up the GPS module using the Mission Planner software as described in the docs.
Configure the ELRS receiver and transmitter to use MAVLink as described in the docs.
Set up ELRS Backpack to communicate with Mission Planner as described in the docs.
Find a dataset with aerial views of people on Roboflow, preferably over 15k images.
Train a YOLOv8 transfer-learning model on the data using Roboflow (see the training sketch after this procedure list).
Find a dataset of audio of people screaming on Roboflow.
Train a model that separates audio of people screaming from background noise (propwash, rushing wind); a sketch of the audio classifier follows this procedure list.
Set up the camera feed on the VRX to become a camera input to the computer running Mission Planner as described in the docs.
Set up the MAVProxy Python module for Mission Planner as described in the docs.
Write a Python script that uses the YOLOv8 PyTorch model and the audio detection model to send a ping to a Streamlit webpage when they detect a human, as described in the docs. It should be able to control waypoints through MAVLink telemetry and DroneKit (see the detection-loop sketch after this procedure list).
Set up channels for the RC controller through Mission Planner as described in the docs.
Configure motor direction and index (Props Out) as described in the docs.
Install DroneKit-SITL to simulate drone swarming as described in the docs (see the SITL sketch after this procedure list).
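Below is a minimal transfer-learning sketch for the video model, assuming the Roboflow aerial-person dataset has been exported in YOLOv8 format; the dataset path, epoch count, and image size are placeholder choices, not tuned values.

# Minimal YOLOv8 transfer-learning sketch; assumes the Roboflow dataset was
# exported in YOLOv8 format. Paths and hyperparameters are placeholders.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")               # pretrained COCO weights as the starting point
model.train(
    data="aerial-people/data.yaml",      # hypothetical Roboflow export path
    epochs=100,
    imgsz=640,
)
metrics = model.val()                    # validate against the 95% accuracy criterion
print(metrics.box.map50)                 # mAP@0.5 on the validation split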
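For the audio model, a minimal inference sketch under stated assumptions: the classifier has already been trained and saved as a TorchScript file, the radio audio is captured as 16 kHz WAV, and the file names and detection threshold are hypothetical.

# Minimal sketch of the audio side: slice the incoming capture into windows,
# turn each into a mel-spectrogram, and classify scream vs. background.
# Model file, capture file, and sample rate are assumptions.
import torch
import torchaudio

SAMPLE_RATE = 16000                      # assumed radio-audio sample rate
WINDOW = SAMPLE_RATE                     # 1-second analysis windows

mel = torchaudio.transforms.MelSpectrogram(sample_rate=SAMPLE_RATE, n_mels=64)
model = torch.jit.load("scream_classifier.pt")   # hypothetical trained model
model.eval()

waveform, sr = torchaudio.load("radio_feed.wav") # placeholder capture file
waveform = torchaudio.functional.resample(waveform, sr, SAMPLE_RATE)

for start in range(0, waveform.shape[1] - WINDOW, WINDOW):
    chunk = waveform[:, start:start + WINDOW]
    spec = mel(chunk).unsqueeze(0)               # (1, channels, mels, frames)
    with torch.no_grad():
        prob = torch.sigmoid(model(spec)).item() # scream probability
    if prob > 0.9:
        print(f"possible human voice at {start / SAMPLE_RATE:.1f}s")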
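Next, a sketch of the detection-loop step: run the YOLO model on the VRX feed, read the drone's GPS through DroneKit, and log any detection for the Streamlit page to display. The telemetry connection string, capture-device index, weights file, and detections file are all assumptions; one simple way to "ping" a Streamlit page is to append detections to a file the dashboard polls.

# Sketch of the detection-to-dashboard loop (not the full script).
import json
import time
import cv2
from ultralytics import YOLO
from dronekit import connect

model = YOLO("best.pt")                                    # placeholder trained weights
vehicle = connect("udp:127.0.0.1:14550", wait_ready=True)  # assumed telemetry link
cap = cv2.VideoCapture(0)                                  # VRX as a capture device

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)
    if len(results[0].boxes) > 0:                # at least one person detected
        loc = vehicle.location.global_frame
        ping = {"lat": loc.lat, "lon": loc.lon, "time": time.time()}
        with open("detections.jsonl", "a") as f: # Streamlit app polls this file
            f.write(json.dumps(ping) + "\n")

Waypoint changes would go through the same vehicle object, e.g. vehicle.simple_goto(LocationGlobalRelative(lat, lon, alt)).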
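Finally, a minimal DroneKit-SITL sketch for testing the control logic without hardware. SITL simulates a single copter, so a swarm test would run several SITL instances on different ports; the takeoff altitude here is arbitrary.

# Minimal DroneKit-SITL sketch: spin up a simulated copter and test takeoff.
import time
from dronekit_sitl import start_default
from dronekit import connect, VehicleMode

sitl = start_default()                       # launch a simulated copter
vehicle = connect(sitl.connection_string(), wait_ready=True)

vehicle.mode = VehicleMode("GUIDED")
vehicle.armed = True
while not vehicle.armed:
    time.sleep(0.5)                          # wait for the simulated copter to arm
vehicle.simple_takeoff(10)                   # test takeoff to 10 m, no hardware needed
time.sleep(15)
vehicle.close()
sitl.stop()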
The project will be thoroughly tested, and results will be formatted on a board and presented at Synopsys Science Fair.
Testing Procedures:
Test that the drone properly arms and responds to radio control.
Test that the propeller direction is correct (IMPORTANT: TEST WITHOUT PROPELLERS)
Test that the drone can fly with radio control (from the RadioMaster transmitter)
Test that the drone can send its camera feed to the video receiver
Test the drone GPS using Mission Planner and a simple pattern.
Test the range of the VTX and RX modules.
Test the ML camera stream for successful video transfer.
Test the ML audio stream for successful audio transfer.
Test the drone with multiple runs using a variety of people. All tests will follow a predefined path; there will be 3 days of 20 runs each, for 60 results across varied weather (see the aggregation sketch after this list).
Test the drone remote arming
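To evaluate the 99% detection criterion across the 60 runs, per-run counts can be aggregated as in the sketch below; the run records shown are placeholders, not real results.

# Aggregates per-run detection counts against the 99% criterion.
# The two records below are placeholders, not real test results.
runs = [
    {"people_present": 4, "people_detected": 4},
    {"people_present": 3, "people_detected": 3},
]
present = sum(r["people_present"] for r in runs)
detected = sum(r["people_detected"] for r in runs)
print(f"detection rate: {100 * detected / present:.1f}% (criterion: >= 99%)")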
This project would show a fully fleshed-out example of how ArduPilot Copter can be used. It covers most of ArduPilot's main features, including autonomous flight and Mission Planner scripting, and demonstrates how versatile the ArduPilot firmware is.
Skills
Our team advanced to the national science fair last year from our local and subsequent state fairs. That project incorporated machine learning, so we have experience in this area. One of our members has experience building drones and currently has four of them, completely self-built.
Milestones: Machine Learning models done by December 31st