Hello, forum users! I am developing an autonomous drone based on an Intel Neural Compute Stick 2 + Raspberry Pi 4 + NVIDIA Jetson Nano. It all looks something like this (for now, a laser serves as the weapon):
A cluster of two minicomputers is the top-level brains. I purchased an Arducopter 2.8 for the role of flight controller. I will not have a remote control; the task will be given at the start point either by wire or by Wi-Fi. Can anyone quickly tell me the easiest way to control an ArduCopter?..
ArduCopter is firmware, but if by “Arducopter 2.8” you mean an APM 2.8 flight controller, those are obsolete. It would be a mistake to start a project with one of those.
If you mean the 8-bit architecture of the flight controller, we don’t care. The main thing is that it handles its job with a large margin. We have a variant with an Arduino DUE-based controller with adapted MultiWii firmware and separate sensors, but it looks rather “scary” in comparison:
For us, the top-level control system is much more important. And it can do a lot, for example tracking up to 200 targets in real time:
Don’t underestimate the flight controller; that’s the part that will keep your companion computer alive…
And if you want to implement machine learning, you need the latest features of the 4.x ArduPilot firmware release, not the five-year-old code that your outdated flight controller is limited to.
Yes, I understand that the flight controller is also important. But machine learning is already implemented on the RPi and the JN. In general, here is an advertising brochure for participation in one of the local exhibitions (sorry for the machine translation):
Autonomous multi-purpose operational-tactical strike drone
(top-level control system)
Attention! The project is being developed exclusively for use in game shooters such as tactical and team paintball and laser tag, including those involving UAVs (aerial combat). Any parallels and speculation about possible alternative uses are incorrect.
Exhibition restrictions: batteries removed (powered from a stationary unit), laser installations deactivated.
- Project Manager: – (eng. by CCU)
- Participation in the project: – (9th-grade student)
In terms of competition activities, the project is being implemented for participation in all-Russian thematic exhibitions and competitions, and accreditation in the Intel and NVIDIA corporate competitions has been confirmed.
The autonomous multi-purpose operational-tactical strike drone is designed to independently perform operational-tactical tasks (without external control) in the area designated at the start point.
Tasks are performed only in offline mode, including as part of a drone swarm (collective AI), using artificial-intelligence algorithms of the second kind (according to Hintze) with neural-network processing of input data.
Main performance characteristics
- Practical ceiling (without suspended equipment): 3600 m above sea level.
- Maximum horizontal speed (without suspended equipment): 50 km/h.
- Maximum range of autonomous task execution: 12 km.
- Real-time identification and tracking of up to 200 targets of different types simultaneously.
- Suspended payload (configuration): guided and unguided, passive, active, and reactive special devices (for paintball and laser tag).
- Payload option:
- pulse high-energy installation (laser spark): for laser tag and/or for suppressing optical and infrared detection and guidance systems of ground vehicles and UAVs, as well as for building a 3D obstacle map.
- Main onboard computer hardware:
- a cluster of two onboard computers with hardware neural-network accelerators; total computing power 2.1 TFLOPS (FP16):
- 128-core NVIDIA Maxwell GPU with support for CUDA parallel computing, 500 GFLOPS;
- convolutional-neural-network accelerator based on the Intel Myriad X vector processor (16 vector tensor cores), 800 GFLOPS;
- convolutional-neural-network accelerator based on the Gyrfalcon matrix processor (a matrix of more than 28,000 MACs), 800 GFLOPS.
- AI implementation:
- neural networks for recognition, categorization, object detection, and segmentation;
- the neural-network part of the decision-making system;
- the part of the decision-making system based on deterministic algorithms;
- a high-level adaptive model (depending on conditions, autonomous selection of which neural networks to use).
- Interfaces used:
- Serial, SPI, Ethernet, I2C, PWM, discrete logic signals.
I think we cannot do without DroneKit for our task. I suspect I’ll have to dig into the ArduPilot source code to adapt it. Communication with the RPi via UART is also in great question: it has a cut-down UART, and to avoid bugs you have to reduce the processor clock speed, which we can’t do. Besides, I do not want to fully emulate the exchange over UART. Ideally, we need no more than 10 simple commands (takeoff, left/right, up/down, and so on). And all of this should be driven by the drone’s own top-level brains. I read the instructions here on the site for communicating with the RPi; I think that isn’t our case… Has anyone worked with the firmware source code? Which way should we dig?..
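To make the “no more than 10 simple commands” idea concrete, here is a minimal sketch of the command layer I have in mind. The names, the 2 m/s speed, and the velocity-tuple convention are illustrative assumptions, not our actual code; a MAVLink layer (e.g. pymavlink’s SET_POSITION_TARGET_LOCAL_NED stream) would consume these setpoints each cycle:

```python
# Sketch: ~10 simple high-level commands mapped to body-frame velocity
# setpoints (vx, vy, vz in m/s). NED convention: positive vz is down.
# SPEED and the command table are illustrative assumptions.

SPEED = 2.0  # m/s, assumed cruise speed for this sketch

COMMANDS = {
    "hover":   (0.0, 0.0, 0.0),
    "forward": (SPEED, 0.0, 0.0),
    "back":    (-SPEED, 0.0, 0.0),
    "left":    (0.0, -SPEED, 0.0),
    "right":   (0.0, SPEED, 0.0),
    "up":      (0.0, 0.0, -SPEED),
    "down":    (0.0, 0.0, SPEED),
}

def setpoint_for(command: str):
    """Translate a simple top-level command into a velocity setpoint."""
    try:
        return COMMANDS[command]
    except KeyError:
        raise ValueError(f"unknown command: {command!r}")
```

The point is that the top-level brains only ever emit one of these few symbols; everything else (attitude, motors) stays inside the flight controller.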
Implementing a companion computer with an RPi or an NVIDIA board is pretty common and relatively well documented in the wiki.
Please note that we do not encourage weaponisation of ArduPilot systems, and flying such a vehicle configuration is forbidden in many countries. Having said that, it would be interesting to compare the acceleration between the Myriad and the Gyrfalcon.
I looked through all these links. It uses the UART, but starting (if I remember correctly) with the RPi 2+, stable UART operation requires reducing the processor clock speed. And we can’t do that: too much computation depends on it.
As for the Jetson, it is already heavily loaded (for example, the Chinese neural stick with the Gyrfalcon chip on board is also connected to it), so communication with the flight controller is assumed to go through the RPi. As for performance comparisons of the NCS2 and the Gyrfalcon stick, I only ran the same classification networks from the TensorFlow package. The results are approximately the same: on an 800 × 800 image, the processing rate is 20-25 frames per second. I didn’t run direct computational tests, but in practice I believe Intel’s stated 800 GFLOPS.
As for “weapons”, I think there is no need to take this to the point of absurdity: in the end, a toothpick can become a weapon under certain conditions.
As for serial communication with the RPi 3B+ and the 4, you need to disable Bluetooth to recover the “true” UART in order to get higher speeds (250 kbps and above). As written here:
disable-bt disables the Bluetooth device and restores UART0/ttyAMA0 to GPIOs 14 and 15. You also need to disable the system service that initialises the modem so it doesn’t use the UART:
sudo systemctl disable hciuart
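For reference, on Raspbian the steps above amount to the following (a sketch of the usual procedure; adjust to your image):

```shell
# /boot/config.txt: free the full PL011 UART on GPIO 14/15
# by disabling the on-board Bluetooth via the device-tree overlay
enable_uart=1
dtoverlay=disable-bt

# then stop the modem-init service from grabbing the UART, and reboot
sudo systemctl disable hciuart
sudo reboot
```

After the reboot, /dev/ttyAMA0 is the full-speed UART instead of the clock-dependent mini-UART, which is exactly why the CPU-frequency problem goes away.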
Huge thanks for this! The fact is that we haven’t tried working with the RPi serial port ourselves, precisely because we read about the communication problems. But if this is true, then “half” of the problem is solved. It remains to decide how to avoid emulating the full exchange protocol. We’ll try all this on the weekend, since this is a hobby for me and I’m at my main job right now. A 9th-grade student is also working with me on the project (he is at school now)…
As described in the wiki above, using DroneKit-Python based on the MAVLink control protocol really makes interfacing with the flight controller a fast and easy task.
Once again I recommend that you upgrade your existing FC to a more capable one, like a Pixhawk or a Cube (depending on budget), because you have a good chance of getting stuck with incompatibilities or missing features.
Let’s think… It’s not so much about the money as about waiting for a Pixhawk to ship again. By the way, with the ArDUE we worked without any serial at all: the corresponding signals are fed to digital inputs from the RPi, and that’s it. Of course, I had to sweat over the MultiWii firmware to do it. So I wonder, is it possible to adapt the firmware here as well, so that these signals can be used as we need instead of, say, controlling the payload?.. Everything else can (and should) be managed directly by the RPi and the JN. Tonight I will show you on video how it works now…
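The “no serial” idea boils down to encoding each of our simple commands as a bit pattern on a few discrete lines. A sketch of what I mean (the pin count and the command table are illustrative, not from our actual firmware):

```python
# Sketch: encode ~10 simple commands on 4 discrete GPIO lines from the
# RPi to the flight controller; 4 lines give room for 16 commands.
# Command table and line count are illustrative assumptions.

LINES = 4  # number of discrete GPIO lines

COMMAND_CODES = {
    "hover": 0, "takeoff": 1, "land": 2,
    "forward": 3, "back": 4, "left": 5, "right": 6,
    "up": 7, "down": 8,
}

def encode(command: str):
    """Logic level for each discrete line (LSB first), as the RPi sets them."""
    code = COMMAND_CODES[command]
    if code >= 1 << LINES:
        raise ValueError("command does not fit on the available lines")
    return [(code >> i) & 1 for i in range(LINES)]

def decode(levels):
    """What the adapted firmware would do when reading its digital inputs."""
    code = sum(bit << i for i, bit in enumerate(levels))
    for name, c in COMMAND_CODES.items():
        if c == code:
            return name
    raise ValueError(f"no command for code {code}")
```

On the firmware side this replaces the payload-channel handling with a 4-bit read; no UART protocol emulation is needed at all.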
Here is the video that came out of it. Half of the brain works: only the RPi + NCS2. The Jetson isn’t running because it overheats quickly without active cooling. This is without a flight controller. If it were possible, we would hang the flight control on the computers as well. However, correct drone control needs a program cycle time of 2-3 ms, while a “normal” Linux OS gives at least 10 ms, and unstably at that. This is too much. If you use something like QNX (or a Linux RTOS), it might work. Although it is unlikely…:
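The scheduling problem above is easy to demonstrate: ask a stock Linux for a 2.5 ms cycle and measure what you actually get. A small sketch (the period and cycle count are arbitrary; on a non-realtime kernel the worst observed cycle typically overshoots the requested period):

```python
# Sketch: measure how well a plain sleep-based loop holds a 2.5 ms
# period on a non-realtime OS. Numbers are illustrative.
import time

def measure_jitter(period_s=0.0025, cycles=200):
    """Return (mean, worst) observed cycle time in seconds."""
    samples = []
    prev = time.monotonic()
    for _ in range(cycles):
        time.sleep(period_s)
        now = time.monotonic()
        samples.append(now - prev)
        prev = now
    return sum(samples) / len(samples), max(samples)

mean, worst = measure_jitter()
```

The gap between `worst` and the requested period is exactly why a motor-control loop with a 2-3 ms budget does not belong on a general-purpose Linux.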
The guys who deal with this topic contacted me; right now they use external control, and we will work on autonomy together…:
Will this one do? (Perhaps this one too):
Actually, we have many research areas, including flight controllers. For example, a colleague is currently working on his own controller project based on an SoC = FPGA + ARM (Cyclone V + dual-core Cortex-A9). On this basis:
Feedback from the sensors is already implemented, and now I am “suffering” with the control of the brushless motors. The main problem there is generating the sawtooth control signal for switching the field-effect transistors…
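For the curious, the sawtooth scheme mentioned above is just a carrier-comparator PWM: a rising sawtooth is compared with the requested duty, and the FET gate is held high while the carrier is below the duty level. A pure-arithmetic sketch of the idea (in the real design this would live in the Cyclone V fabric, not in Python; the 10 kHz carrier is an illustrative assumption):

```python
# Sketch: sawtooth-comparator PWM for a FET gate drive.
# Carrier frequency and duty values are illustrative.

def sawtooth(t, period):
    """Normalised rising sawtooth carrier in [0, 1)."""
    return (t % period) / period

def fet_gate(t, period, duty):
    """1 while the carrier is below the duty threshold, else 0."""
    return 1 if sawtooth(t, period) < duty else 0

# at 50% duty the gate is high for the first half of each carrier period
PERIOD = 1e-4  # 10 kHz carrier, assumed for the sketch
```

Varying `duty` per commutation step is then what shapes the effective voltage applied to each motor phase.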
Yes, most of the Pixhawk clones are OK, but there are some reports of marginal sensors shipped with these units.
As for the Cyclone dev board, it reminds me of the Intel Aero FPGA integration.
Yes, it does resemble the Intel Aero… But there is a fundamental difference. Judging by the description, there you can only configure some standard interfaces, including those for driving motor controllers (ESCs). In our design, however, the regulator functions are integrated into the overall system (SoC); at the outputs we only have switches in the form of field-effect transistors.
By the way, it is now “fashionable” to do everything on one chip. For example, a few years ago DARPA announced a tender for a target-recognition system for the F-35 on a single chip (SoC). Incidentally, on one of the requirements for that system our project exceeds the stated parameters: DARPA requires a specific performance of 25-50 GFLOPS/W, and we have 70…
… However, that is the only requirement from the set that our project meets…
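A quick cross-check of the efficiency figure, using only the numbers quoted earlier in this thread (the wattage is inferred from those numbers, not measured):

```python
# Cross-check: 2.1 TFLOPS (FP16) total at the claimed 70 GFLOPS/W
# implies roughly a 30 W compute budget for the cluster.
total_gflops = 500 + 800 + 800   # Jetson + NCS2 + Gyrfalcon, per the brochure
efficiency = 70                  # GFLOPS/W claimed above
implied_watts = total_gflops / efficiency
darpa_min, darpa_max = 25, 50    # GFLOPS/W required by the DARPA tender
```

So the claim is internally consistent: 2100 GFLOPS at about 30 W lands above the 25-50 GFLOPS/W band.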
It seems there are a lot of good reviews…
While we are all sitting out a week-long quarantine from the main job (and maybe longer, depending on how the situation develops), I decided to play with ArduPilot, especially since the Pixhawk is stuck in shipping from China anyway. First of all, I disassembled the Chinese clone and looked at what is inside. Quite decent soldering (only some places on the back of the contact groups have unwashed flux).
The “good old” AVR chip ATmega2560 and a set of sensors; everything seems to be in place. One unpleasant design feature: the board dangles inside the outer case like a “snot in a glass”, held only by a piece of foam rubber laid between it and the top cover. The board’s mounting holes do not line up with the housing’s mounting holes and are not used at all. And the fastening scheme, in which one of the cover screws is twice as large as the others, is generally a “masterpiece”. I hope it is only like this in the clone, and the original is done properly?..
I looked at the projects for GSoC here.
In principle, they all lie on the surface and are, so to speak, “far-fetched”. In fact, what does the concept “on one chip” imply? It implies improving certain functions at smaller weight and size. That is the main thing. We are currently working on overall reliability, which is why we are thinking of creating two independent control channels. Having that capability would be a really good and important task for an SoC…