Thank you .
I had a follow-up question:
- How do you remotely log in to the Nano: do you ssh, or use some other method? And do you use a wifi network for that?
- Do you start your algorithm on the Nano manually every time you start PH2, or have you programmed the Nano to run your program on boot?
Yes, you can ssh over wifi, or use a 900 MHz telemetry radio, or Bluetooth to the TTY console port.
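For example, over wifi it is just a standard ssh session, and the console port works with any serial terminal program (the address, user name, and device below are placeholders, substitute your own):

```shell
# Remote login over wifi (example address and user):
ssh nano@192.168.1.42

# Or attach to the TTY console port through a USB-serial adapter:
picocom -b 115200 /dev/ttyUSB0
```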
I am currently using a jd-RF900Plus for telemetry from the Pixhawk 4, and if I use another one to communicate with the Nano, they might interfere. I will have to look for an alternative.
But thanks for answering my questions.
How do I run MAVProxy automatically at startup?
Hi! You should use the JGND, J8, J10 pinouts for the UART connection.
mavproxy.py --master=/dev/ttyTHS1 --baudrate 921600 --aircraft MyCopter
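Regarding the autostart question above: one common approach is to wrap that command in a systemd unit so it launches on boot. This is only a sketch; the user name and the mavproxy.py install path are assumptions to adjust for your system:

```ini
# /etc/systemd/system/mavproxy.service  (sketch; adjust User and ExecStart path)
[Unit]
Description=MAVProxy link to Pixhawk
After=network.target

[Service]
User=nano
ExecStart=/usr/local/bin/mavproxy.py --master=/dev/ttyTHS1 --baudrate 921600 --aircraft MyCopter
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Then enable it once with `sudo systemctl enable --now mavproxy.service`.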
It’s stuck on MAV> link 1 down.
I’ve wired GND, and RX to TX and vice versa, but there is no response from the Pixhawk. What else do I need to do with the /dev/ttyTHS1 port?
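One thing worth checking on /dev/ttyTHS1 specifically: on the Jetson Nano that UART is typically claimed by the nvgetty serial console, which blocks other programs from using it. The commands below (a one-time setup sketch) free the port and grant your user access:

```shell
# Stop and disable the serial console that holds /dev/ttyTHS1 by default
sudo systemctl stop nvgetty
sudo systemctl disable nvgetty

# Give your user permission on serial devices (takes effect after re-login)
sudo usermod -aG dialout $USER
```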
Can I do this without a serial converter? I already have a wifi module on a USB port, and I intend to attach two cameras for stereo vision, so adding another USB device would take up bandwidth. (Will this be an issue since it’s USB 3.0?)
As quoted above, you can use the onboard serial port.
Are you able to read any data from the flight controller using a simple tool like microcom?
If you connect to a MAVLink-enabled port, you should be able to read some readable data mixed with binary data.
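As a rough illustration of what that mixed stream looks like, here is a small stdlib-only Python sketch (the byte buffer is synthetic, standing in for what microcom would show). It scans raw bytes for the MAVLink frame-start markers; seeing them regularly is a quick heuristic that the port is really speaking MAVLink:

```python
# Scan a raw byte stream for MAVLink frame-start markers.
# 0xFE opens a MAVLink v1 frame, 0xFD a MAVLink v2 frame; the bytes in
# between are payload/CRC binary data, often interleaved with readable
# plain-text STATUSTEXT content.
# Note: payload bytes can also happen to equal 0xFE/0xFD, so this is a
# heuristic check, not a full frame parser.

MAV_V1_STX = 0xFE
MAV_V2_STX = 0xFD

def count_frame_starts(stream: bytes) -> dict:
    """Count candidate MAVLink v1/v2 frame-start bytes in a buffer."""
    counts = {"v1": 0, "v2": 0}
    for b in stream:
        if b == MAV_V1_STX:
            counts["v1"] += 1
        elif b == MAV_V2_STX:
            counts["v2"] += 1
    return counts

if __name__ == "__main__":
    # Synthetic capture: readable text mixed with binary frame bytes.
    sample = b"APM:Copter\xfd\x09\x00\x00\x01\x01\x00\xfe\x06text"
    print(count_frame_starts(sample))  # prints {'v1': 1, 'v2': 1}
```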
Ahh OK, I looked at your site. I see you can connect with DroneKit.
The most striking issue I see is here:
Autopilot Firmware version: APM:Copter-3.4.6
First of all, upgrade the firmware to the latest version (4.0.2 or more recent),
And then we can start talking
Thanks ppoirier, I will update the firmware to 4.0.2.
What leads you to the conclusion that my project is not ready to implement and test? I have it up and running, along with a full implementation of the original Redtail project on ArduCopter plus a bunch of enhancements.
In case you run into any problems, please open an issue in my git repo.
I created it for the TX2 but nothing in the code should prevent it from running on a Nano.
If you can communicate with ArduCopter through MAVLink and even arm the drone as you described in your paper, then the issue is likely with the drone. I suggest first checking your logs for any prearm check failures.
mtbsteve, thanks for letting me know that it works on the Jetson Nano. I started with Redtail initially (a couple of months ago) but was unsuccessful, maybe because of my own ignorance. It would be a great help if you could add a couple of lines on GitHub about any additional requirements/configuration for the Jetson Nano, or a small implementation document for it.
I will first update the firmware to 4.0+, rerun, and verify the prearm logs; I will report back all successful/unsuccessful results. Once again, thanks for pointing me in the right direction.
mtbsteve, do you think there is a connection problem between the Nano and the Pixhawk? Are the pins correctly connected as shown in my document?
You wrote in your documentation that the communication is working. If your wiring were wrong, you would not receive any results from, e.g., your MAVProxy scripts. I have no clue what you are trying to do with those UDACIDRONE packages.
Test it with some simple MAVLink commands, as explained e.g. here, and look at the ArduCopter logs on the flight controller for the response.
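For instance, from the MAVProxy prompt you can exercise a few commands and watch how the flight controller reacts (a sketch; the exact parameters and modes depend on your setup):

```
MAV> param show ARMING_CHECK     # see which prearm checks are enabled
MAV> mode GUIDED                 # request a flight-mode change
MAV> arm throttle                # attempt to arm; a failure prints the prearm reason
```

If `arm throttle` is rejected, the printed prearm message usually points straight at the failing check.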
Thanks ppoirier, I have upgraded the autopilot firmware from APM:Copter-3.4.6 to APM:Copter-4.0.1.
Everything is working as expected; I have uploaded the results at the following link: connecting jetson-nano to pixhawk
Hi, I would like to know whether the Jetson Nano can be used with ArduCopter for obstacle avoidance. The ArduCopter wiki suggests using OpenKAI for obstacle avoidance, which is not compatible with the NVIDIA Jetson Nano. Can you please guide/help me in this regard?
Thank you in advance.
Obstacle avoidance can be achieved with or without a companion computer, as described on the wiki. The Nano is one option if you want to experiment with some basic avoidance, but it lacks the power for more advanced work.
@ppoirier, I am working on obstacle avoidance, and the idea is to use a ZED camera as a means of depth sensing, configured with an NVIDIA Jetson card running OpenKAI. With this methodology, I think a companion computer is a must. If I am right, can you please advise on the following:
- Can we use the Jetson Nano, or do we have to configure a Jetson TX2 as mentioned in the wiki?
- Is this methodology, i.e. ZED + Jetson + OpenKAI, a robust way to do obstacle detection/avoidance?
Actually, I have found videos using this methodology, but the copter is moving at slow speeds. I have concerns and would like to know whether it will work at high speeds, and whether the distance measurement will be accurate.
OpenKAI is a sponsor of Randy Mackay (@rmackay9); he might comment on the status of development. The ZED camera requires quite a lot of processing, which is why they use the TX2.
This is why I prefer the Intel T265: it has a lot of power inside (a Movidius visual processor and an ASIC) and needs just an RPi 4 to get almost the same result.
@ppoirier, thanks for your guidance. Can you please advise on obstacle avoidance using the Intel T265 with a Raspberry Pi 4?
- What firmware should run on the Intel T265 and the Raspberry Pi 4?
- How do I configure it with ArduCopter 4.0.2?
- How robust will this be for the copter’s obstacle avoidance at high speeds, given that the distance measurement maxes out at around 10-12 m?