I am currently using a jd-RF900Plus for telemetry from the Pixhawk 4, and if I use another one to communicate with the Nano, they might interfere. I will have to look for an alternative.
It's stuck on MAV> link 1 down.
I've wired GND to GND and RX to TX (and vice versa), but there is no response from the Pixhawk. What else do I need to do with the /dev/ttyTHS1 port?
Can I do this without a serial converter? I already have a WiFi module on the USB port and I intend to attach two cameras for stereo vision, so adding another USB device would take up bandwidth. (Will this be an issue, since it's USB 3.0?)
Are you able to read any data from the flight controller using a simple tool like microcom?
If you connect to a MAVLink-enabled port, you should be able to see some readable text mixed with the binary data.
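If you don't have microcom installed, a quick alternative is to dump raw bytes with pyserial. This is only a sketch; the port name and baud rate are assumptions and must match your wiring and the SERIALx_BAUD setting on the flight controller:

```python
# Minimal raw-read test on the Jetson Nano UART, assuming 57600 baud
# and ArduPilot sending MAVLink on that port. Requires pyserial.
import serial

PORT = "/dev/ttyTHS1"   # assumed port, as discussed above
BAUD = 57600            # assumed; must match SERIALx_BAUD on the flight controller

with serial.Serial(PORT, BAUD, timeout=2) as ser:
    chunk = ser.read(256)   # read whatever arrives within the timeout
    print("received %d bytes" % len(chunk))
    # MAVLink 1 frames start with 0xFE, MAVLink 2 frames with 0xFD
    print("MAVLink start bytes seen:", chunk.count(0xFE) + chunk.count(0xFD))
    print(chunk)            # readable text (e.g. statustext) mixed with binary
```

If this prints zero bytes, the wiring or the serial port configuration is the problem, not MAVLink.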
Ah OK, I looked at your site. I see you can connect with DroneKit.
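For reference, I assume a minimal DroneKit connection test would look roughly like this (the port and baud rate are my assumptions and need to match the wiring and serial settings):

```python
# Minimal DroneKit connectivity check over the Nano UART.
# Requires dronekit (pip install dronekit).
from dronekit import connect

vehicle = connect("/dev/ttyTHS1", baud=57600, wait_ready=True)

print("Autopilot firmware:", vehicle.version)   # should report the Copter version
print("Mode:", vehicle.mode.name)
print("Is armable:", vehicle.is_armable)

vehicle.close()
```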
The most striking issue I see is this: Autopilot Firmware version: APM:Copter-3.4.6
First of all, upgrade the firmware to the latest release (4.0.2 or more recent).
Then we can start talking.
Hi snitchai,
What leads you to the conclusion that my project is not ready to implement and test? I have it up and running, along with a full implementation of the original Redtail project on ArduCopter plus a number of enhancements.
If you run into any problems, please open an issue in my Git repository.
I created it for the TX2 but nothing in the code should prevent it from running on a Nano.
If you can communicate with ArduCopter through MAVLink and even arm the drone as you described in your paper, then the issue is likely with the drone itself. I suggest first checking your logs for any pre-arm check failures.
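Besides the dataflash logs, pre-arm failures are also broadcast over MAVLink as STATUSTEXT messages, so you can watch for them directly on the companion link. A rough pymavlink sketch (port and baud rate are assumptions, adjust to your setup):

```python
# Watch for pre-arm failure messages on the companion-computer link.
# Pre-arm failures arrive as STATUSTEXT messages starting with "PreArm:".
from pymavlink import mavutil

master = mavutil.mavlink_connection("/dev/ttyTHS1", baud=57600)
master.wait_heartbeat()
print("Heartbeat from system %d" % master.target_system)

while True:
    msg = master.recv_match(type="STATUSTEXT", blocking=True, timeout=60)
    if msg is None:
        break
    print(msg.severity, msg.text)   # e.g. "PreArm: Compass not calibrated"
```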
mtbsteve, thanks for letting me know that it works on the Jetson Nano. Initially (a couple of months ago) I started with Redtail but was unsuccessful, maybe because of my own ignorance. It would be a great help if you could add a couple of lines on GitHub about any additional requirements/configuration for the Jetson Nano, or a small implementation document for it.
I will first update the firmware to 4.0+, rerun, and verify the pre-arm logs. I will report back on all successful/unsuccessful attempts. Once again, thanks for pointing me in the right direction.
You wrote in your documentation that the communication is working. If your wiring were wrong, you would not receive any results from, e.g., your MAVProxy scripts. I have no clue what you are trying to do with those UDACIDRONE packages.
Test it with some simple MAVLink commands, as explained e.g. here, and check the ArduCopter logs for the flight controller's response: http://ardupilot.github.io/MAVProxy/html/index.html
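If you prefer a script over the interactive MAVProxy console, the same kind of check can be done with pymavlink, e.g. sending an arm request and printing the autopilot's acknowledgement. The connection string and baud rate below are assumptions:

```python
# Send an arm request and print the COMMAND_ACK, so the ArduCopter response
# (accepted, or rejected e.g. by a failed pre-arm check) is visible on the
# companion side as well.
from pymavlink import mavutil

master = mavutil.mavlink_connection("/dev/ttyTHS1", baud=57600)
master.wait_heartbeat()

master.mav.command_long_send(
    master.target_system, master.target_component,
    mavutil.mavlink.MAV_CMD_COMPONENT_ARM_DISARM,
    0,          # confirmation
    1,          # param1: 1 = arm, 0 = disarm
    0, 0, 0, 0, 0, 0)

ack = master.recv_match(type="COMMAND_ACK", blocking=True, timeout=10)
print(ack)      # result 0 (MAV_RESULT_ACCEPTED) means the arm command was taken
```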
@LuckyBird, @ppoirier
Hi, I would like to know whether the Jetson Nano can be used with ArduCopter for obstacle avoidance. The ArduCopter wiki suggests using OpenKAI for obstacle avoidance, which is not compatible with the NVIDIA Jetson Nano. Can you please guide/help me in this regard?
Thank you in advance
Regards
Yasir Khizar
Hello,
Obstacle avoidance can be achieved with or without a companion computer, as described on the wiki. The Nano is one option if you want to experiment with some basic avoidance, but it lacks the power for more advanced work.
@ppoirier, I am working on obstacle avoidance, and the idea is to use a ZED camera for depth sensing, paired with an NVIDIA Jetson board running OpenKAI. With this approach, I think a companion computer is a must. If I am right, can you please advise on the following:
Can we use the Jetson Nano, or do we have to configure a Jetson TX2 as mentioned in the wiki?
Is this approach, i.e. ZED + Jetson + OpenKAI, a robust way to do obstacle detection/avoidance?
I have found videos using this approach, but the copter moves at slow speeds. I would like to know whether it will work at higher speeds and whether the distance measurement will still be accurate.
Thank you
Regards
Yasir Khizar
OpenKAI is sponsored by Randy Mackay (@rmackay9); he might comment on the status of development. The ZED camera requires quite a lot of processing, which is why they use the TX2.
This is why I prefer the Intel T265, as it has a lot of processing power on board (a Movidius visual processor and ASIC) and requires just an RPi 4 to get almost the same result.