
Indoor autonomous flight with Arducopter, ROS and Aruco Boards Detection


(Subodh Mishra) #17

Thanks, I will try this.


(ppoirier) #18

Here is a simple pipeline that works:
gst-launch-1.0 rtspsrc location=rtsp://192.168.99.1/media/stream2 ! decodebin ! autovideosink

The problem is that it is transmitting high-res video = too much delay; we need to strip it down.

This one is a copy of the Mission Planner pipeline = 230 ms latency:
gst-launch-1.0 rtspsrc location=rtsp://192.168.99.1/media/stream2 buffer-mode=1 latency=100 ! application/x-rtp ! rtph264depay ! avdec_h264 ! videoconvert ! video/x-raw,format=BGRA ! autovideosink


(Subodh Mishra) #19

I am trying to use the ROS driver gscam to do this.
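
For reference, a GSCAM_CONFIG along these lines might work for the RTSP stream from the pipeline above (an untested sketch; the URL and latency value are copied from @ppoirier's pipeline, and gscam appends its own sink after videoconvert):

<env name="GSCAM_CONFIG" value="rtspsrc location=rtsp://192.168.99.1/media/stream2 latency=100 ! rtph264depay ! avdec_h264 ! videoconvert"/>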


(Andrea Belloni) #20

@ppoirier and @tiffo I don’t know if all the steps outlined by @ppoirier will become video tutorials; in any case I will be available to provide further details.

Beginning with the gstreamer over WiFi part:
I am using a dedicated router as access point, with only my desktop PC on Ethernet and the Raspberry Pi as a WiFi client. From my tests this configuration gives lower latency than the one in which the Raspberry Pi is configured as access point and the desktop PC connects via a USB WiFi adapter.

On The Raspberry Pi I launch this gstreamer pipeline:
gst-launch-1.0 -v rpicamsrc bitrate=10000000 rotation=180 exposure-mode=10 awb-mode=0 awb-gain-red=1 awb-gain-blue=2 iso=800 shutter-speed=10000 contrast=50 ! "image/jpeg,width=640,height=480,framerate=30/1" ! udpsink host=192.168.10.16 port=9000

rotation=180 is due to the way my camera is mounted; the other parameters serve to fix the exposure and gain and raise the contrast, which gives better marker detection.

On the desktop PC the gscam ROS node has the following parameter to receive the stream:
<env name="GSCAM_CONFIG" value="udpsrc port=9000 ! jpegdec ! videoconvert"/>

This gives me a delay of about 80 ms.
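
As a quick sanity check of the receive side, one can subscribe to the gscam image topic and print the frame period and the stamp-to-now delay. This is only a sketch and only measures delay downstream of where gscam stamps the frame, not true glass-to-glass latency; the topic name /camera/image_raw is an assumption:

#!/usr/bin/env python
# Sketch: report stamp-to-now delay and inter-frame period on the image topic.
# Measures only delay after gscam stamps the frame, not glass-to-glass latency.
import rospy
from sensor_msgs.msg import Image

last = [None]

def cb(msg):
    now = rospy.Time.now()
    delay_ms = (now - msg.header.stamp).to_sec() * 1000.0
    if last[0] is not None:
        period_ms = (now - last[0]).to_sec() * 1000.0
        rospy.loginfo("delay %.1f ms, inter-frame %.1f ms", delay_ms, period_ms)
    last[0] = now

rospy.init_node("latency_probe")
rospy.Subscriber("/camera/image_raw", Image, cb, queue_size=1)
rospy.spin()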


(Tiziano Fiorenzani) #21

Thanks Andrea. Another thing I’d really like to narrow down in one video is how to capture the video and then pipe it to both the onboard OpenCV and the streamer, and then also stream the modified frames from OpenCV. That, I think, would be a nice step one.


(Geert Folkertsma) #22

Would it be possible to run the aruco board detection on the RPi (or a different small companion computer) onboard the quad?


(Andrea Belloni) #23

Ciao Tiziano,
it is an interesting thing but it is a little different from what I have done in my system. If I understood correctly, you would like to capture, process and stream the video directly with OpenCV without using ROS, and I have no experience in this.
This could be done with ROS using usb_cam or gscam to capture, cv_bridge to convert ROS images into OpenCV images and vice versa, and gscam to stream using gstreamer pipelines; a minimal sketch is below.
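
As a rough illustration (not code from my setup), the plumbing could look like this, assuming the camera driver publishes on /camera/image_raw and the rectangle stands in for the real OpenCV processing; the processed topic can then be fed to a gscam/gstreamer pipeline for streaming:

#!/usr/bin/env python
# Sketch: subscribe to camera frames, process them with OpenCV via cv_bridge,
# and republish the modified frames on a new topic.
# /camera/image_raw and /camera/image_processed are assumed topic names.
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def cb(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    # Placeholder for the real processing (e.g. a marker detection overlay)
    cv2.rectangle(frame, (100, 100), (200, 200), (0, 255, 0), 2)
    pub.publish(bridge.cv2_to_imgmsg(frame, encoding="bgr8"))

rospy.init_node("opencv_pipe")
pub = rospy.Publisher("/camera/image_processed", Image, queue_size=1)
rospy.Subscriber("/camera/image_raw", Image, cb, queue_size=1)
rospy.spin()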


(Andrea Belloni) #24

Not on the Raspberry Pi 0, the one I use; maybe on a Raspberry Pi 3, but I have not tested it.
In my system I have tried to do everything with minimal and cheap hardware, so I use my desktop PC for the heavier processing. This also has the advantage of simplifying the testing of other visual pose estimation packages. Working on a PC is much easier and faster than working on an embedded board.


(Stacky) #25

Is the RPi 0 mounted on the quadcopter?
And if it is, would it be possible to use it fully as the receiving end, with OpenCV installed on it, so it can send the commands wirelessly back to the quad?


(Andrea Belloni) #26

Yes, the Raspberry Pi 0 is mounted on the quadcopter; it is all explained in the first post.
The Raspberry Pi 0 does not have enough computing resources to do the pose estimation on board (at least with the method I am using), so the pose estimation is done on the desktop PC and the result is then transmitted to the quadcopter.

Now I am doing the same thing but on a slightly bigger quad (160 mm) with a Raspberry Pi 3 on board. In this way I am able to do all the calculations on the quadcopter without using the desktop PC; I use it only for visualization with rviz.

When I have organized all the steps better I will post all the info about this new setup.


(ptroll) #27

Hi everyone!

I’m working on indoor positioning of a drone; for this, I followed the published description: http://ardupilot.org/dev/docs/ros-aruco-detection.html

However, during the test flight, which is a straight climb to 1 meter and then landing, I experienced the strongly fluctuating behavior shown here:
https://photos.app.goo.gl/qrNmJD3KZtkxjtDX8

I’m using a Raspberry Pi 3 B+ with Ubuntu 16.04 (from Ubiquity) + MAVROS.
I have a Pixhawk 4 controller with the ArduCopter 3.7.0-dev stack.
The camera is a Raspberry Pi Camera v2.
In rviz I can see the realtime position from the marker using /mavros/vision_pose/pose.
But I realized that I don’t get any data from /mavros/local_position/pose.

The parameters I use are the same as in the article:

AHRS_EKF_TYPE 2
BRD_RTC_TYPES 1
EKF2_ENABLE 1
EKF3_ENABLE 0
EK2_GPS_TYPE 3
EK2_POSNE_M_NSE 0.1
EK2_VELD_M_NSE 0.1
EK2_VELNE_M_NSE 0.1
EK2_EXTNAV_DELAY 80
GPS_TYPE 0
COMPASS_USE 0
COMPASS_USE2 0
COMPASS_USE3 0
SERIAL1_BAUD 921 (the serial port used to connect to Raspberry Pi)
SERIAL1_PROTOCOL 2
SYSID_MYGCS 1 (to accept control from mavros)
VISO_TYPE 0

Does anyone have any idea what can cause this problem?


(Kilo November Delta) #28

Hello @ptroll, is it possible to have the logs of this flight?


(Andrea Belloni) #29

Hi @ptroll I would like to see the output of rostopic list after the launch of mavros.


(ptroll) #30

Hi Anbello!

I attached the output of rostopic list after the launch of mavros apm.launch:

rostopic list.txt (2.8 KB)


(Andrea Belloni) #31

@ptroll sorry, I would also like to see the output in the terminal in which you launch mavros.


(ptroll) #32

No problem, here you have it:
mavros launch output.txt (12.6 KB)


(Andrea Belloni) #33

I don’t see anything wrong …

At this point, after launching mavros, if you give the following command:
rosrun aruco_gridboard set_origin.py
you should see messages on the /mavros/local_position/pose topic.

If this doesn’t happen, try to reboot the FC via the GCS (I use MAVProxy connected to gcs_url) and, after seeing the following messages (on the GCS or in the terminal tab where mavros is running):
[ERROR] [1455211795.669665982]: FCU: GPS Glitch
[ERROR] [1455211795.760278673]: FCU: GPS Glitch cleared
give the command again:
rosrun aruco_gridboard set_origin.py
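
For context, the point of set_origin.py is that without GPS the EKF has no origin, so one must be sent to it. Roughly, the script packs a SET_GPS_GLOBAL_ORIGIN MAVLink message and publishes it on the /mavros/mavlink topic; a condensed sketch (placeholder coordinates, details may differ from the actual script):

#!/usr/bin/env python
# Condensed sketch of set_origin.py: send SET_GPS_GLOBAL_ORIGIN through
# the /mavros/mavlink topic so the EKF gets an origin without GPS.
# The coordinates below are placeholders.
import rospy
from pymavlink.dialects.v10 import ardupilotmega as MAV_APM
from mavros.mavlink import convert_to_rosmsg
from mavros_msgs.msg import Mavlink

class fifo(object):
    # Byte buffer so the MAVLink object has somewhere to "write"
    def __init__(self):
        self.buf = []
    def write(self, data):
        self.buf += data
        return len(data)
    def read(self):
        return self.buf.pop(0)

rospy.init_node("set_origin_sketch")
pub = rospy.Publisher("/mavros/mavlink", Mavlink, queue_size=1)
mav = MAV_APM.MAVLink(fifo(), srcSystem=1, srcComponent=1)
rospy.sleep(1.0)  # let the publisher connect

# target_system, lat (deg * 1e7), lon (deg * 1e7), alt (mm)
msg = MAV_APM.MAVLink_set_gps_global_origin_message(1, 450000000, 90000000, 0)
msg.pack(mav)
pub.publish(convert_to_rosmsg(msg))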


(ptroll) #34

Dear Anbello,

I tried what you suggested, but I don’t receive any messages from the /mavros/local_position/pose topic. (Meanwhile /mavros/vision_pose/pose works fine.)

After I launched aruco_gridboard detection_rpicam.launch, mavros apm.launch, and rosrun aruco_gridboard set_origin.py (just after GPS Glitch cleared),
I gave the command: rosrun aruco_gridboard mavros_control1.py
And then I saw the following graph in rqt_graph:

It seems the /mavros/local_position/pose topic doesn’t get any input from /mavros.
I can’t find the reason for this.
Does anyone have any tips on where the error might be?
What should I modify to get messages from the local_position topic? Maybe something in apm_config.yaml or apm_pluginlists.yaml?


(ppoirier) #35

The local position is internal to the EKF on ArduPilot; there is no external position estimator, so mavros is basically just used to transform ENU to NED references. You can still use the tools for MAV control.
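
To illustrate the frame conversion: ROS uses ENU (x east, y north, z up) while ArduPilot works in NED (x north, y east, z down), so for a position vector the mapping mavros applies is just an axis swap and a sign flip (a toy illustration; rotations involve the corresponding quaternion transform):

def enu_to_ned(x_east, y_north, z_up):
    # NED: x = north, y = east, z = down
    return (y_north, x_east, -z_up)

print(enu_to_ned(1.0, 2.0, 3.0))  # -> (2.0, 1.0, -3.0)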


(Kilo November Delta) #36

Just an idea, but if you are using the Copter-3.7.0-dev branch, maybe you should use a “released” branch like Copter-3.6.7. That one has the external navigation feature, and you are sure not to suffer from any bugs appearing during development.
Also, it can happen that some topics (maybe /mavros/local_position/pose is one of them) need a stream rate to be requested before they publish. Not sure, but I had a bug like this once. I use the service call rosservice call /mavros/set_stream_rate 0 20 1 (with 20 being the rate in Hz) for this.
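
The same request can also be made from a node with the mavros StreamRate service; a small sketch equivalent to the command above:

#!/usr/bin/env python
# Sketch: request all MAVLink streams (stream_id 0) at 20 Hz,
# equivalent to `rosservice call /mavros/set_stream_rate 0 20 1`.
import rospy
from mavros_msgs.srv import StreamRate

rospy.init_node("stream_rate_setter")
rospy.wait_for_service("/mavros/set_stream_rate")
set_rate = rospy.ServiceProxy("/mavros/set_stream_rate", StreamRate)
set_rate(stream_id=0, message_rate=20, on_off=True)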