APSync with APStreamline (BETA)!

We now have a new version of APSync with APStreamline built in for the Raspberry Pi! This version of APSync is built with the latest version of Raspbian (June 2018) and is now compatible with the Raspberry Pi 3B+ out of the box.

APStreamline adds several improvements for live video streaming from UAVs and other ArduPilot robots. It is directly available under the ‘Video Streaming’ tab in the APWeb config page at 10.0.1.128:80 on the Companion Computer. APStreamline has the following features:

  • Automatic quality selection based on bandwidth and packet loss estimates
  • Selection of network interfaces to stream the video
  • Options to record the live-streamed video feed to the companion computer
  • Manual control over resolution and framerates
  • Multiple camera support using RTSP
  • Hardware-accelerated H.264 encoding for the Raspberry Pi
  • Camera settings configurable through the APWeb GUI
  • Compatible with the Pi camera and several USB cameras, such as the Logitech C920

APStreamline doesn’t have any strict hardware requirements and can work with WiFi routers, WiFi dongles, and even over LTE with the right hardware.
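As a rough illustration (using the default companion computer address above; the exact mount points for your cameras are printed by the stream server), each detected camera is exposed at its own RTSP mount point, which any RTSP client such as QGroundControl, VLC, or gst-launch-1.0 can open:

rtsp://10.0.1.128:8554/cam0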

Download APSync with APStreamline built in from here (Raspberry Pi Only)!

Please note that being a beta release, there may be bugs with this image. For any bug reports and/or suggestions, please comment on my pull request here.


I got it up and running… but I get a 1 or 2 second lag all the time.

Setup 1:

  • Raspi connected through eth0 to local LAN
  • Cell + MPlayer @ home wifi
  • Computer + VLC @ home eth0
  • Any resolution and framerate
  • Can’t get Mission Planner to display the video

Setup 2:

  • Raspi as a hotspot
  • Cell + MPlayer @ ardupilot wifi
  • Computer + VLC @ ardupilot wifi
  • Any resolution and framerate
  • Can’t get Mission Planner to display the video

It’s a shame I can’t use the “record” feature as I have a Pi Camera… any chance of getting that working? I found this post when I was looking for a way to use “tee” and “socat” to sort out precisely this point…

Thanks for trying it out. The 1 or 2 seconds of lag is most likely due to VLC, which uses a 2-second RTSP jitter buffer. Using a gst-launch command like gst-launch-1.0 playbin uri=<RTSP-MOUNT-POINT> latency=100 for viewing the video can significantly reduce latency.
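For example, using the first camera’s mount point (substitute your companion computer’s address and the mount point printed by the stream server for your setup):

gst-launch-1.0 playbin uri=rtsp://10.0.1.128:8554/cam0 latency=100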

Did you change the RTSP video settings for streaming the video from MP?

I’m afraid that recording video with the Pi Camera was very buggy and it kept crashing the pipeline when I tried it out. You can try recording video using the C920HD.

Hi. Yes, I know about the VLC lag… that’s why I also tried with my phone (although I can’t tell you how much lag MPlayer may add). I also changed the streaming setting in MP, to no avail.

Sorry, but I won’t be able to test it again in the next few days. I tried this one here looking for a way to send and record video, but it won’t help with the Pi camera that I have, and I had to roll back to the previous build (yesss, I need to get myself a few spare SD cards :persevere:). I’ll post back when I have a chance.

Great work @shortstheory

The image works well with QGroundControl on Android and on Win10. Only Mission Planner on Win10 does not show the video stream, although MAVLink via UDP works fine in MP.


I think the Raspbian you built it on has a problem. It always hangs on reboot. It’s easy to test: just connect remotely and sudo reboot, and it hangs.

regards,

Corrado

Hi, I tried this out yesterday on an RPi3 and didn’t have this problem when rebooting. Could you share more details about your setup?

Hi, nothing special: I copied the image and booted the Raspberry to configure my settings, and noticed it hangs on reboot. It has done this since the beginning. All I have done so far is disable Bluetooth to have the serial port available (which in my opinion should be the default in the image) and configure dhcpcd.conf to set my static IP. It works fine but panics as soon as I sudo reboot it.

I’m getting the same hang on reboot as well… on a Raspberry Pi 3B+

@shortstheory - Can you post instructions on how you built the image?

Also, does anyone have the previous image that was running fine on Jessie? Looks like it has been removed from the download site…

I built the image by running the scripts in the order given in this repo: https://github.com/ArduPilot/companion/tree/master/RPI2/Raspbian

If that doesn’t work properly, you can try installing APStreamline and APWeb on a fresh install of Raspbian Lite. Let me know how it goes!
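Roughly, the manual install of APStreamline looks like this (a sketch only; the apt package names are my best guess for Raspbian Stretch, and the README in the repo is the authoritative reference). APWeb is installed separately from the companion repo linked above.

# meson/ninja can also be installed via pip3 if the apt versions are too old
sudo apt-get install git meson ninja-build libgstreamer1.0-dev \
    libgstreamer-plugins-base1.0-dev libgstrtspserver-1.0-dev
git clone https://github.com/shortstheory/adaptive-streaming.git
cd adaptive-streaming
meson build
cd build
meson configure -Dprefix=$HOME/start_apstreamline/
ninja install   # installs stream_server where APWeb expects it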

Hi Arnav and thank you for the reply.

I’ve tried to install it on a new Raspbian Stretch Lite by following the instructions in the master repo and adding in the changes from your pull request (APStreamline in companion by shortstheory · Pull Request #44 · ArduPilot/companion · GitHub).

I can run the RTSP stream_server manually (sudo ./stream_server eth0) and it recognizes both of the cameras connected to the Pi (the RPi Camera and a Logitech C920).

Below are the streams it generates:

/dev/video0 (mmal service 16.1): rtsp://192.168.100.252:8554/cam0
/dev/video1 (HD Pro Webcam C920): rtsp://192.168.100.252:8554/cam4
/dev/video10 (bcm2835-codec): rtsp://192.168.100.252:8554/cam1
/dev/video11 (bcm2835-codec): rtsp://192.168.100.252:8554/cam2
/dev/video12 (bcm2835-codec): rtsp://192.168.100.252:8554/cam3
/dev/video2 (HD Pro Webcam C920): rtsp://192.168.100.252:8554/cam5

I can also receive the [cam0] and [cam4] streams generated by the server with gst-launch-1.0.

But I cannot start the RTSP Server from within APweb. It looks like APweb doesn’t see the stream-server.
I do get a list of interfaces in the drop-down menu: lo, eth0 and wlan0 twice (which is weird), but when I click the button to start the server, nothing happens.

The good thing is that the Pi doesn’t hang on reboot anymore…!!!

Any ideas on how we might fix the APWeb Video Streaming page issue?

Good to know that the reboot problem is fixed :slight_smile:

But I cannot start the RTSP Server from within APweb. It looks like APweb doesn’t see the stream-server.

This is because APWeb looks for the stream_server executable at $HOME/start_apstreamline/bin/stream_server. This was done to match the conventions for executables in the APSync image.

To fix this, run the following in the folder where you cloned the APStreamline repo, as mentioned in the build instructions you linked:

meson build
cd build
meson configure -Dprefix=$HOME/start_apstreamline/
ninja install # installs to ~/start_apstreamline for APWeb to spawn the process
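You can quickly confirm that the binary ended up where APWeb expects it with:

ls -l $HOME/start_apstreamline/bin/stream_server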

Do let me know if there are any problems after doing so!

Hi again,

At the moment the stream_server executable is installed in /home/apsync/start_apstreamline/bin/

I’ve just run it again:

meson build
cd build
meson configure -Dprefix=$HOME/start_apstreamline/
ninja install # installs to ~/start_apstreamline for APWeb to spawn the process

and it is installing it correctly in the /home/apsync/start_apstreamline/bin/ folder.

Below is the output:

apsync@apsync:~/GitHub/APStreamline/build $ sudo meson configure -Dprefix=$HOME/start_apstreamline/
apsync@apsync:~/GitHub/APStreamline/build $ sudo ninja install
[0/1] Regenerating build files.
The Meson build system
Version: 0.51.0
Source dir: /home/apsync/GitHub/APStreamline
Build dir: /home/apsync/GitHub/APStreamline/build
Build type: native build
Project name: adaptive-streaming
Project version: undefined
C compiler for the build machine: cc (gcc 6.3.0 “cc (Raspbian 6.3.0-18+rpi1+deb9u1) 6.3.0 20170516”)
C++ compiler for the build machine: c++ (gcc 6.3.0 “c++ (Raspbian 6.3.0-18+rpi1+deb9u1) 6.3.0 20170516”)
C compiler for the host machine: cc (gcc 6.3.0 “cc (Raspbian 6.3.0-18+rpi1+deb9u1) 6.3.0 20170516”)
C++ compiler for the host machine: c++ (gcc 6.3.0 “c++ (Raspbian 6.3.0-18+rpi1+deb9u1) 6.3.0 20170516”)
Build machine cpu family: arm
Build machine cpu: armv7l
Dependency gstreamer-1.0 found: YES (cached)
Dependency gstreamer-rtp-1.0 found: YES (cached)
Dependency gstreamer-rtsp-1.0 found: YES (cached)
Dependency gstreamer-rtsp-server-1.0 found: YES (cached)
Dependency threads found: YES (cached)
Build targets in project: 1
Found ninja-1.7.2 at /usr/bin/ninja
[0/1] Installing files.
Installing stream_server to /home/apsync/start_apstreamline/bin
apsync@apsync:~/GitHub/APStreamline/build $

But APWeb is still not finding the server, and the [Start the RTSP server] button does nothing…

Should I try moving the stream_server to the /home/apsync/start_apstreamline folder ?

The steps look fine and the installation location is correct. There might be some path issue in the version of APWeb which is supposed to spawn APStreamline. Furthermore, the local socket APWeb uses to talk to APStreamline lives in /tmp, so ensure that the permissions on /tmp are correct.
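For example, after clicking the start button you can check whether the socket was created and whether /tmp has its usual permissions with something like:

ls -l /tmp/    # the stream server's socket should appear here
ls -ld /tmp    # should normally show drwxrwxrwt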

By the way, sudo is not required for “ninja install”; plain “ninja install” should suffice.

arnav dhamija

Arnav once again, thank you for the prompt reply…

OK… I think I’ve narrowed it down to the APWeb installation, as the socket (rtsp_server.sock) is not coming up in the /tmp folder at all when the server is started from the /apsync/video.html page.

Just to clarify:
I’m installing APweb from your repo: GitHub - shortstheory/companion at pr_branch
by executing:

pushd GitHub/companion/RPI2/Raspbian
time sudo ./install_apweb 

where the APWeb/functions.c file (at video_streaming · shortstheory/APWeb · GitHub, lines 810-813) defines the path of the stream_server as:

char* home_dir;
char stream_server_path[256];
home_dir = getenv("HOME");
sprintf(stream_server_path, "%s/start_apstreamline/bin/stream_server", home_dir);
printf("%s", stream_server_path);
if (execl(stream_server_path, "stream_server", argv[0], NULL)==-1) {
printf("Error in launching the stream server\n");

But after rebooting the RPi, the /home/apsync/start-apweb/screenlog.0 log file reports:

+ ./web_server -p 80 -f 14755
Unable to connect to the stream server
(null)/start_apstreamline/bin/stream_server Error in launching the stream server
Unable to connect to the stream server

Correct me if I’m wrong, but this (null) entry in the log means that home_dir = getenv(“HOME”) is not getting a value for some reason - and that is why the [Start the RTSP server] button on the /apsync/video.html page does nothing.
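As a quick way to test that theory (just a sketch; the path to web_server below is a guess from my log above, so adjust it to wherever it lives on your image), web_server could be relaunched by hand with HOME set explicitly:

# stop the copy of web_server started at boot first
sudo pkill web_server
# relaunch it with HOME set explicitly for the apsync user
cd /home/apsync/start_apweb
sudo env HOME=/home/apsync ./web_server -p 80 -f 14755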

Any other suggestions to resolve the issue?

PS 1. You are right, “sudo ninja install” or just “ninja install” makes no difference…

PS 2. CherryPy on <apsync_IP>:8000 does not work in your apsync-rpi-20180918172614.img.xz image.
Chrome (in my case) reports:

This page isn’t working
<apsync_IP> is currently unable to handle this request.
HTTP ERROR 500

Is this normal?
I can see that you are pinning the pip install to [cherrypy==10.2.1 jinja2].
Would {pip install cherrypy==17.4.0 jinja2 more-itertools==5.0.0} cause a problem for APStreamline?
Or can APStreamline and CherryPy not work concurrently?

Thank you for your thorough investigation of the issue!

Correct me if I’m wrong, but this (null) entry in the log means that home_dir = getenv(“HOME”) is not getting a value for some reason - and that is why the [Start the RTSP server] button on the /apsync/video.html page does nothing.

Yes, this does seem to be the cause of the problem. I am not sure why this is occurring on your RPi setup, as getenv works fine on my laptop. I shall have to try this out on the Pi to understand what’s happening.

PS 2. CherryPy on <apsync_IP>:8000 does not work in your apsync-rpi-20180918172614.img.xz image.

I do not remember if CherryPy was disabled in that build of APSync, but APStreamline is intended to replace CherryPy with far more flexible streaming options. IIRC, CherryPy was left in to provide UDP streaming for those who still needed it for legacy reasons. Perhaps @peterbarker can chime in to clarify?

As an aside, APSync has not been updated in a while. APStreamline does not depend on CherryPy, so I see no issues with updating it. However, the UDP streaming offered by CherryPy cannot work while APStreamline is streaming over RTSP, as only one process can use the camera at a time.
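If you are ever unsure which process currently holds the camera, something like the following will show it (assuming the camera is /dev/video0):

sudo fuser -v /dev/video0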

@shortstheory this is very cool stuff - thank you!
Question: anything I should be aware of when installing APStreamline on a Jetson TX2? I read in the readme that the TX2 is now supported, but I can’t find any specific instructions.
I assume that I should remove my current CherryPy installation - correct?
Thanks for your support!

Yes, I added support for the TX2 a couple of months ago! However, the APSync image linked here is about a year old and doesn’t support the Jetson. You would have to build it on your Jetson by following the instructions in the README here: https://github.com/shortstheory/adaptive-streaming.

There is one caveat though: this is only for L4T 32.1 with GStreamer 1.14 and above, which includes the nvarguscamerasrc GStreamer element. Unfortunately, there is a bug in nvarguscamerasrc which causes some issues with APStreamline’s RTSP server. The NVIDIA team has told me that fixing this requires some significant work in L4T and that they would patch it in a future release (https://devtalk.nvidia.com/default/topic/1051362/jetson-tx2/bug-in-nvarguscamerasrc-with-gstrtspserver).

Ideally, CherryPy can be removed, unless you require UDP streaming for some application.
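If CherryPy was installed with pip (as in the APSync setup scripts), something like the following removes the package; you may also want to disable whatever script autostarts the CherryPy streamer:

sudo pip uninstall -y cherrypy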

Thanks Arnav. I will give it a try with my current L4T 32.1 setup. As I understand it, this defect is only relevant for the onboard camera that comes with the TX2 dev kit, which I do not plan to use anyway.
In the meantime, L4T 32.2 has been released, but reading through the release notes I could not find any evidence that this defect was fixed.
Will keep you posted on my results :slight_smile:

@shortstheory Hi Arnav, I tried APStreamline on my TX2 but ran into a number of issues:

  1. The Start RTSP Server button doesn’t work due to the missing $HOME variable when web_server is launched from /etc/rc.local (the same problem as reported by @7bpm above). The problem is that rc.local is executed as root, so $HOME is not set to the apsync user’s home directory. A simple solution would be to read the path to the stream_server from a config file instead of hard-coding it relative to $HOME.

  2. I am getting an error when I try to display the stream with gst-launch. See screenshot:

The server seems to be up and running correctly; this is what I see in the log:

Another question: how can I display the stream in Mission Planner/QGC or other ground control stations that rely on a UDP connection?
