APSync with APStreamline (BETA)!

Hey, thanks for the update!

  1. I see. Are you using a fresh install of L4T on your Jetson, or are you using the APSync image? If it's the latter, we can try fixing it in the next APSync image built for the Jetson. Alternatively, do you feel it would be better for the path to stream_server to be read by the APWeb server through a config file? I will discuss this with @peterbarker as well.

  2. Does this issue occur every time the stream is started with gst-launch, or only every time after the first? If it's the latter, the bug with L4T 32.1 and RTSP servers might be the cause, as described in the NVidia thread linked in my previous reply.

  3. MissionPlanner and QGC both support RTSP streaming using the URL as well :slight_smile: However, for UDP applications you would have to fall back to the cherrypy server.

Thanks, Arnav, for getting back.

Actually using an image file is not the best path forward if you have a whole bunch of other stuff running on your machine :wink:
In the meantime I adapted APSync to run with JetPack 4.2 (and 4.2.1) on the TX2, including the APStreamline code: companion/Nvidia_JTX2_JP42/Ubuntu at next-tx2 · mtbsteve/companion · GitHub

I temporarily fixed the $HOME path issue in functions.c of your APWeb fork (pull request follows) to allow launching stream_server within APSync. However, the better solution would be to provide a config file declaring the location of stream_server.
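One possible shape for that config-file approach, as a minimal sketch in C: the config path and key name below are assumptions for illustration, not actual APWeb conventions.

```c
/* Sketch: read the stream_server location from a one-line config file
 * instead of hard-coding it under $HOME. The file path and the
 * "stream_server_path=" key are hypothetical, not APWeb conventions. */
#include <stdio.h>
#include <string.h>

/* Returns 0 on success and copies the configured path into 'out'. */
static int read_stream_server_path(const char *conf, char *out, size_t outlen)
{
    FILE *f = fopen(conf, "r");
    if (!f) {
        return -1;
    }
    char line[512];
    int ret = -1;
    while (fgets(line, sizeof(line), f)) {
        /* Expect a line like: stream_server_path=/some/install/dir */
        if (strncmp(line, "stream_server_path=", 19) == 0) {
            line[strcspn(line, "\r\n")] = '\0'; /* strip trailing newline */
            snprintf(out, outlen, "%s", line + 19);
            ret = 0;
            break;
        }
    }
    fclose(f);
    return ret;
}
```

APWeb could fall back to the current $HOME-based default whenever the file is missing, so existing installs keep working.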

In the example above I accidentally used the lo interface. When using the wlan0 interface, streaming via RTSP works :slight_smile: Now to the nvarguscamerasrc problem:
Streaming to QGroundControl using their RTSP implementation works seamlessly. I can start and stop the streaming server w/o problems.
However, if I stream to a client using gst-launch-1.0 playbin uri=<RTSP-MOUNT-POINT> latency=100, I can reproduce the issue that the stream breaks after the first successful launch.

I figured this out for QGC in the meantime, but not yet for MP. Nevertheless, I am also using other GCSs like Solex, so I keep cherrypy as part of my APSync build.

In the meantime I adapted APSync to run with JetPack 4.2 (and 4.2.1) on the TX2, including the APStreamline code: companion/Nvidia_JTX2_JP42/Ubuntu at next-tx2 · mtbsteve/companion · GitHub

Great! Glad to see this working :slight_smile:

I temporarily fixed the $HOME path issue in functions.c of your APWeb fork (pull request follows) to allow launching stream_server within APSync. However, the better solution would be to provide a config file declaring the location of stream_server.

I am looking into the issue with rc.local, and the issue with /home will hopefully be fixed by the next APSync release.

Streaming to QGroundControl using their RTSP implementation works seamlessly. I can start and stop the streaming server w/o problems.

Interesting. I believe QGC uses GStreamer as the backend, so I would expect the problem to persist, especially since the problem is with nvarguscamerasrc on the sender's side. IIRC, the session is still alive when the stream is paused/played, but the issue occurs when the session is closed and then reopened.

Hi guys, I just installed APSync (apsync-rpi-ubuntu-t265-latest.img) on my RPi 3B+ with an Arducam v2.
It shows the parameters of my Pixhawk Cube correctly, but there's no "video streaming" link on the home page. What am I missing? What truly interests me is the video streaming.
Opening the page on the :8000 port gives a "page not found" error.
I also plugged in a Huawei 3372h 4G USB dongle; it's not recognized, and I don't know why.

Hi Steve! APStreamline is not merged into the master branch of APSync, so you will need to follow the instructions in the GitHub repository (https://github.com/shortstheory/APStreamline) to build and install APStreamline yourself on the APSync image.

As for the video streaming controls in the webserver, you will need to install a patched version of APWeb. Instructions for doing so can be found here.

Arducam might not be natively supported by APStreamline, as I haven't had the chance to try it out with the Raspberry Pi yet. If I can get my hands on one, I will try to add support at some point in the future. I haven't ever tried using the 4G module, so I can't help out with that, sorry!

Hi, thanks for the reply! I just had a look at the APStreamline link and I see I'm using v1 while there's a v2 to try out (just put it on my todo list).

I successfully installed the APSync image from the Rpanion website (apsync-Raspian-20191123030859.img.xz), which includes APStreamline and works with the Raspicam v2! I also tried an HDMI-to-CSI-2 board (TC358743 chip) with less success (the image pops up in stream_server, then freezes and goes away).
My main issues with this image are:

  • although video lag is close to zero (QGroundControl), I'm getting some "stuttering" and "flashing" (lost/corrupted frames?) in the RTSP stream when using the Raspicam over WiFi or LTE, though there are no issues when using Ethernet.

  • the Raspicam image ratio seems to be erratic (set to 1.777777), showing heavy aliasing and low resolution; however, from time to time it seems to "adjust" by itself, showing a correct image for a few seconds, then reverts to the distorted one.

  • trying to fix the stuttering issue, I would like to try modifying the GStreamer pipeline by adding rtpjitterbuffer. However, modifying the pipeline is a pain in the ass! It's totally different from the official branch, where you only have to edit a single file. From what I could understand, the pipeline is created dynamically and compiled (at boot, or when the server button is pressed? I don't know) from a native C++ file, which then calls a binary when I press "start RTSP server" on the webpage. I could not find how to modify it; any hint appreciated.

  • HDMI-to-CSI: a whole new world of bugs to discover :slight_smile: flipped image, heavy lag; the stream stays up for a few seconds, then goes to "no image" until I restart it. I will do some more testing with different boards, and eventually an HDMI-to-USB UVC converter as well.

  • when pinging the RasPi I get some 5-6 fast responses (<20 ms), then one at around 3000 ms, then again 5-6 fast ones, etc. SSH also has regular momentary lags that reflect this. I suspect some power-saving function on the WiFi/LTE devices could be causing the momentary lags, although I disabled power saving on wlan etc. and it didn't fix it.

the good:

  • the Pixhawk connection worked at the first attempt. I didn't try it much beyond the bench (no flying, no RC overrides, etc.), so I don't know if there's some stuttering there too.
  • ZeroTier also works; however, when switching the route from Ethernet to, let's say, 4G LTE, it takes ages to reconnect (5-10 minutes). I think it has to "reroute" the whole thing globally; I've no idea how to speed it up.

I want to thank you for APStreamline; I think it's a great project. I would be happy to help you sort out problems, bugs, etc.

Arnav, please help: I cannot compile the "patched version" (git cloned) of APWeb following your instructions. Here's the compile error (it's the final part; the other modules before this compile OK):

gcc -Wall -g -Werror -std=gnu99 -c -o rtsp_ipc.o rtsp_ipc.c
rtsp_ipc.c: In function ‘get_interfaces_list’:
rtsp_ipc.c:133:17: error: passing argument 1 to restrict-qualified parameter aliases with argument 3 [-Werror=restrict]
sprintf(interface_list, "%s\"%s\"", interface_list, ifa->ifa_name);
^~~~~~~
rtsp_ipc.c:135:17: error: passing argument 1 to restrict-qualified parameter aliases with argument 3 [-Werror=restrict]
sprintf(interface_list, "%s,\"%s\"", interface_list, ifa->ifa_name);
^~~~~~~
rtsp_ipc.c:140:5: error: passing argument 1 to restrict-qualified parameter aliases with argument 3 [-Werror=restrict]
sprintf(interface_list, "%s]}", interface_list);
^~~~~~~
cc1: all warnings being treated as errors
make: *** [: rtsp_ipc.o] Error 1

Compiling it on an x86 VM gives no errors and finishes correctly (although I cannot use the binary from it).
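For context, the error comes from passing interface_list as both the destination and a source argument of sprintf, which is undefined behavior; the newer GCC on the Pi rejects it under -Werror=restrict, while the older compiler in the x86 VM simply doesn't warn. The usual fix is to append with a tracked offset instead of reading the buffer back into itself. A minimal sketch, with the function name and JSON layout inferred from the error output above rather than taken from the actual source:

```c
/* Sketch of the self-aliasing fix: instead of
 *   sprintf(buf, "%s...", buf, ...);   // buf is both dest and source
 * keep an offset and append each piece with snprintf. */
#include <stdio.h>
#include <string.h>

/* Builds something like {"interfaces":["eth0","wlan0"]} into 'out'.
 * (The surrounding JSON shape is an assumption for illustration.) */
static void build_interface_list(char *out, size_t outlen,
                                 const char **names, size_t count)
{
    size_t off = (size_t)snprintf(out, outlen, "{\"interfaces\":[");
    for (size_t i = 0; i < count && off < outlen; i++) {
        off += (size_t)snprintf(out + off, outlen - off,
                                i == 0 ? "\"%s\"" : ",\"%s\"", names[i]);
    }
    if (off < outlen) {
        snprintf(out + off, outlen - off, "]}");
    }
}
```

This produces the same output as the aliasing version while staying within defined behavior, so it should compile cleanly with -Werror=restrict on both machines.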

Thanks for the detailed feedback, Steve. The issues seem to stem from the adaptive quality control changing resolutions too frequently. I think you might prefer to use it at a fixed quality.

trying to fix the stuttering issue, I would like to try modifying the GStreamer pipeline by adding rtpjitterbuffer. However, modifying the pipeline is a pain in the ass! It's totally different from the official branch, where you only have to edit a single file. From what I could understand, the pipeline is created dynamically and compiled (at boot, or when the server button is pressed? I don't know) from a native C++ file, which then calls a binary when I press "start RTSP server" on the webpage. I could not find how to modify it; any hint appreciated.

This procedure is much improved in APStreamline v2. All the pipelines are defined in config files (in the config/ directory), and the actual GStreamer parsing and construction is done at runtime. I would suggest removing the v1 version you are currently using and replacing it with v2, as it's much easier to work with and debug.
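To illustrate the general pattern of runtime pipeline construction (this is a sketch of the idea, not APStreamline's actual config format): the server can keep the launch description in a text file with placeholders, fill them in at startup, and hand the resulting string to GStreamer's gst_parse_launch(). Changing the pipeline then means editing text, not recompiling.

```c
/* Sketch: fill a device placeholder in a pipeline template loaded from a
 * config file. The "%device%" placeholder syntax is hypothetical; in a
 * real server the result would go to gst_parse_launch(). */
#include <stdio.h>
#include <string.h>

/* Replaces the first "%device%" in 'templ' with 'device', writing to 'out'. */
static void fill_pipeline(const char *templ, const char *device,
                          char *out, size_t outlen)
{
    const char *p = strstr(templ, "%device%");
    if (!p) {
        snprintf(out, outlen, "%s", templ); /* no placeholder: copy as-is */
        return;
    }
    snprintf(out, outlen, "%.*s%s%s",
             (int)(p - templ), templ, device, p + strlen("%device%"));
}
```

With something like "v4l2src device=%device% ! videoconvert ! x264enc" in the config file, the same template serves every /dev/videoN camera the server enumerates.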

The other issues seem quite hardware-related. Unfortunately I don't have an HDMI-to-CSI converter to reproduce your issue. Does pinging the RasPi show the same problem when connected over Ethernet?

Interesting. Let me try to reproduce this on my Raspberry Pi. I will get back to you with a fix.

Ping over Ethernet: no lags.

Update on flashing/stuttering: I think I fixed the flashing and improved network performance a little by switching from ZeroTier to Tailscale. The stuttering remains, though. I will do some more testing with a wired network, trying to figure out if there's a bottleneck somewhere.

About the pipeline: I ended up learning that rtpjitterbuffer has to be put into the receiver pipeline :slight_smile: not the source one, so there's no need to modify yours, at least as far as I know (i.e., unless there's some option to optimize it; GStreamer is a whole new world for me).

I will try a fixed resolution, but I don't know how to set it up, because I used to do it through APWeb, which is broken right now. Is there a command-line parameter for that?

Actually, when I tried v1 I tried fixed resolutions, and I saw that lowering the resolution/fps made it worse! The lower the fps/resolution, the more it stutters... very strange.

Today I tried an HDMI-to-USB (UVC) device:
(https://www.amazon.com/HDMI-USB-Capture-Grabber-Camcorders-Windows/dp/B07TFGJH1P/) with a Panasonic GH5 camera attached.
APStreamline (v2) can detect the camera, but it crashes somewhere; here's a dump:

APStreamline
Access the following video streams using VLC or gst-launch following the instructions here: https://github.com/shortstheory/adaptive-streaming#usage

/dev/video0 (FHD Capture: FHD Capture): rtsp://100.xxx.xxx.xx:8554/cam3
/dev/video1 (FHD Capture: FHD Capture): rtsp://100.xxx.xxx.xx:8554/cam4
/dev/video10 (bcm2835-codec-decode): rtsp://100.xxx.xxx.xx:8554/cam0
/dev/video11 (bcm2835-codec-encode): rtsp://100.xxx.xxx.xx:8554/cam1
/dev/video12 (bcm2835-codec-isp): rtsp://100.xxx.xxx.xx:8554/cam2

(stream_server:1152): GStreamer-CRITICAL **: 18:45:54.244: gst_element_set_state: assertion ‘GST_IS_ELEMENT (element)’ failed

(stream_server:1152): GStreamer-CRITICAL **: 18:45:54.244: gst_object_unref: assertion ‘object != NULL’ failed

(stream_server:1152): GStreamer-CRITICAL **: 18:45:54.245: gst_element_set_state: assertion ‘GST_IS_ELEMENT (element)’ failed

(stream_server:1152): GStreamer-CRITICAL **: 18:45:54.246: gst_object_unref: assertion ‘object != NULL’ failed
Stream disconnected!

(stream_server:1152): GStreamer-CRITICAL **: 18:45:54.685: gst_element_set_state: assertion ‘GST_IS_ELEMENT (element)’ failed

(stream_server:1152): GStreamer-CRITICAL **: 18:45:54.685: gst_object_unref: assertion ‘object != NULL’ failed

(stream_server:1152): GStreamer-CRITICAL **: 18:45:54.685: gst_element_set_state: assertion ‘GST_IS_ELEMENT (element)’ failed

(stream_server:1152): GStreamer-CRITICAL **: 18:45:54.685: gst_object_unref: assertion ‘object != NULL’ failed
Stream disconnected!

(stream_server:1152): GStreamer-CRITICAL **: 18:45:55.149: gst_element_set_state: assertion ‘GST_IS_ELEMENT (element)’ failed

(stream_server:1152): GStreamer-CRITICAL **: 18:45:55.149: gst_object_unref: assertion ‘object != NULL’ failed

(stream_server:1152): GStreamer-CRITICAL **: 18:45:55.149: gst_element_set_state: assertion ‘GST_IS_ELEMENT (element)’ failed

(stream_server:1152): GStreamer-CRITICAL **: 18:45:55.149: gst_object_unref: assertion ‘object != NULL’ failed
Stream disconnected!

I see. That is because it doesn’t support this camera yet. Do the instructions for adding a new camera help?

Oh, I missed those instructions! I found them on GitHub; however, I think they are above my programming skills, in particular step 3. Anyway, I will give it a try.