FYI, I have installed it on the Pi Zero W but constantly get segmentation faults after a few seconds of streaming.
What are the chances that this version of APWeb and APStreamline can be loaded onto the SkyViper’s Sonix board? #arducopter:skyviper
I’ve read somewhere that it has a bit of space left…
I’m interested in video streaming range, nothing super long, say around 1 km. What seems to be the limiting factor regarding range on APStreamline vs., say, EZ-WifiBroadcast? The range achieved by EZ-WifiBroadcast using fairly standard WiFi dongles seems very large in some cases.
I like the idea of APStreamline because (if I’m correct) I can just plug a more powerful WiFi dongle/card into the computer running Mission Planner on the RX side, instead of needing another Pi and battery for the RX as with EZ-WifiBroadcast.
Adding a bit of latency isn’t a huge issue for me (I think).
@Saijin_Naib I recently purchased a Wyze home-monitoring wireless camera (1080p), and it has a feature that I did not expect from a $25 WiFi camera.
I can log into it remotely (over WiFi or cellular) and view and record live video. As you’d expect, network lag can make the live video broken and stuttery. However, even when recording to your remote device (not the internal SD card), the video you play back is PERFECTLY in sync. Absolutely NO LAG whatsoever, including sound.
What I discovered after much observation is that the camera and the remote device (in this case the phone) are somehow recognizing the lag and then re-sending the “missed” frames. When the final video is assembled, all the data is put back into the encoded video file, complete and in the right order. This results in lag-free video during playback, even though the live video was not.
I have not tested the limits of the apparent “frame caching” (as in how long it could lag without missing a frame), but it is an incredibly effective method of getting good video…especially for such a cheap device.
This is slightly off-topic from APStreamline, which aims to create lag-free LIVE video, but I think it is an idea worth exploring for lag-free recorded video on a remote device (especially for products like the #arducopter:skyviper), as it would let you record full-resolution, lag-free video to the phone without requiring an SD card. I haven’t analyzed any of the network packets, but I imagine it is still using UDP (so the throughput is high enough) while communicating some frame metrics back to the camera over TCP to let it know which frames were not received.
Thanks for trying it out. Could you share some more details about your setup including the GStreamer version, version of Ubuntu, and the camera you’re using so I can reproduce this? I haven’t tested APStreamline on ARMv6 yet.
The problem with Ez-wifibroadcast is that most of the WiFi dongles it used are no longer available. TPLink has unfortunately upgraded the WiFi chipsets in its dongles and routers to more proprietary Mediatek/Realtek chips which don’t have the right drivers for injection mode.
That is correct, you can boost the range by upgrading the WiFi router. A Cantenna might be a cheap and interesting option to boost the range (especially after pairing it with Ardupilot’s AntennaTracker). Latency is configurable in APStreamline and the trade-off is that the jitter will increase as the latency is reduced.
That’s pretty interesting. As we use UDP there are limitations on how we can recover dropped packets, but GstRTSPServer does offer an interesting option: setting a retransmission time with gst_rtsp_media_factory_set_retransmission_time after enabling the AVPF profile with gst_rtsp_media_factory_set_profiles (factory, GST_RTSP_PROFILE_AVPF). I assume this RTX time will have to be smaller than the jitterbuffer latency in milliseconds. I haven’t tested this out yet though.
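For reference, a minimal sketch of how those two calls could fit together in a GstRTSPServer program. This is untested; the videotestsrc pipeline, the /test mount point, and the 1000 ms RTX time are placeholder assumptions, not APStreamline’s actual configuration:

```c
/* Sketch: enabling RTP retransmission (RTX) on a GstRTSPMediaFactory. */
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GstRTSPServer *server = gst_rtsp_server_new();
    GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points(server);
    GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new();

    /* Placeholder pipeline; a real setup would use the camera source */
    gst_rtsp_media_factory_set_launch(factory,
        "( videotestsrc ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )");

    /* The AVPF profile is required for the client to send RTCP NACKs */
    gst_rtsp_media_factory_set_profiles(factory, GST_RTSP_PROFILE_AVPF);

    /* Keep up to 1000 ms of sent packets around for retransmission;
     * this presumably needs to stay below the receiver's jitterbuffer
     * latency, as noted above */
    gst_rtsp_media_factory_set_retransmission_time(factory, 1000 * GST_MSECOND);

    gst_rtsp_mount_points_add_factory(mounts, "/test", factory);
    g_object_unref(mounts);

    gst_rtsp_server_attach(server, NULL);
    g_main_loop_run(g_main_loop_new(NULL, FALSE));
    return 0;
}
```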
APStreamline favours flexibility over feature support. While recording video locally would be a great benefit, it won’t be possible on devices like smartphones and tablets without creating a dedicated app. For Linux PCs, I can write a script to save the video to a file on the fly while displaying it on the screen. Thanks for the suggestions!
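Such a script could be little more than a GStreamer tee pipeline. A sketch, assuming an H.264 RTSP stream; the URL, latency, and output filename are placeholders to adjust for your setup:

```shell
#!/bin/sh
# Sketch: play an RTSP stream while simultaneously saving it to disk.
URL="rtsp://10.0.0.1:8554/cam"   # placeholder stream address

# -e sends EOS on Ctrl-C so mp4mux can finalize the file properly.
# tee splits the parsed H.264 into a display branch and a record branch.
gst-launch-1.0 -e rtspsrc location="$URL" latency=200 ! \
    rtph264depay ! h264parse ! tee name=t \
    t. ! queue ! avdec_h264 ! videoconvert ! autovideosink \
    t. ! queue ! mp4mux ! filesink location=recording.mp4
```

Because the record branch remuxes the already-encoded stream instead of re-encoding it, this stays cheap enough to run on the ground station while the display branch does the decoding.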
Hi, I’m using a Pi Zero W with GStreamer 1.0 on the latest version of Raspbian Stretch Lite with a Pi Camera. My setup is a little unusual in that I’m streaming over 4G. I think I found part of the problem: the memory split on the Pi wasn’t allocating enough memory to video, so I’ve fixed that. The other issue is possibly the throughput of the 4G connection.
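For anyone hitting the same symptom: the memory split mentioned above is the gpu_mem setting in /boot/config.txt on Raspbian. A value commonly used for camera work (adjust to your workload):

```ini
# /boot/config.txt
# Raise the GPU's share of RAM so the camera and H.264 encoder have
# enough to work with; the 64 MB default is too low for the Pi Camera.
gpu_mem=128
```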
A question for you: if I use “stream_server interface”, is the default behaviour AUTO resolution? If not, how can I control this via the standalone method, as I don’t want to install APWeb?
Thanks for all your hard work on this.
That is correct, the default setting is AUTO. Unfortunately, using APWeb is the only way to configure the camera resolution as it talks to the stream server over a local socket.
Can someone please help me with installing the APStreamline fork of APWeb onto a working APSync image?
There is already a /home/apsync/GitHub/APWeb directory on my APSync installation, so where do I clone the new APStreamline APWeb?
Do I just clone it on top of the existing folder? Or do I delete the existing folder and then clone the new one?
My apologies if my question sounds like coming from a newbie…
As of now, APStreamline has only been tested on a fresh install of Raspbian on the Raspberry Pi 2/3B+. That said, the version of APWeb used is just a fork of the one used in APSync, so it should be safe to delete APSync’s existing APWeb version and replace it with the one patched for APStreamline.
Eventually, APStreamline will be merged into APSync, so stay tuned for that!
Will there be an image coming out soon for the Raspberry Pi Zero?
I have one lying around that I would like to try.
Hi! I have tested it on the Pi Zero and it appears to work well. You can simply follow the build instructions given here on the latest version of Raspbian for the Pi Zero.
Great project! Has anyone measured the glass-to-glass latency?
Any news regarding the merge of APStreamline with the APSync project ?
Yes! In fact, I built an APSync image with APStreamline just yesterday. It is still untested and should be strictly used as an alpha release in every sense of the word. It will be merged back to master once it has been tested some more. Here is the link if you’re interested in trying it out: http://firmware.ardupilot.org/Companion/apsync/beta/apsync-rpi-20180918172614.img.xz
I’m using it with C920 and it works flawlessly.
Is it possible to output multiple simultaneous streams? For example, if I would like to use QGroundControl’s RTSP video feature and stream to my MotionEye server at the same time?