So how do you interface the RPI3 with the drone?
You can flash APSync onto the Raspberry Pi by following the instructions here: http://ardupilot.org/dev/docs/apsync-intro.html
From there, you need to wire the Pixhawk to the RPi's GPIO pins to download the parameters from the Pixhawk: http://ardupilot.org/dev/docs/raspberry-pi-via-mavlink.html#raspberry-pi-via-mavlink
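Once the wiring is done, a quick way to confirm the link is to run MAVProxy on the Pi, roughly along these lines. The serial device and baud rate below are assumptions taken from the linked docs; they must match your Pi model and the Pixhawk's `SERIAL2_BAUD` parameter.

```shell
# Install MAVProxy on the Pi (assumes a Debian/Raspbian-style image).
sudo apt-get install -y python3-pip
pip3 install mavproxy

# /dev/serial0 is the Pi's UART on recent Raspbian images; 57600 is an
# example baud rate -- set it to whatever SERIAL2_BAUD is on the Pixhawk.
mavproxy.py --master=/dev/serial0 --baudrate 57600
```

If the connection is working, MAVProxy should report a heartbeat and `param show ALL` at its prompt will download the parameters.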
The Sololink does use GStreamer, but if I understood what I read correctly, the version of Yocto Linux running on the Solo's companion computer is so outdated that GStreamer 1.x with hardware acceleration is not possible. I'm not sure whether your APStreamline would still net any advantage given that constraint, but I am curious/hopeful that it could still improve things for us Solo users.
Will this also work on an Intel Edison?
While it might be theoretically possible, the Edison probably doesn't get the recent version of GStreamer needed to run this.
I totally love this. I have one question: I have recently been playing around with EZ-Wifibroadcast and I must say it's just great. Fast connection, almost no latency, very robust. How does this project compare to that?
The wifibroadcast project is very interesting but it serves a more specific use case than APStreamline.
APStreamline can work with any WiFi card as it doesn't depend on packet injection. I had tried using the TP-Link 722N, but it seems TP-Link changed the WiFi chip in the v2/v3 versions, making it incompatible with wifibroadcast. APStreamline also automatically detects the type of camera and creates a streaming pipeline accordingly, so it works with both hardware- and software-encoding cameras. Since it's included with APSync, the setup is minimal. The video feed can be streamed to any device with an RTSP player such as VLC and can also be viewed directly from the GCS.
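As a sketch of what viewing the feed looks like on the ground station side, the commands below assume APSync's usual companion computer address and a hypothetical RTSP mount point (`/cam`); check the APWeb page for the actual stream URL on your setup.

```shell
# Play the stream in VLC (address and mount point are assumptions):
vlc rtsp://10.0.1.128:8554/cam

# Or with a GStreamer pipeline; a larger latency (jitterbuffer, in ms)
# trades delay for smoother playback over a lossy link.
gst-launch-1.0 rtspsrc location=rtsp://10.0.1.128:8554/cam latency=100 \
    ! rtph264depay ! avdec_h264 ! autovideosink
```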
However, this flexibility comes at the price of less robust connections, as we are completely dependent on the limited range of the WiFi hardware. Using a better WiFi router on the aircraft can improve connection quality dramatically.
I totally love the effort you have put into the project. I did not have good results with classical streaming in the past (but that might just be me).
This is seriously cool, and your work is in line with what I have come to expect from GSoC projects, thanks for your efforts
How much Pi is required? Can I get by with a Zero or ZeroW? Thanks!
Thanks! You should be able to get this working on a Pi Zero (W) if it has a packaged version of the GStreamer libraries at v1.10 or later. I wouldn't use anything other than the RPi camera with it, as the CPU is quite a bottleneck on the Pi Zero. Personally, I haven't tested it on ARMv6 though.
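To check whether your image already meets the version requirement before building, something like this should do (assuming the GStreamer tools are installed from the distro packages):

```shell
# Print the packaged GStreamer version; APStreamline needs 1.10+.
gst-inspect-1.0 --version

# Also confirm the Pi camera element is available for hardware encoding:
gst-inspect-1.0 rpicamsrc
```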
Great, thanks! I'll report back if I find anything useful.
Great job! Is this implementable on other companion computers like the i.MX6? Could it be run on Ubuntu?
Yes, it can run on any device with Ubuntu provided it has the GStreamer libraries installed.
FYI, I have installed it on the Pi Zero W but constantly get segmentation faults after a few seconds of streaming.
What are the chances that this version of APWeb and the APStreamline can be loaded to the SkyViper’s Sonix? #arducopter:skyviper
I’ve read somewhere that it has a bit of space left…
I'm interested in video streaming range, nothing super long, say around 1 km. What seems to be the limiting factor regarding range on APStreamline vs. EZ-Wifibroadcast? The range achieved by EZ-Wifibroadcast using fairly standard WiFi dongles seems very large in some cases.
I like the idea of APStreamline as (if I'm correct) I can just plug a more powerful WiFi dongle/card into a computer running Mission Planner on the RX side, instead of needing another Pi and battery as the RX like with EZ-Wifibroadcast.
Adding a bit of latency isn’t a huge issue for me (I think).
@Saijin_Naib I recently purchased a Wyze home monitoring Wireless camera (1080p), and it has a feature that I did not expect (for a $25 wifi camera).
I can log into it remotely (WiFi or cellular) and view and record live video. As is normal, network lag can make the live video broken and stuttery. However, even when recording to your remote device (not the internal SD card), when you play the video back it is PERFECTLY in sync. Absolutely NO LAG whatsoever…including sound.
What I discovered after much observation is that somehow the camera and the remote device (in this case the phone) recognize the lag and then resend the "missed" frames (somehow). When it assembles the final video, it puts all the data back into the encoded video file, fully assembled and in the right order. This results in lag-free video during playback, even though the live video was not.
I have not tested the limits of the apparent “frame caching” (as in how long it could lag without missing a frame), but it is an incredibly effective method of getting good video…especially for such a cheap device.
This is slightly off-topic from APStreamline, which aims for lag-free LIVE video, but I think it is an idea worth exploring for lag-free recorded video on a remote device (especially for products like the #arducopter:skyviper), which would let you record full-resolution, lag-free video to the phone without requiring an SD card. I haven't analyzed any network packets, but I imagine it is still using UDP (so the throughput is high enough) while communicating some frame metrics back to the camera over TCP, letting it know which frames it didn't get.
Thanks for trying it out! Could you share some more details about your setup, including the GStreamer version, the version of Ubuntu, and the camera you're using, so I can reproduce this? I haven't tested APStreamline on ARMv6 yet.
The problem with EZ-Wifibroadcast is that most of the WiFi dongles it supports are no longer available. TP-Link has unfortunately changed the WiFi chipsets in its dongles and routers to more proprietary MediaTek/Realtek chips which don't have the right drivers for injection mode.
That is correct; you can boost the range by upgrading the WiFi router. A cantenna might be a cheap and interesting option for boosting the range (especially when paired with ArduPilot's AntennaTracker). Latency is configurable in APStreamline; the trade-off is that jitter increases as latency is reduced.
That's pretty interesting. As we use UDP there are limitations on how we recover dropped packets, but GstRTSPServer does offer an interesting option to set a retransmission time using gst_rtsp_media_factory_set_retransmission_time along with gst_rtsp_media_factory_set_profiles (factory, GST_RTSP_PROFILE_AVPF). I assume this RTX time will have to be smaller than the size of the jitterbuffer in milliseconds. I haven't tested this out yet though.
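For anyone curious, a minimal sketch of wiring those two calls into a GstRTSPServer might look like the following. The test pipeline, mount point, and the 200 ms retransmission window are illustrative assumptions, and as noted above the RTX time presumably has to stay below the client's jitterbuffer; this is untested against real packet loss.

```c
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GMainLoop *loop = g_main_loop_new (NULL, FALSE);
  GstRTSPServer *server = gst_rtsp_server_new ();
  GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new ();

  /* A stand-in test pipeline; APStreamline builds its own per camera. */
  gst_rtsp_media_factory_set_launch (factory,
      "( videotestsrc ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )");

  /* AVPF enables the RTCP feedback (NACKs) that retransmission needs... */
  gst_rtsp_media_factory_set_profiles (factory, GST_RTSP_PROFILE_AVPF);

  /* ...and this keeps 200 ms of sent packets buffered so the server can
   * resend the ones a client reports missing. */
  gst_rtsp_media_factory_set_retransmission_time (factory, 200 * GST_MSECOND);

  GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points (server);
  gst_rtsp_mount_points_add_factory (mounts, "/test", factory);
  g_object_unref (mounts);

  gst_rtsp_server_attach (server, NULL);
  g_main_loop_run (loop);
  return 0;
}
```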
APStreamline favours flexibility over feature support. While recording video locally would be a great benefit, it won't be possible on devices like smartphones and tablets without creating a dedicated app. For Linux PCs, I can write a script to save the video to a file on the fly while displaying it on screen. Thanks for the suggestions!