Hi everybody, I have seen that since the latest version of Mission Planner (1.3.50) it is possible to stream video directly to the HUD. Previously I was streaming to GStreamerHUDApp using this pipeline on the Raspberry:
raspivid -t 999999 -h 720 -w 1080 -fps 25 -b 500000 -o - | nohup gst-launch-1.0 -e -vvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=220.127.116.11 port=9000
and this one on the receiving side in GStreamerHUDApp:
udpsrc port=9000 buffer-size=60000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! queue ! avdec_h264
Now I have tried to use the same pipeline, but it is not working. I have noticed that above the box where you enter the pipeline in Mission Planner it says
“Ensure the final type is jpeg data (avenc_pjpeg)”, but unfortunately, since I overwrote it with my own pipeline, I can’t recover the original pipeline to use as an example.
Can you tell me what the original pipeline was? And maybe also give an example of transmitting and receiving pipelines?
A year ago I successfully sent an X-Plane stream to the HUD in MP (from one PC running X-Plane to another running MP, via TCP). I can’t remember the exact pipeline, but according to the notes in the MP source code, these pipelines should work (one for the source, one for MP):
gst-launch-1.0.exe videotestsrc pattern=ball ! video/x-raw,width=640,height=480 ! clockoverlay ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5600
gst-launch-1.0.exe -v udpsrc port=5600 buffer-size=60000 ! application/x-rtp ! rtph264depay ! avdec_h264 ! queue leaky=2 ! avenc_mjpeg ! tcpserversink host=127.0.0.1 port=1235 sync=false
gst-launch-1.0.exe -v videotestsrc ! video/x-raw,format=BGRA,framerate=25/1 ! videoconvert ! autovideosink
gst-launch-1.0 videotestsrc pattern=ball ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5600
gst-launch-1.0 udpsrc port=5600 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' ! rtph264depay ! avdec_h264 ! autovideosink fps-update-interval=1000 sync=false
gst-launch-1.0.exe videotestsrc ! video/x-raw, width=1280, height=720, framerate=25/1 ! x264enc ! rtph264pay ! udpsink port=1234 host=192.168.0.1
gst-launch-1.0.exe -v udpsrc port=1234 buffer-size=60000 ! application/x-rtp ! rtph264depay ! avdec_h264 ! queue ! avenc_mjpeg ! tcpserversink host=127.0.0.1 port=1235 sync=false
As for avenc_mjpeg, I assume the problem is with the video encoder: your pipeline parses the stream from raspivid, so MP probably can’t decode it without re-encoding the final output to MJPEG.
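Putting that together, a receive pipeline for your raspivid stream that ends in jpeg data might look like this. This is only a sketch: the port and RTP caps are taken from your sender above, and I have not tested it against MP myself.

```shell
# Receive the RTP/H.264 stream from raspivid, decode it, and re-encode
# to MJPEG so the final buffers are jpeg data, as MP's note requires.
# Port 9000 matches the udpsink in your original sender; adjust if needed.
gst-launch-1.0 -v udpsrc port=9000 buffer-size=60000 ! \
  "application/x-rtp,encoding-name=H264,payload=96" ! \
  rtph264depay ! h264parse ! avdec_h264 ! queue leaky=2 ! avenc_mjpeg
```

I believe MP attaches its own sink to whatever you paste into the HUD box, so inside MP you would enter only the pipeline description (everything after `gst-launch-1.0 -v`), but check that against the MP source.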
Hi Nikita and thanks for your help.
I have tried to follow your examples and I found a combination that works:
raspivid -t 999999 -h 720 -w 1080 -fps 25 -b 500000 -o - | nohup gst-launch-1.0 -e -vvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=18.104.22.168 port=5600
gst-launch-1.0 -v videotestsrc ! video/x-raw,format=BGRA,framerate=25/1 ! videoconvert ! autovideosink
I have tried the others you proposed, but I am not able to stream from the RPi Cam. Honestly, I am quite a newbie in GStreamer, so I don’t know exactly whether what I am doing is really correct. Can you have a look and let me know if I can improve?
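For completeness, this is the HUD pipeline I am now trying in MP against that raspivid sender on port 5600, built from your examples with avenc_mjpeg at the end per the note above the pipeline box. The caps and the final encoder are my own guesses, so this may be exactly where it fails:

```shell
# MP HUD pipeline (no gst-launch prefix, MP adds its own sink):
# receive RTP/H.264 on port 5600, decode, then re-encode to MJPEG
# so the final type is jpeg data as the note in MP asks.
udpsrc port=5600 buffer-size=60000 ! \
  "application/x-rtp,encoding-name=H264,payload=96" ! \
  rtph264depay ! h264parse ! avdec_h264 ! queue ! avenc_mjpeg
```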
APSync on RPi accomplishes that.