I need some assistance with visiond streaming. I just started using Maverick and really like the configuration options and how everything is in one place, very nice. I had been running APSync on an RPi 2 B, but was mainly using the MAVLink proxy capability rather than the camera streaming, since the streaming seemed choppy, which I just attributed to the RPi 2. In comes the RPi 3 B+. I decided to give Maverick a try as it seems to have the same capabilities. Once I had everything set up, I was still getting the choppy/stuttering video. The RPi CPU usage fluctuates between 20% and 70% for the visiond process. Here are the lines from the visiond log:
2018-07-12 13:23:25,889 - INFO - Starting maverick-visiond
2018-07-12 13:23:25,896 - INFO - Config read from /srv/maverick/config/vision/maverick-visiond.conf
2018-07-12 13:23:25,899 - DEBUG - V4l2 device input: Camera 1:2
2018-07-12 13:23:25,900 - DEBUG - driver: uvcvideo
2018-07-12 13:23:25,900 - DEBUG - card: USB2.0 PC CAMERA
2018-07-12 13:23:25,900 - DEBUG - Camera control: Brightness
2018-07-12 13:23:25,901 - DEBUG - Camera control: Contrast
2018-07-12 13:23:25,901 - DEBUG - Camera control: Saturation
2018-07-12 13:23:25,901 - DEBUG - Camera control: Hue
2018-07-12 13:23:25,902 - DEBUG - Camera control: Gamma
2018-07-12 13:23:25,902 - DEBUG - Camera control: Gain
2018-07-12 13:23:25,902 - DEBUG - Camera control: Power Line Frequency
2018-07-12 13:23:25,903 - DEBUG - Camera control: Sharpness
2018-07-12 13:23:25,903 - DEBUG - Camera format: Motion-JPEG
2018-07-12 13:23:25,903 - INFO - Creating stream object - camera:/dev/video0, stream:mjpeg, pixelformat:YUY2, encoder:h264, size:(640 x 480 / 30), output:rtsp, brightness:0
2018-07-12 13:23:25,904 - INFO - Attaching input 'v4l2': /dev/video0
2018-07-12 13:23:25,925 - INFO - Attaching stream 'mjpeg'
2018-07-12 13:23:26,064 - INFO - Attaching encoding 'h264'
2018-07-12 13:23:26,290 - INFO - No hardware encoder detected, using software x264 encoder
2018-07-12 13:23:26,389 - INFO - Attaching payload 'h264'
2018-07-12 13:23:26,433 - DEBUG - h264parse element created
2018-07-12 13:23:26,499 - DEBUG - Attaching h264pay to h264parse
2018-07-12 13:23:26,500 - INFO - Attaching output 'rtsp'
2018-07-12 13:23:26,517 - INFO - RTSP stream running at rtsp://0.0.0.0:5600/video
2018-07-12 13:23:26,522 - INFO - Pipeline: "v4l2-source /dev/video0 ! capsfilter 'image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)30/1' ! jpegparse ! jpegdec ! queue ! videoconvert ! x264-encode ! h264parse ! pay0"
2018-07-12 13:23:26,524 - INFO - Starting camera stream
I have tried several lower frame rates to alleviate the stuttering, but still can't overcome the issue.
The log says "No hardware encoder detected, using software x264 encoder". Is it not possible to use omxh264enc in the pipeline to enable hardware encoding? Would that even help my situation?
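For what it's worth, this is roughly where I'd expect a hardware encoder to slot in, just swapping the software x264 stage for omxh264enc. This is purely a guess on my part; I don't know whether the GStreamer install on the RPi 3 has the gst-omx plugin available, or whether visiond would pick it up:

```
v4l2src device=/dev/video0 ! image/jpeg,width=640,height=480,framerate=30/1 ! jpegparse ! jpegdec ! videoconvert ! omxh264enc ! h264parse ! rtph264pay name=pay0
```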
I have tried overriding the pipeline, but get an "Error constructing pipeline", even when I use the pipeline string from the log above. Could you provide an example of what the override pipeline should be, based on the log information above?
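For reference, here is my best attempt at translating the log's pipeline description into an override, substituting what I believe are the real GStreamer element names (v4l2src, x264enc, rtph264pay) for the friendly names the log prints (v4l2-source, x264-encode, pay0). I'm not certain this is the config key or the payloader name that visiond expects:

```
pipeline_override = v4l2src device=/dev/video0 ! image/jpeg,width=640,height=480,framerate=30/1 ! jpegparse ! jpegdec ! queue ! videoconvert ! x264enc ! h264parse ! rtph264pay name=pay0
```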
Any other ideas on how to tweak the pipeline to smooth out the streaming and lower the CPU usage? Since the camera is already providing MJPEG frames, would it be better to use something like MJPG-streamer?
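One idea I had along those lines: since the camera already outputs MJPEG, skip the decode/re-encode entirely and payload the JPEG frames straight over RTP with rtpjpegpay. Just a sketch, and I don't know whether visiond's RTSP output can serve an MJPEG payload or whether my ground station would accept it:

```
pipeline_override = v4l2src device=/dev/video0 ! image/jpeg,width=640,height=480,framerate=30/1 ! jpegparse ! rtpjpegpay name=pay0
```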
Really appreciate the work done on this project, and any help is welcome!