Bebop2 Video stream


I think that with a bit of work it could be feasible to get the Disco software working, but it would require patching the kernel at least, plus testing stuff here and there.



Hi, I am facing similar issues myself. Is there anywhere I can get “pimpctl” and its associated libraries for running the video stream from the Bebop2?



@tridge, or someone else who has a Disco, could you please post pimpctl from your Disco filesystem? We would like to test it on the Bebop 2 for sending/recording video.

Hi Julien,

I tried building gstreamer for the Bebop2 for this purpose, and while copying files I accidentally deleted some libraries. The Bebop2 now fails to boot. Is there a way I can flash it? I remember your video where you mentioned that a specific cable is needed. If I arrange that, is it possible to get the tools needed to flash the Bebop?
Thank you for your time and help.


pimpctl would not be enough; it is a control tool for the video processing pipeline, so it requires the video processing pipeline itself (based on gstreamer).
Building that pipeline from the bottom up is a long task.

About the deleted libraries: if it’s not booting, you are in trouble, and you may need a USB cable and some tricks.
Please refer to the ArduPilot section of the wiki on using the UART cable, and see what is preventing the Bebop from booting.
You may want to open a new topic here for that.


For reflashing the Bebop, see the following:

Hi @julienberaud, thank you for your answer regarding streaming with pimpctl. So, if I understand correctly, dragon-prog is a monolithic beast doing both flying and streaming at the same time? Sounds a bit risky. So these two functions cannot be run/started separately using the original firmware? For example, telling dragon to only operate the camera and leave the piloting to ArduPilot?
The bebop 2 filesystem has a file containing some streaming commands:

case "$1" in
    0) mt9v117_set_state 0x50 ;;  # Enter standby
    1) mt9v117_set_state 0x34 ;;  # Enter streaming
esac

it’s in /usr/bin/
What is the functionality of this file?

It looks like a utility for the vertical camera. I don’t know if or how it’s used; normally the vertical camera is programmed by the software.
So yes, exactly, you got it, it is a bit risky :slight_smile:
The more recent Parrot drones (Anafi) handle different functionalities in different processes for safety.
Dragon is too monolithic to be told to do only one of the two things, or at least it was the last time I worked at Parrot.
The thing is, the camera processing requires some data from the autopilot, as you can see in my implementation of the Disco video streaming.


My idea is to put an mencoder or ffmpeg binary compiled with gnueabihf on the Bebop2, and record from the v4l2 device under Linux.

Is there a parameter in Ardupilot which specifies which external program to start when a certain button is pressed on the controller? Or otherwise a possibility to run a custom executable started by the ardupilot under certain conditions?



Is there a parameter in Ardupilot which specifies which external program to start when a certain button is pressed on the controller? Or otherwise a possibility to run a custom executable started by the ardupilot under certain conditions?

You can take a look at the init scripts, as explained in my tutorial on starting ArduPilot, but be careful: one bad manipulation and your drone will be soft-bricked (no Linux console starting, for instance).

My idea is to place a mencoder or ffmpeg compiled with gnueabihf, and start recording from v4l2 device under linux on Bebop2.

This part is not really an easy one. You can take a look at the work by the Paparazzi UAV team, who managed to get something basically working, capturing frames here and there, so for fun it’s OK. Now, in order to get quality even comparable to the one on the ArduPilot Disco, you’d have to:

  • Understand how the Parrot7 SoC’s Image Signal Processor works
  • Implement both Auto-Exposure and Auto-White balance algorithms
  • Debayer the v4l2 stream and use the GPU to flatten the fish-eyed image
  • Encode it in software, which wouldn’t be of great quality on a Cortex A9

This is a lot of fun but doing this by reverse engineering would probably take a lot of time, and the quality would still not be the same as the one on the original video stream because some algorithms would still be missing. So I think the best option would be to make what is already on the Ardupilot Disco work with some help from the guys at Parrot.


@julienberaud, I want to leave all the difficult, power-consuming video processing tasks to my laptop, which has much more power than the Parrot CPU. Besides, why waste power on it while flying?
Maybe only do the debayer filter and auto-exposure, to send a colour-correct video preview to the ground station, and nothing more. Then I would do the deshaking, dewarping, de-fish-eyeing, all the difficult tasks, on the laptop CPU once on the ground :slight_smile:
I know, some of these functions cannot be done correctly without detailed input from the flight sensor data :man_facepalming:

Regarding running the ardupilot program on the Bebop2, no problem with that; I just don’t know how to start something else from ardupilot, during flight, without physically accessing the Parrot or logging in by telnet or adb. In fact, I don’t even know how to switch flight modes by pressing a button on the joystick (I don’t use a TX, I just connect via the Parrot wifi), so any help in these areas would be highly appreciated. I will have a look into Paparazzi UAV. :+1:

After looking into that, I think I will drop the camera-drone thing altogether. It’s just not worth it, and it’s also too difficult.
Besides, I can’t fly it beyond visual range anyway without special licensing. Thanks for your help.

You are welcome. If you need information please ask.

Using netcat, I am able to stream a video that is already saved on the Bebop2. Is there any way to stream /dev/video* from the Bebop using netcat, so as to get a live feed?

So far I’m able to run a FULL version of Ubuntu Linux on the Parrot Bebop, using roughly these steps:

Download the Ubuntu armhf cloud image from here:
Filename: ubuntu-14.04-server-cloudimg-armhf-root.tar.xz

Copy it to the Parrot filesystem by FTP.
If you are connected by wifi:
If you are connected by USB cable:
Create a new directory and place the file there.

Press the On/Off button on the Parrot 4 times.
Log in to the Parrot Bebop by telnet:
telnet or
inside the Parrot run:
/usr/bin/shpoison_cli --start_debug --auto_retry

Unpack the previously uploaded file using these commands:

cd /data/ftp/internal_000/ubuntu/
tar xJvf ./ubuntu-14.04-server-cloudimg-armhf-root.tar.xz

Start the Ubuntu chroot using these commands:

cd /data/ftp/internal_000/ubuntu/

mount -t proc proc proc/
mount --rbind /sys sys/
mount --rbind /dev dev/

chroot /data/ftp/internal_000/ubuntu/

Enjoy using Ubuntu Linux on your Parrot drone!

Additional commands for running gstreamer etc. from inside the Bebop:

cat /etc/apt/sources.list
#default is :
> deb trusty main restricted universe multiverse
> deb trusty-updates main restricted universe multiverse
> deb trusty-backports main restricted universe multiverse
> deb trusty-security main restricted universe multiverse

apt update
apt install [...packages..]

apt-get install qv4l2 # for testing the camera capabilities

qv4l2  /dev/video0 &
qv4l2  /dev/video1 &
v4l2-ctl --all -d /dev/video1

mplayer tv:// -nosound -v -tv driver=v4l2:width=672:height=600:input=0:device=/dev/video1:fps=20:outfmt=rgb24

mencoder tv:// -nosound -v -tv driver=v4l2:width=320:height=240:input=0:device=/dev/video0:outfmt=RGB32 -ovc lavc -lavcopts vcodec=mpeg4:turbo -o /home/test.mp4

gst-launch -v v4l2src device=/dev/video0 ! video/x-raw-yuv,format=\(fourcc\)UYVY,width=320,height=240 ! ffmpegcolorspace ! autovideosink
gst-launch -v v4l2src device=/dev/video0 ! video/x-raw-yuv,framerate=30/1,width=320,height=240 ! ffmpegcolorspace ! autovideosink

gst-launch -v v4l2src device=/dev/video1 ! video/x-raw-yuv,framerate=30/1,width=1344,height=2112 ! ffmpegcolorspace ! autovideosink

#on bebop = sender:
gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=320,height=240' ! ffmpegcolorspace ! smokeenc qmin=1 qmax=50 ! udpsink port=5000 host= sync=false

#on linux = receiver:
gst-launch-0.10  udpsrc port=5000 ! smokedec ! xvimagesink

#streaming to QGroundControl on the tablet, which address is, using the bottom camera:
gst-launch -v v4l2src device=/dev/video0 ! video/x-raw-yuv,framerate=30/1,width=320,height=240 ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host= port=5600

#streaming from front camera is clearly missing bayer2rgb filter:
gst-launch -v v4l2src device=/dev/video1 ! video/x-raw-yuv,framerate=30/1,width=672,height=600 ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host= port=5600
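A sketch of how the missing debayer step might be added, using the `bayer2rgb` element from GStreamer 0.10’s gst-plugins-bad. This is untested on the Bebop: the `bggr` pattern is a guess at the sensor’s Bayer order, and whether the caps negotiate at all is an assumption.

```shell
# Hypothetical: force Bayer caps on the front camera, debayer in software,
# then encode as in the pipeline above. format=bggr is an unverified guess.
gst-launch -v v4l2src device=/dev/video1 \
  ! video/x-raw-bayer,format=bggr,width=672,height=600,framerate=30/1 \
  ! bayer2rgb ! ffmpegcolorspace \
  ! x264enc pass=qual quantizer=20 tune=zerolatency \
  ! rtph264pay ! udpsink host= port=5600
```

Note that this does the debayering on the Cortex A9 in software, so even if it negotiates, the frame rate may not be sustainable at full resolution.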

#connecting by wifi to external addresses:
iwlist scanning| grep ESSID
wpa_passphrase 'MyWifi' 'PASSWORD' > /etc/wpa_supplicant.conf
cat /etc/wpa_supplicant.conf

wpa_supplicant -B -D wext -i eth0 -c /etc/wpa_supplicant.conf 

dhclient eth0 -r -v

Is there a way to do debayer using the gstreamer pipeline?

So you are not really running a “FULL version of Ubuntu” but rather doing a chroot into an Ubuntu rootfs.
The kernel is still the original Parrot kernel (3.4).
The problem is that you don’t only need to debayer, you also need to configure the camera sensor correctly. What you have now is probably the last settings left over from the original firmware.
Your idea of not debayering on the target is interesting, but I am not sure you understand why it is done that way originally.
At 30 fps, with images large enough to allow for digital stabilization, i.e. something like twice 1080p in order to obtain a stabilized 1080p output, you have roughly 1920 x 1080 ≈ 2 MPixels per frame, which at 2 bytes per pixel is about 4 MB per image, i.e. 120 MB/s.
How are you going to record or transfer that?
Not to mention that you have on board all the sensor data necessary to do a proper digital stabilization, which you would have to record as well.
So in the end, you should trust that the people who developed this product chose the best possible solution, i.e. using the on-board hardware to process the video stream.
Debayering with gstreamer means doing it in software, and that won’t keep up. Also, in order to use gstreamer to encode the data, you’ll need a plugin to drive the hardware encoder, because encoding full HD in software is not possible on such a small CPU.
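For anyone who wants to sanity-check the bandwidth figure above, it is a few lines of shell arithmetic (the 2 bytes per raw pixel is an assumption about the sensor output format):

```shell
# Raw-frame bandwidth estimate for the 1080p-with-stabilization-margin case.
frame_bytes=$((1920 * 1080 * 2))   # bytes per raw frame at 2 bytes/pixel
rate_bytes=$((frame_bytes * 30))   # bytes per second at 30 fps

echo "frame: $frame_bytes bytes (~4 MB)"
echo "rate:  $((rate_bytes / 1000000)) MB/s"
```

That comes out at about 124 MB/s sustained, far beyond what the Bebop’s wifi link or internal flash can absorb.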


Thank you @julienberaud for your feedback. On a slightly different note, can the Disco GPS module be used with other flight controllers, for example a Pixhawk?

What are the connectors TP6-TP11; which one is RX, TX, SCL, SDA?
This is all I have left of my drone :frowning: I want to use its GPS, baro and mag!
Do you maybe have a schematic?
Thanks in advance! You are the best!

Sorry I have no idea. I don’t think Parrot would agree to disclose its hardware schematics. You’ll have to figure that out yourself.

  • Non-reusable hardware.
  • Closed-source video-init and debayer algorithms.
  • Not to mention other things, like the closed communication protocols between the SC2 and the flying wing.

For these and similar reasons I will never buy from Parrot as long as I can avoid it.
Good that IBM didn’t go that way and instead opened the PC bus specs in the early 1980s, otherwise we would not be typing this on PCs. Experts would instead be using extra-expensive CP/M machines for their computing needs, and the Internet would be nowhere, just a distant concept.
Cheers! :wink:

Everything works well until …apt-get install qv4l2

Should Ubuntu inside the Bebop somehow get internet access?