In a previous post I gave a few examples showing how simple text overlays can be added to any video stream in GStreamer. Now it’s time to look at compositing between two or more video streams, also called picture in picture. As you’ll see it is still very easy to achieve even when using nothing more than the gst-launch command line tool. First we look at some basic examples, then we finish with a more complex “Live from Pluto” video wall.
Simple Picture in Picture
A simple picture-in-picture effect can be created using the videomixer element. Videomixer uses alpha channels and the size information of the input streams to create a composited output stream. The following example puts a 200×150-pixel snow test pattern over a 640×360-pixel SMPTE pattern:
gst-launch -e videotestsrc pattern="snow" ! video/x-raw-yuv, framerate=10/1, width=200, height=150 ! videomixer name=mix ! ffmpegcolorspace ! xvimagesink
videotestsrc ! video/x-raw-yuv, framerate=10/1, width=640, height=360 ! mix.
It should look something like this:
According to the online documentation, the position and Z-order of each input stream can be adjusted through the GstVideoMixerPad properties; however, I do not yet know how to use these.
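For what it's worth, the newer GStreamer 1.x series lets gst-launch-1.0 set these pad properties directly from the command line using the child-proxy syntax. A sketch only, assuming a GStreamer 1.x installation (note the different tool name and caps; the pad names sink_0/sink_1 follow the order in which the branches are linked):

```shell
# Sketch, assuming GStreamer 1.x (gst-launch-1.0).
# Pad properties are set with the sink_N::property child-proxy syntax:
# here the second branch (the snow pattern) is placed at (25, 20) and
# kept on top with a higher zorder.
gst-launch-1.0 -e videomixer name=mix \
    sink_1::xpos=25 sink_1::ypos=20 sink_1::zorder=2 \
    ! videoconvert ! autovideosink \
  videotestsrc ! video/x-raw,width=640,height=360 ! mix. \
  videotestsrc pattern=snow ! video/x-raw,width=200,height=150 ! mix.
```

This pipeline requires a display and a GStreamer 1.x install, so treat it as a sketch rather than a drop-in replacement for the 0.10 examples in this post.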
We can move the small video around using the videobox element with a transparent border. The videobox is inserted between the source video and the mixer. The following pipeline moves the small snow pattern 25 pixels to the right and 20 pixels down:
gst-launch -e videotestsrc pattern="snow" ! video/x-raw-yuv, framerate=10/1, width=200, height=150 ! videobox border-alpha=0 top=-20 left=-25 ! videomixer name=mix ! ffmpegcolorspace ! xvimagesink
videotestsrc ! video/x-raw-yuv, framerate=10/1, width=640, height=360 ! mix.
Note that the “top” and “left” values are negative, which means that pixels are added to the incoming stream; a positive value means that pixels are cropped from it. If we had set “border-alpha” to 1.0, we would have seen a black border at the top and left of the child image, extending to the edges of the window.
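To see the cropping behaviour of positive values in isolation, here is a minimal sketch that cuts 20 pixels off the top and 25 off the left of a test pattern:

```shell
# Positive top/left values crop pixels from the incoming stream,
# so the displayed video here is 615x340 instead of 640x360.
gst-launch -e videotestsrc ! video/x-raw-yuv, framerate=10/1, width=640, height=360 ! \
    videobox top=20 left=25 ! ffmpegcolorspace ! xvimagesink
```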
Adding Transparency
The transparency of each input stream can be controlled by passing the stream through an alpha filter. This is useful for the main (background) image. For the child image we do not need an additional alpha filter, because the videobox has its own alpha property:
gst-launch -e videotestsrc pattern="snow" ! video/x-raw-yuv, framerate=10/1, width=200, height=150 !
videobox border-alpha=0 alpha=0.6 top=-20 left=-25 ! videomixer name=mix ! ffmpegcolorspace ! xvimagesink
videotestsrc ! video/x-raw-yuv, framerate=10/1, width=640, height=360 ! mix.
This was easy, wasn’t it?
A border can be added around the child image by inserting an additional videobox, where the top/left/bottom/right values are set to minus the desired border width and “border-alpha” is set to 1.0 (opaque):
gst-launch -e videotestsrc pattern="snow" ! video/x-raw-yuv, framerate=10/1, width=200, height=150 ! videobox border-alpha=1.0 top=-2 bottom=-2 left=-2 right=-2 ! videobox border-alpha=0 alpha=0.6 top=-20 left=-25 ! videomixer name=mix ! ffmpegcolorspace ! xvimagesink
videotestsrc ! video/x-raw-yuv, framerate=10/1, width=640, height=360 ! mix.
We are now ready to move on to a more complex example.
Live from Pluto – GStreamer Video Wall
Our robotic spaceship has landed on Pluto and is ready to transmit awesome video from the three onboard cameras CAM1, CAM2 and CAM3. We want to show the images on a video wall with a nice background, something like this:
We can accomplish this using picture-in-picture compositing but it is now a little more complicated than the simple examples shown earlier. We have:
- Three small video feeds of size 350×250 pixels
- Each small video feed has a textoverlay showing CAMx
- A large 1280×720-pixel background coming from a still image (JPG file)
- A textoverlay saying “Live from Pluto” at the bottom left of the main screen
- The three video feeds, CAM1, CAM2 and CAM3, are put on top of the main screen
The diagram for the pipeline is shown below. The text above the arrows specifies the pixel format of each video stream in the pipeline. If the pipeline fails to launch with an error that says something like “streaming task paused, reason not-negotiated (-4)”, it is very often due to an incompatible connection between two blocks; GStreamer is not always very good at pointing that out.
And here is the complete pipeline as entered on the command line:
gst-launch -e videomixer name=mix ! ffmpegcolorspace ! xvimagesink
videotestsrc pattern=0 ! video/x-raw-yuv, framerate=1/1, width=350, height=250 !
textoverlay font-desc="Sans 24" text="CAM1" valign=top halign=left shaded-background=true !
videobox border-alpha=0 top=-200 left=-50 ! mix.
videotestsrc pattern="snow" ! video/x-raw-yuv, framerate=1/1, width=350, height=250 !
textoverlay font-desc="Sans 24" text="CAM2" valign=top halign=left shaded-background=true !
videobox border-alpha=0 top=-200 left=-450 ! mix.
videotestsrc pattern=13 ! video/x-raw-yuv, framerate=1/1, width=350, height=250 !
textoverlay font-desc="Sans 24" text="CAM3" valign=top halign=left shaded-background=true !
videobox border-alpha=0 top=-200 left=-850 ! mix.
multifilesrc location="pluto.jpg" caps="image/jpeg,framerate=1/1" ! jpegdec !
textoverlay font-desc="Sans 26" text="Live from Pluto" halign=left shaded-background=true auto-resize=false !
ffmpegcolorspace ! video/x-raw-yuv,format=(fourcc)AYUV ! mix.
A few notes about the pipeline:
- For the large background image, which is a still frame, I wanted to use the imagefreeze block, which generates a video stream from a single image file. Unfortunately, this block is very new and not in the GStreamer packages that come with Ubuntu 10.04. Therefore, I had to do the trick with multifilesrc, making it read the same file over and over again.
- I had a hard time getting this pipeline to work. Eventually I found out that my problems were due to incompatible caps in the multifilesrc part of the pipeline. That is why there is an extra color conversion block between the textoverlay and the videomixer.
- The videobox elements are used to add a transparent border to the small video feeds, which moves the actual video by the number of pixels specified in the “top” and “left” parameters. I have since learned that this can be accomplished without videobox; see my Gstreamer cheat sheet.
- The frame rate is set to 1 fps because I have been doing this on a slow computer. Feel free to experiment with higher frame rates if you have a faster machine.
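If your GStreamer install does include imagefreeze (it ships in newer gst-plugins-good releases), the multifilesrc trick mentioned in the notes above can be replaced with something like this sketch:

```shell
# Sketch, assuming the imagefreeze element is available: it turns a
# single decoded still frame into a continuous video stream, so the
# JPG is read and decoded only once.
gst-launch -e filesrc location="pluto.jpg" ! jpegdec ! imagefreeze ! \
    video/x-raw-yuv, framerate=1/1 ! ffmpegcolorspace ! xvimagesink
```

The capsfilter after imagefreeze picks the output frame rate; in the full video wall pipeline this branch would end in mix. instead of the video sink.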
Here is a video showing our “Live from Pluto” video feed in action:
You can also watch the video on YouTube.
Simple Video Matrix
The Live From Pluto video wall was a really neat example, but in most cases we just need to create a simple video matrix where the incoming video streams are shown next to each other. The following GStreamer pipeline will show four 320×180 pixel video streams arranged in a 2×2 matrix resulting in a 640×360 output stream:
gst-launch -e videomixer name=mix ! ffmpegcolorspace ! xvimagesink
videotestsrc pattern=1 ! video/x-raw-yuv,framerate=5/1,width=320,height=180 ! videobox border-alpha=0 top=0 left=0 ! mix.
videotestsrc pattern=15 ! video/x-raw-yuv,framerate=5/1,width=320,height=180 ! videobox border-alpha=0 top=0 left=-320 ! mix.
videotestsrc pattern=13 ! video/x-raw-yuv,framerate=5/1,width=320,height=180 ! videobox border-alpha=0 top=-180 left=0 ! mix.
videotestsrc pattern=0 ! video/x-raw-yuv,framerate=5/1,width=320,height=180 ! videobox border-alpha=0 top=-180 left=-320 ! mix.
videotestsrc pattern=3 ! video/x-raw-yuv,framerate=5/1,width=640,height=360 ! mix.
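The offsets above follow a simple pattern: a tile at (row, col) gets top = -row × H and left = -col × W. A small shell sketch (a hypothetical helper doing pure string construction; it does not invoke gst-launch) that prints the videobox settings for each tile of the 2×2 grid:

```shell
#!/bin/sh
# Hypothetical helper: print the videobox settings for each tile of a
# 2x2 grid of 320x180 tiles. Pure string construction; gst-launch is
# not invoked here.
matrix_branches() {
  W=320 H=180
  for row in 0 1; do
    for col in 0 1; do
      top=$((0 - row * H))    # negative: pad above the tile
      left=$((0 - col * W))   # negative: pad to the left of the tile
      printf 'videobox border-alpha=0 top=%s left=%s ! mix.\n' "$top" "$left"
    done
  done
}
matrix_branches
```

Extending the loops (and the background size in the last branch) gives larger grids without working out the offsets by hand.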
Be sure to check out my Gstreamer cheat sheet, which is a continuously updated wiki page, and feel free to leave a comment if you have other tips and tricks.