This weekend I have been playing with a full duplex transceiver version of the simple DVB setup, which allows me to use a single computer and a single USRP as both transmitter and receiver. By using separate daughterboards, one side can transmit while the other receives. With two such sets and two frequencies, the transceiver can be used for two-way video conferencing over the air 🙂
Simple time-lapse video with GStreamer and ffmpeg
While fooling around with GStreamer and my Logitech QuickCam Vision Pro 9000 webcam, I stumbled upon an unexpected spin-off: a simple and easy way to autonomously capture and render time-lapse videos.
One of the advantages of webcams over other digital still and video cameras is that they can be controlled from a computer, with the captured frames transferred to the computer in real time over the USB interface. This is pretty much the definition of a webcam, and the feature is very convenient for capturing time-lapse videos. Unfortunately, the image quality of webcams has not been anywhere near good enough to make nice-looking time-lapse videos, but this has changed over the last few years and keeps improving. The results presented in this article were captured with my Logitech QuickCam Vision Pro 9000 – one of the best UVC cameras out there.
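To give an idea of what such a setup can look like, here is a minimal sketch of a capture pipeline plus an assembly step. It assumes a V4L2 webcam on /dev/video0 and current gst-launch-1.0 syntax; the resolution, frame interval and file names are only placeholders:

    # Grab one 1280x720 frame every 10 seconds from the webcam and
    # write the frames out as numbered JPEG files.
    gst-launch-1.0 v4l2src device=/dev/video0 \
        ! video/x-raw,width=1280,height=720 \
        ! videorate ! video/x-raw,framerate=1/10 \
        ! videoconvert ! jpegenc \
        ! multifilesink location="frame%05d.jpg"

    # Assemble the captured frames into a 25 fps time-lapse clip with ffmpeg.
    ffmpeg -framerate 25 -i frame%05d.jpg -c:v libx264 -pix_fmt yuv420p timelapse.mp4

The videorate element does the actual decimation: it drops frames until the stream matches the 1/10 fps caps requested downstream, so the webcam can keep running at its native frame rate.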
A Weekend with GStreamer
This weekend was dedicated to learning and experimenting with GStreamer – an open source library and framework for constructing audio and video processing pipelines. Despite the weekend being spoiled by plenty of bad luck (power outages, Internet down, etc.), I managed to beat the hell out of Murphy and get some work done!
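For a taste of what a pipeline looks like, the gst-launch tool lets you chain elements together directly on the command line. The classic "hello world" below simply renders a test pattern (shown here with current gst-launch-1.0 syntax):

    # Render a synthetic test pattern in a window – the simplest possible pipeline.
    gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink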
My hidden agenda is of course finding a good audio/video library to accompany a software defined radio created using GNU Radio and the Universal Software Radio Peripheral (USRP), and to eventually be able to transmit real time high definition video over the air. While GNU Radio and the USRP can take care of everything related to software radio and RF, I am still looking for a good framework for flexible audio/video processing.
A simple way to get video in and out of GNU Radio
One of the things I want to do with GNU Radio and the USRP is video transmission over radio. For that I need a way to read video sources – including files, webcams and other video capture devices – and to display or save the video at the other end.
I suppose the right way to do this is to create dedicated signal sources and sinks for GNU Radio. This can be done either by “direct access”, i.e. reading the UVC device directly, or by using a higher-level library such as libvlc or the ffmpeg libraries (libav*). The latter approach has indeed been used for audio, and the code is available from the Comprehensive GNU Radio Archive Network (CGRAN) under Mediatools.
For this experiment, however, I decided to try something simpler than that – something I could try and wrap up within an evening: use VLC as the capture and playback application and connect it to GNU Radio through either the TCP or UDP interfaces.
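As a rough sketch of the idea: on the transmit side VLC captures and encodes the webcam and pushes an MPEG-TS to a UDP port where a GNU Radio UDP source block can pick it up, and on the receive side a GNU Radio UDP sink hands the stream back to VLC for playback. The addresses, ports, codec and bitrate below are only placeholders:

    # Transmit side: capture the webcam, encode it and stream it as an
    # MPEG-TS to UDP port 1234, where the GNU Radio flow graph listens.
    cvlc v4l2:///dev/video0 \
        --sout '#transcode{vcodec=mp2v,vb=2048,acodec=none}:std{access=udp,mux=ts,dst=127.0.0.1:1234}'

    # Receive side: play the MPEG-TS that the GNU Radio flow graph
    # delivers on UDP port 5678.
    cvlc udp://@:5678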