You know, there are tonnes of undocumented things out there. Really, really cool technologies that should be getting used more are not getting used as much because they lack decent docs. And, to make matters worse, the developers naturally just want to get on and write the software. So, I would like to urge everyone who reads this (and I am thinking of you ‘orrible lot on Planet GNOME in particular) to write an article about something you have discovered that isn’t particularly well documented. This could be a technique, a technology, a skill or something else. Let’s get some Google juice pumping and get some extra docs to help people get started. 🙂
So, with this in mind, I am going to write a simple first guide to getting started with GStreamer using the excellent Python bindings. This tutorial should be of particular interest if you want to hack on Jokosher, Pitivi or Elisa as they, like many others, are written in Python and use GStreamer.
Ready? Right, let’s get started with the prerequisites. You will need the following:
- GStreamer 0.10
- Python
- PyGTK (often packaged as python-gtk2)
- The GStreamer Python bindings (often packaged as python-gst0.10)
You will also need a text editor. Now, some of you will want to have a big ‘ole argument about which one that is. Come back in four hours and we can continue. 😛
An overview
So, what is GStreamer and how does it help you make multimedia applications? Well, GStreamer is a multimedia framework that allows you to easily create, edit and play multimedia by building pipelines out of special multimedia elements.
GStreamer has a devilishly simple way of working. With GStreamer you create a pipeline, and it contains a bunch of elements that make that multimedia shizzle happen. This is very, very similar to pipelines on the Linux/BSD/UNIX command line. As an example, on the normal command line you may enter this command:
$ ps ax | grep "apache" | wc -l
This command first grabs a process listing, then returns all the processes called “apache”, and then feeds this list into the wc command, which counts the number of lines with the -l switch. The result is a number that tells you how many instances of “apache” are running.
From this we can see that each command is linked with the | symbol, and the output of the command on the left of the | is fed into the input of the command on the right of the |. This is eerily similar to how GStreamer works.
With GStreamer you string together elements, and each element does something in particular. To demonstrate this, find an Ogg file (such as my latest tune 😛 ), save it to a directory, cd to that directory in a terminal and run the following command:
$ gst-launch-0.10 filesrc location=jonobacon-beatingheart.ogg ! decodebin ! audioconvert ! alsasink
(you can press Ctrl-C to stop it)
When you run this, you should hear the track play. Let’s look at what happened.
The gst-launch-0.10 command can be used to run GStreamer pipelines. You just pass the command the elements you want to play one by one, and each element is linked with the ! symbol. You can think of the ! as the | in a normal command-line list of commands. The above pipeline contains a bunch of elements, so let’s explain what they do:
- filesrc – this element loads a file from your disk. Next to the element you set its location property to point to the file you want to load. More on properties later.
- decodebin – you need something to decode the file from the filesrc, so you use this element. This element is a clever little dude, and it detects the type of file and automatically constructs some GStreamer elements in the background to decode it. So, for an Ogg Vorbis audio file, it actually uses the oggdemux and vorbisdec elements. Just mentally replace the decodebin part of the pipeline with oggdemux ! vorbisdec and you get an idea of what is going on (see the expanded pipeline below).
- audioconvert – the kind of information in a sound file and the kind of information that needs to come out of your speakers are different, so we use this element to convert between them.
- alsasink – this element spits audio to your sound card using ALSA.
So, as you can see, the pipeline works the same as the command-line pipeline we discussed earlier – each element feeds into the next element to do something interesting.
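To make that concrete, here is the same pipeline with decodebin swapped for the elements it constructs behind the scenes for an Ogg Vorbis file:
$ gst-launch-0.10 filesrc location=jonobacon-beatingheart.ogg ! oggdemux ! vorbisdec ! audioconvert ! alsasink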
At this point you can start fiddling with pipelines and experimenting. To do this, you need to figure out which elements are available. You can do this by running the following command:
$ gst-inspect-0.10
This lists all available elements, and you can use the command to find out details about a specific element, such as the filesrc element:
$ gst-inspect-0.10 filesrc
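The full listing is long, so piping it through grep helps when you are hunting for something in particular. For example, to find elements with “audio” in their name or description:
$ gst-inspect-0.10 | grep audio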
More about GStreamer
OK, let’s get down and dirty with some of the GStreamer terminology. Some people get quite confused by terms such as pads and caps, not to mention bins and ghost pads. It is all rather simple to understand once you get your head around it, so let’s have a quick run around the houses and get to grips with it.
We have already discussed what a pipeline is, and that elements live on the pipeline. Each element has a number of properties. These are settings for that particular element (like knobs on a guitar amp). As an example, the volume element (which sets the volume of a pipeline) has properties such as volume, which sets the volume, and mute, which can be used to mute the element. When you create your own pipelines, you will set properties on a lot of elements.
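You can see properties in action on the command line too – they are set right next to the element, just like the location property earlier. A quick example (the 0.5 is just an arbitrary half-volume value):
$ gst-launch-0.10 audiotestsrc ! audioconvert ! volume volume=0.5 ! alsasink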
Each element has virtual plugs through which data can flow in and out, called pads. If you think of an element as a black box that does something to the information fed into it, on the left and right sides of the box would be sockets where you can plug in a cable to feed information into the box. This is what pads do. Most elements have an input pad (called a sink) and an output pad (called a src). Using my l33t ASCII art mad skillz, this is how our pipeline above looks in terms of the pads:
[src] ! [sink src] ! [sink src] ! [sink]
The element on the far left only has a src pad as it only provides information (such as the filesrc). The next few elements take information and do something to it, so they have sink and src pads (such as the decodebin and audioconvert elements), and the final element only receives information (such as the alsasink). When you use the gst-inspect-0.10 command to look at an element’s details, it will tell you which pads the element has.
So, we know we have pads, and data flows through them from the first element on the pipeline to the last, and now we need to talk about caps. Each element has particular caps, which say what kind of information the element takes (such as whether it takes audio or video). You can think of caps as the equivalent of the label on a power socket that says it takes electricity at a particular voltage.
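If you want to poke at caps from Python, here is a minimal sketch (my own illustration, using the pygst API we cover below) that grabs an element’s src pad and prints its caps:
#!/usr/bin/python

import pygst
pygst.require("0.10")
import gst

# grab the src pad from an audiotestsrc and print what it can produce
src = gst.element_factory_make("audiotestsrc", "audio")
pad = src.get_pad("src")
print pad.get_caps().to_string()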
Let’s now talk about bins. A lot of people get confused about bins, but they are pretty simple. A bin is just a convenient way of collecting elements together into a container. As an example, you may have a bunch of elements that decode a video and apply some effects to it. To make this easier to handle, you could put these elements into a bin (which is like a container) and then you can just refer to that bin to in turn refer to those elements. As such, the bin becomes an element in its own right. As an example, if your pipeline was a ! b ! c ! d, you could put them all into mybin, and when you refer to mybin, you are actually using a ! b ! c ! d. Cool, huh?
Finally, this brings us onto ghost pads. When you create a bin and shove a bunch of elements in there, the bin then becomes your own custom element which in turn uses those elements in the bin. To do this, your bin naturally needs its own pads that hook up to the elements inside the bin. This is exactly what ghost pads are. When you create a bin, you create the ghost pads and tell them which elements inside the bin they hook up to. Simple. 🙂
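To make this concrete, here is a minimal sketch of building a bin with ghost pads (the choice of audioconvert and volume here is mine, purely for illustration – the Python calls themselves are covered in the next section):
import pygst
pygst.require("0.10")
import gst

# build a bin containing audioconvert ! volume
mybin = gst.Bin("mybin")
convert = gst.element_factory_make("audioconvert", "convert")
vol = gst.element_factory_make("volume", "vol")
mybin.add(convert)
mybin.add(vol)
convert.link(vol)

# the ghost pads map the bin's outside plugs onto pads of the elements
# inside: the first element's sink pad and the last element's src pad
mybin.add_pad(gst.GhostPad("sink", convert.get_pad("sink")))
mybin.add_pad(gst.GhostPad("src", vol.get_pad("src")))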
Writing some code
To make this GStreamer goodness happen in a Python script, you only need to know a few core skills to get started. These are:
- Create a pipeline
- Create elements
- Add elements to the pipeline
- Link elements together
- Set it off playing
So, let’s get started. We are going to create a program that does the equivalent of this:
$ gst-launch-0.10 audiotestsrc ! alsasink
Here we use the audiotestsrc element, which just outputs an audible tone, and then feed that into an alsasink so we can hear it via the sound card. Create a file called gstreamertutorial-1.py and add the following code:
#!/usr/bin/python

import pygst
pygst.require("0.10")
import gst
import pygtk
import gtk

class Main:
    def __init__(self):
        # create a pipeline to hold our elements
        self.pipeline = gst.Pipeline("mypipeline")

        # create an audiotestsrc element and add it to the pipeline
        self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio")
        self.pipeline.add(self.audiotestsrc)

        # create an alsasink element and add it to the pipeline
        self.sink = gst.element_factory_make("alsasink", "sink")
        self.pipeline.add(self.sink)

        # link the source to the sink
        self.audiotestsrc.link(self.sink)

        # set the pipeline playing
        self.pipeline.set_state(gst.STATE_PLAYING)

start = Main()
gtk.main()
Download the code for this script here.
So, let’s explain how this works. First we import some important Python modules:

import pygst
pygst.require("0.10")
import gst
import pygtk
import gtk

Here the GStreamer modules (pygst and gst) are imported, and we also import the GTK modules. We use the GTK modules so we can use the GTK mainloop. A mainloop is the loop that keeps the program alive and processing events; we need one of those, so we borrow GTK’s.
Now let’s create a Python class and its constructor:

class Main:
    def __init__(self):
Now, to the meat. First create a pipeline:

self.pipeline = gst.Pipeline("mypipeline")

Here you create a pipeline that you can reference in your Python script as self.pipeline. The mypipeline bit in the brackets is a name for that particular instance of a pipeline. This is used in error messages and the debug log (more on the debug log later).
Now let’s create an element:

self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio")

Here you create the audiotestsrc element by using the element_factory_make() method. This method takes two arguments – the name of the element you want to create and, again, a name for that instance of the element. Now let’s add it to the pipeline:
self.pipeline.add(self.audiotestsrc)

Here we use the add() method that is part of the pipeline to add our new element.
Let’s do the same for the alsasink element:

self.sink = gst.element_factory_make("alsasink", "sink")
self.pipeline.add(self.sink)
With our two elements added to the pipeline, let’s now link them:

self.audiotestsrc.link(self.sink)

Here you take the first element (self.audiotestsrc) and use the link() method to link it to the other element (self.sink).
Finally, let’s set the pipeline to play:

self.pipeline.set_state(gst.STATE_PLAYING)

Here we use the set_state() method from the pipeline to set the pipeline to a particular state. There are a bunch of different states, but here we set it to PLAYING, which makes the pipeline run. Other pipeline states include NULL, READY and PAUSED.
Finally, here is the code that creates the Main instance and runs it:

start = Main()
gtk.main()
To run this script, set it to be executable and run it:

$ chmod a+x gstreamertutorial-1.py
$ ./gstreamertutorial-1.py

You should hear the audible tone through your speakers. Press Ctrl-C to cancel it.
Setting properties
Right, let’s now add a line of code to set a property for an element. Underneath the self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio") line, add the following:

self.audiotestsrc.set_property("freq", 200)
This line uses the set_property() method on the element to set a particular property. Here we are setting the freq property and giving it the value of 200. This property specifies what frequency the tone should play at. Add the line of code above (or download an updated file here) and run it. You can then change the value from 200 to 400 and hear the difference in tone. Again, use gst-inspect-0.10 to see which properties are available for a particular element.
You can change properties while the pipeline is playing, which is incredibly useful. As an example, you could have a volume slider that sets the volume property in the volume element to adjust the volume while the audio is being played back. This makes your pipelines really interactive when hooked up to a GUI. 🙂
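As a quick sketch of that idea (the names here are my own – self.volume assumes a volume element already linked into a playing pipeline, and the callback is assumed to be connected to a gtk.HScale):

# callback for a gtk.HScale; pokes the slider value into the
# volume element while the pipeline is playing
def on_volume_changed(self, widget):
    self.volume.set_property("volume", widget.get_value())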
Hooking everything up to a GUI
Right, so how do we get this lot working inside a GUI? Well, again, it’s fairly simple. This section will make the assumption that you know how to get a Glade GUI working inside your Python program (see this excellent tutorial if you have not done this before).
Now, go and download this glade file and this Python script. The Python script has the following code in it:
#!/usr/bin/python

import pygst
pygst.require("0.10")
import gst
import pygtk
import gtk
import gtk.glade

class Main:
    def __init__(self):
        # create the GUI bits and bobs
        self.wTree = gtk.glade.XML("gui.glade", "mainwindow")

        signals = {
            "on_play_clicked" : self.OnPlay,
            "on_stop_clicked" : self.OnStop,
            "on_quit_clicked" : self.OnQuit,
        }

        self.wTree.signal_autoconnect(signals)

        # create the GStreamer bits and bobs
        self.pipeline = gst.Pipeline("mypipeline")

        self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio")
        self.audiotestsrc.set_property("freq", 200)
        self.pipeline.add(self.audiotestsrc)

        self.sink = gst.element_factory_make("alsasink", "sink")
        self.pipeline.add(self.sink)

        self.audiotestsrc.link(self.sink)

        self.window = self.wTree.get_widget("mainwindow")
        self.window.show_all()

    def OnPlay(self, widget):
        print "play"
        self.pipeline.set_state(gst.STATE_PLAYING)

    def OnStop(self, widget):
        print "stop"
        self.pipeline.set_state(gst.STATE_READY)

    def OnQuit(self, widget):
        gtk.main_quit()

start = Main()
gtk.main()
In this script you basically create your pipeline in the constructor (as well as the code to present the GUI). We then have a few class methods that run when the user clicks the different buttons. The Play and Stop buttons execute methods that simply set the state of the pipeline to either PLAYING (Play button) or READY (Stop button).
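One small tweak worth considering (my own suggestion – it is not in the script above): set the pipeline back to the NULL state in OnQuit, so the sound device is released cleanly before the program exits:

def OnQuit(self, widget):
    # drop the pipeline to NULL so ALSA is released before quitting
    self.pipeline.set_state(gst.STATE_NULL)
    gtk.main_quit()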
Debugging
Debugging when things go wrong is always important. There are two useful techniques that you can use to peek inside what is going on in your pipelines within your GStreamer programs. You should first know how to generate a debug log file from your program. You do so by setting some environment variables before you run your program. As an example, to run the previous program and generate a debug log called log, run the following command:
$ GST_DEBUG=3,python:5,gnl*:5 ./gstreamertutorial.py > log 2>&1
This will generate a file called log that you can have a look into. Included in the file are ANSI codes to colour the log lines, which makes it easier to find errors, warnings and other information. You can use less to view the file, complete with the colours:
$ less -R log
It will mention that it is a binary file and ask if you want to view it. Press y and you can see the debug log. Inside the log it will tell you which elements are created and how they link together.
Onwards and upwards
So there we have it, a quick introduction to GStreamer with Python. There is of course much more to learn, but this tutorial should get you up and running. Do feel free to use the comments on this blog post to discuss the tutorial, add additional comments and ask questions. I will answer as many questions as I get time for, and other users may answer other questions. Good luck!
…oh and I haven’t forgotten. I want to see everyone writing at least one tutorial like I said at the beginning of this article. 🙂
You should submit this to Gnome Journal.
We have a little article (in Portuguese) obscenely illustrated (not illustrated with obscenity) here (http://www.cin.ufpe.br/~cinlug/drupal/?q=node/59), and we’re planning another one with code in Python and Ruby, so your article is of great help. Thanks.
You’re so right, Jono! We’ll do something about this, I’ve no doubt about that 🙂
Best regards…
MacSlow
Slick work man. Well done. Beard.
http://digg.com/programming/Getting_started_with_GStreamer_with_Python
Nice tutorial. Contrary to earlier beliefs, there doesn’t even seem to be any witchcraft involved in getting GStreamer to do anything. I feel inspired to try my hand at a Python-powered transcription application that is a little better suited than Amarok at playing interview data with a foot switch. Thanks Jono
Thanks Jono for making this clearer 🙂 If readers need not be scared of GStreamer, do you not fear listeners might be scared by this “Beating Heart” release?
jk, the tune is impressive.
Setanta – is there any chance you could translate that awesome looking tutorial into English? It would be awesome if you could. 🙂
spaetz – awesome – if this tutorial helps you get started with the incredible GStreamer framework, then I am happy. 🙂
Oh, and /me thwacks jb. 😛
Hi Jono,
Is it also possible to read the properties of GStreamer elements and trigger certain things? For example, the ‘videorate’ element exposes read-only properties like ‘dropped frames’ and ‘duplicated frames’. Is it possible to read these values and then trigger some other process?
Great work Jono, I’ve been looking for something like this for a long time, it’s helped me understand how gstreamer works a lot better than previously.
It would also be great if you’d follow this up with a tutorial dealing with more advanced stuff like metadata handling etc. (wink wink, nudge nudge :wink:)
The complete lack of documentation for gst-python is incredibly frustrating, and I’m sure its absence has caused a lot of potential media-related app developers to shy away from using the brilliant framework.
thanks thanks, http://www.afpy.org/python/forum_python/forum_general/488065637108
Great stuff! I was able to code a TV player (no tuning yet) using “v4lsrc” and “autovideosink” in Python. Now I would like to draw some lines and text on the live video feed.
I suppose there is an “element” or “bin” that permits one to do this? Any ideas will be appreciated.
actually, translating the C code of the official gst documentation into Python without knowing C or GLib (nor Python) works pretty well too! thx
minkwe – that is great news! I am really pleased! 🙂
Setanta – feel free to go ahead and translate the article. 🙂
could you give examples with decodebin please
Thanks
oops 😳 correction !!! could you give examples with bins and ghost pads please Thanks 🙂
flesse_bleu – when I get time I will see if I can do something. I still plan on making another tutorial about gnonlin.
To sum up bins, though: you first create one:

self.bin = gst.element_factory_make("bin", "mybin")

Then create elements like normal, add them to the bin with self.bin.add(), and link them like normal.

To create the ghost pads, you need to do three things:

(1) Grab the pads from the elements inside the bin that you want to map the ghost pads to. So if the elements in your bin are foo ! bar ! boogle, you would need to grab the sink pad from foo and the src pad from boogle. You can do this with get_pad() on each of those elements.

(2) You then need to create the ghost pads. This is done with the following lines:

sinkpad = gst.GhostPad("sink", foopad)
srcpad = gst.GhostPad("src", booglepad)

In the above lines, foopad and booglepad are the pads you grabbed from the elements.

(3) Finally, add the ghost pads to the bin:

self.bin.add_pad(sinkpad)
self.bin.add_pad(srcpad)

With that, your bin is complete. You can now add it where you need it in a pipeline. 🙂
Hope this helps. 🙂
yes, that helps me a lot, thank you again. The goal of my small project is to combine various audio files (mp3, flac, mpc…) into a single ogg/mp3 file, but with a crossfade of 15 seconds between each. I don’t know which module of GStreamer I must use, so I am looking for a solution.
flesse_bleu – to do this I recommend you use gnonlin. Gnonlin will look after decoding the source files, and gnonlin includes a special GnlOperation that can be used to process bits of audio – such as adding a fade to a particular portion. You would actually put a volume element inside a GnlOperation (a GnlOperation is a bin) and then you can use a GstControl to set the start and end point of the fade.
I plan on writing up a quick Gnonlin tutorial over the next few weeks so stay tuned. 🙂
s/Gstcontrol/GstController 🙂
🙂 yes yes yes 🙂 I don’t move 😆
So it was about Gnonlin-love… 🙄
Aneglus – ?
Hi Jono
I had a look in Dapper and I do not find GnlOperations:

gst-inspect-0.10 | grep gnl*
gnonlin: gnlfilesource: GNonLin File Source
gnonlin: gnlcomposition: GNonLin Composition
gnonlin: gnlsource: GNonLin Source
flesse – yeah GnlOperations are a feature in the current developer version of Gnonlin and will be in the next official release. So, you need to build Gnonlin yourself from CVS to try them out.
I think maybe it’s time for jono to write “The Definitive Guide to Gstreamer” published by O’Reilly Corp. 😆
No, really…I’m serious
Hey Jono,
nice work!!! I would like to add the article to mono-project.com with examples ported to C#. Is that ok with you?
Thanks
Khaled – sure! Just add attribution. 🙂
minkwe: textoverlay will overlay text and timeoverlay will overlay the time.
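For a quick test of those, something along these lines ought to work (an untested sketch – check gst-inspect-0.10 textoverlay for the exact property names):
$ gst-launch-0.10 videotestsrc ! textoverlay text="Hello world" ! ffmpegcolorspace ! xvimagesink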
Good tutorial, there’s certainly a serious lack of documentation related to using GStreamer in Python.
Now what I really need is the same tutorial but dealing with video, specifically how to get it to be displayed within a GTK window/object. I’ve been trying to figure this out for months…
And yes, Jono writing an O’Reilly guide to GStreamer is a very, very good idea!
Thank you, thank you, thank you… I had been looking for info on python-gstreamer for quite a while and this was the first thing that I actually found helpful.
Now, I saw this the day you posted it (or maybe the next one), so if I were only writing to thank you, I’d have done so quite earlier. (Not that I didn’t want to do so, I’m just lazy, and believed you could live without having a total stranger thank you :P)
So, what I wanted to ask you (or anyone else that reads this and knows the answer, of course 😉 is:
Is there a better/more-recommended way of seeing if a pad has certain caps than this:
?… I bet there is 😉
Cheers Jono, absolutely fantastic intro to GStreamer Python stuff. So easy that I in fact managed to make a cool siren sound with no Python knowledge!! wooo woooo wooo woooo wooo etc. Brings back memories of BEEP on the ZX Speccy!
Fantastic guide, even if you’re not looking to do anything musical, it’s a great intro to some basic ‘real-world’ python stuff. I like it a lot.
In fact, it’s inspired me to do a lot of Python. Python is blatantly cool.
Wow. This is an amazing tutorial. I’d been sure before that the GStreamer framework is a kewl thing, and your tutorial made me even more sure. %) I guess now I can finally try to get rid of the mencoder-sox-blah-blah-blah pipeline in my app, as that’s really hard to manage from the app…
Woo! 🙂
I have my gnonlin one cued up too. 🙂
Now this is how a good startup tutorial should be written! Thanks! 😀
great tut, but I was wondering how I can set a property of a plugin in Python. For example, I am trying this:

fileout = gst.element_factory_make("filesink", "sink")
fileout.location = "/home/Desktop/test.ogg"

but it doesn’t seem to work
Hey Jono,
I was planning on doing a small Rhythmbox plugin… I’m trying to dump the audio stream of Rhythmbox to a file… It seems to be relatively simple, using filesink,
just like kthakore did it above:

fileout = gst.element_factory_make("filesink", "sink")
fileout.location = "/home/Desktop/test.ogg"

but for some reason the ogg never gets made??
And how do I grab the audio stream of rhythmbox?
Thanks for the tutorial. Exactly what I was looking for.
Amazing tutorial. I can’t tell you how long I spent on Google trying to find some documentation about GStreamer in Python… Luckily I found this tutorial. I’m amazed that you got this far without any documentation.
Good introduction! It would be nice to see the code to make the sound file play from the GUI, or even in the CLI in python scripts…:cool:
Thanks a bunch! Seriously, this is an awesome tutorial!
I have run some testing Python scripts now and I love GStreamer. The only thing I have not found out is how I can give GStreamer multi-channel audio inputs. I have 6 audio files encoded with FLAC, one for every speaker.
I like to play with the files from xiph.org, which are very high quality: http://media.xiph.org/BBB/
Regards
Can anybody tell me what the difference is between a Pipeline and a Bin?
Download link for the full script is broken 🙁
less’s -R option doesn’t always work very well because it doesn’t know how to account for the spacing of many control characters properly
The file “gui.glade” needed for one of the examples seems to have gone awol. Any chance of restoring it? Thanks!
Hi Jono, thanks for this great startup tutorial. Just a little problem I want you to know about: the links you provide to download the sample code just lead nowhere (404 – not found).
hi, I tried to run gstreamertutorial-1.py, which is the first tutorial code, but after running it I couldn’t hear any sound. I checked the log file and it shows everything is fine… can you please tell me what the problem could be?
After running the code, you couldn’t hear any sound! OK… let me think… hmmmmmmm…….. yeah! Got it! See, I am sure you haven’t turned on your speakers! Please, put them on! Then try! If the problem still exists, then call me – I can make plenty of kinds of sounds with my mouth! Wish you all the best! [:)]
Dude, seriously, thanks. If you’re ever near Dallas, beers are on me.
hello, does anyone know the method to create dynamic pads? I use the mpegtsdemux plugin to demultiplex audio and video, so I need to handle mpegtsdemux’s dynamic pads to link them to the queues (queuevideo, queueaudio). Here is the pipeline I must code in Python or C:

gst-launch filesrc location=/home/hamadi/Bureau/test.ts ! mpegtsdemux name=demux program-number=12041 ! queue ! mpeg2dec ! ffmpegcolorspace ! xvimagesink demux. ! queue ! mad ! audioconvert ! audioresample ! alsasink

thank you
(gst-launch-0.10:6312): GLib-WARNING **: g_set_prgname() called multiple times
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0: A text/html decoder plugin is required to play this stream, but not installed.
Additional debug info:
gstdecodebin.c(986): close_pad_link (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0:
No decoder to handle media type 'text/html'
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL …
Freeing pipeline …

It returns an error like this. Can you help?
Probably you don’t have the Ogg decoder. Try it with an mp3 file.
Hello… I am doing a project building a media player using GStreamer. I find this very, very useful to start with. Do you have any tutorial or example for building a pipeline?
many thanx
What if I wanted to display video rather than play audio?
??? I want to record from the microphone and use “tcpclientsink” to send the voice to a remote computer. The problem is, how can I get the data from the microphone? thanks!
fixed! using a command line like:

gst-launch-0.10 alsasrc ! audiorate ! wavenc ! filesink location=testsound.wav
Properties are set with the set_property() method. Try fileout.set_property("location", "/home/Desktop/test.ogg")
Thanks Jono… it is awesome… really helpful for me to understand GStreamer 🙂
Great post.
Thank you for sharing and posting this article, man. Good job and keep going…
Wonderful guide! Congratulations! You explain easily what others cannot!
I really enjoyed the tutorial and I’ll try to post one like this one in the near future.
Thank you for always doing this for us rookies…
This debugging stuff is magical. Set the envars and boom, straight to stdout. Love it.
Thank You, Extremely helpful. 🙂
I would prefer if this was rewritten to use the new GObject-Introspection libraries. pygst is deprecated.
For my post below, the indentation didn’t work, so put 4 spaces before ‘class Main’ and 8 spaces before ‘def __init__(self)’ and every line after, until ‘self.pipeline.set_state’.
I got the error “No module named pygst”, but I have installed everything. Can you point me in the right direction?
I have the same error. Did you ever fix it?
Try installing python-gst0.10 via apt-get.
For anyone that’s interested, here’s a Python 3 compatible update, using GStreamer 1.0 and adjustments for gi introspection.
Code needs a bit of a cleanup, but the general idea is there.
You’ll want to copy the glade data out of the comment block in the bottom half of the file and name it ‘sound.glade’ to get things rolling.
Tested on an Ubuntu (Gnome) 14.04.3 virtual machine.
http://pastebin.com/7NLWZCMi
This is a big +1; huh! This article has given me knowledge that I never expected to find. I must dive into GStreamer to look for details.