
You know, there are tonnes of undocumented things out there. Really, really cool technologies that should be getting used more are not getting used as much because they lack decent docs. And, to make matters worse, the developers naturally just want to get on and write the software. So, I would like to urge everyone who reads this (and I am thinking of you ‘orrible lot on Planet GNOME in particular) to write an article about something you have discovered that isn’t particularly well documented. This could be a technique, a technology, a skill or something else. Let’s get some Google juice pumping and get some extra docs to help people get started. 🙂

So, with this in mind, I am going to write a simple first guide to getting started with GStreamer using the excellent Python bindings. This tutorial should be of particular interest if you want to hack on Jokosher, Pitivi or Elisa as they, like many others, are written in Python and use GStreamer.

Ready? Right, let’s get started with the prerequisites. You will need the following:

  • GStreamer 0.10
  • Python
  • PyGTK (often packaged as python-gtk2)

You will also need a text editor. Now, some of you will want to have a big ‘ole argument about which one to use. Come back in four hours and we can continue. 😛

An overview

So, what is GStreamer and how does it help you make multimedia applications? Well, GStreamer is a multimedia framework that allows you to easily create, edit and play multimedia by creating pipelines of multimedia elements.

GStreamer has a devilishly simple way of working. With GStreamer you create a pipeline, and it contains a bunch of elements that make that multimedia shizzle happen. This is very, very similar to pipelines on the Linux/BSD/UNIX command line. As an example, on the normal command line you may enter this command:

$ ps ax | grep "apache" | wc -l

This command first grabs a process listing, then returns all the processes called “apache” and then feeds this list into the wc command which counts the number of lines with the -l switch. The result is a number that tells you how many instances of “apache” are running.

From this we can see that each command is linked with the | symbol, and the output of the command on the left of the | is fed into the input of the command on the right of the |. This is eerily similar to how GStreamer works.
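To drive the analogy home, here is a rough sketch of that same command-line pipeline built in plain Python with the subprocess module (nothing to do with GStreamer yet, just the piping idea):

```python
import subprocess

# Build the pipeline one process at a time: each process's stdout
# is plugged into the next process's stdin, just like the | operator.
ps = subprocess.Popen(["ps", "ax"], stdout=subprocess.PIPE)
grep = subprocess.Popen(["grep", "apache"], stdin=ps.stdout, stdout=subprocess.PIPE)
wc = subprocess.Popen(["wc", "-l"], stdin=grep.stdout, stdout=subprocess.PIPE)

# Close our copies of the intermediate pipes so the processes can
# see end-of-file properly when their upstream neighbour exits.
ps.stdout.close()
grep.stdout.close()

count = int(wc.communicate()[0].decode())
print(count)
```

The shape is the point here: a chain of workers, each feeding the next, which is exactly how a GStreamer pipeline hangs together.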

With GStreamer you string together elements, and each element does something in particular. To demonstrate this, find an Ogg file (such as my latest tune 😛 ), save it to a directory, cd to that directory in a terminal and run the following command:

$ gst-launch-0.10 filesrc location=jonobacon-beatingheart.ogg ! decodebin ! audioconvert ! alsasink

(you can press Ctrl-C to stop it)

When you run this, you should hear the track play. Let’s look at what happened.

The gst-launch-0.10 command can be used to run GStreamer pipelines. You just pass the command the elements you want to play one by one, and each element is linked with the ! symbol. You can think of the ! as the | in a normal command-line list of commands. The above pipeline contains a bunch of elements, so let’s explain what they do:

  • filesrc – this element loads a file from your disk. Next to the element you set its location property to point to the file you want to load. More on properties later.
  • decodebin – you need something to decode the file from the filesrc, so you use this element. This element is a clever little dude, and it detects the type of file and automatically constructs some GStreamer elements in the background to decode it. So, for an Ogg Vorbis audio file, it actually uses the oggdemux and vorbisdec elements. Just mentally replace the decodebin part of the pipeline with oggdemux ! vorbisdec and you get an idea of what is going on.
  • audioconvert – the kind of information in a sound file and the kind of information that needs to come out of your speakers are different, so we use this element to convert between them.
  • alsasink – this element spits audio to your sound card using ALSA.

So, as you can see, the pipeline works the same as the command-line pipeline we discussed earlier – each element feeds into the next element to do something interesting.

At this point you can start fiddling with pipelines and experimenting. To do this, you need to figure out which elements are available. You can do this by running the following command:

$ gst-inspect-0.10

This lists all available elements, and you can use the command to find out details about a specific element, such as the filesrc element:

$ gst-inspect-0.10 filesrc

More about GStreamer

OK, let’s get down and dirty with some of the GStreamer terminology. Some people get quite confused by some of the terms such as pads and caps, not to mention bins and ghost pads. It is all rather simple to understand when you get your head around it, so let’s have a quick run around the houses and get to grips with it.

We have already discussed what a pipeline is, and that elements live on the pipeline. Each element has a number of properties. These are settings for that particular element (like knobs on a guitar amp). As an example, the volume element (which sets the volume of a pipeline) has properties such as volume which sets the volume and mute which can be used to mute the element. When you create your own pipelines, you will set properties on a lot of elements.

Each element has virtual plugs through which data flows in and out, called pads. If you think of an element as a black box that does something to the information that is fed into it, on the left and right side of the box would be sockets in which you can plug in a cable to feed that information into the box. This is what pads do. Most elements have an input pad (called a sink) and an output pad (called a src). Using my l33t ASCII art mad skillz, this is how our pipeline above looks in terms of the pads:

[src] ! [sink src] ! [sink src] ! [sink]

The element on the far left only has a src pad as it only provides information (such as the filesrc). The next few elements take information and do something to it, so they have sink and src pads (such as the decodebin and audioconvert elements), and the final element only receives information (such as the alsasink). When you use the gst-inspect-0.10 command to look at an element’s details, it will tell you which pads the element has.
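To make the pads idea concrete, here is a toy analogy in plain Python (this is not the GStreamer API, just an illustration): each "element" is a generator function whose input argument plays the role of the sink pad and whose yielded values play the role of the src pad.

```python
# Toy analogy, NOT the GStreamer API: each "element" is a generator.
# Its input argument acts as the sink pad, its yielded values as the src pad.

def source():                # src-only element (like filesrc)
    for chunk in ["ab", "cd"]:
        yield chunk

def transform(stream):       # sink+src element (like decodebin or audioconvert)
    for chunk in stream:
        yield chunk.upper()  # stand-in for "decoding" the data

def sink(stream):            # sink-only element (like alsasink)
    return list(stream)

result = sink(transform(source()))
print(result)  # ['AB', 'CD']
```

Notice that source() takes no input (src pad only), transform() both consumes and produces (sink and src pads), and sink() only consumes (sink pad only), exactly mirroring the diagram above.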

So, we know we have pads, and data flows through them from the first element on the pipeline to the last element, and now we need to talk about caps. Each element has particular caps, and these say what kind of information the element handles (such as whether it takes audio or video). You can think of caps as the equivalent of the label on a power socket that says it takes electricity of a particular voltage.

Let’s now talk about bins. A lot of people get confused about bins, and they are pretty simple. A bin is just a convenient way of collecting elements together into a container. As an example, you may have a bunch of elements that decode a video and apply some effects to it. To make this easier to handle, you could put these elements into a bin (which is like a container) and then you can just refer to that bin to in turn refer to those elements. As such, the bin becomes an element. As an example, if your pipeline was a ! b ! c ! d, you could put them all into mybin, and when you refer to mybin, you are actually using a ! b ! c ! d. Cool, huh?

Finally, this brings us onto ghost pads. When you create a bin and shove a bunch of elements in there, the bin then becomes your own custom element which in turn uses those elements in the bin. To do this, your bin naturally needs its own pads that hook up to the elements inside the bin. This is exactly what ghost pads are. When you create a bin, you create the ghost pads and tell them which elements inside the bin they hook up to. Simple. 🙂
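Sticking with the plain-Python analogy (again, not the GStreamer API), a bin can be sketched as a wrapper that chains a bunch of element-like functions together and then behaves as a single element; the wrapper's own input and output play the role of the ghost pads:

```python
# Toy analogy, NOT the GStreamer API: element-like generator functions.
def double(stream):
    for x in stream:
        yield x * 2

def add_one(stream):
    for x in stream:
        yield x + 1

def make_bin(*elements):
    # The returned function's input and output act like ghost pads:
    # they forward data into the first inner element and out of the last.
    def binned(stream):
        for element in elements:
            stream = element(stream)
        return stream
    return binned

mybin = make_bin(double, add_one)    # behaves like double ! add_one
print(list(mybin([1, 2, 3])))        # [3, 5, 7]
```

From the outside, mybin looks like any other single element, even though it is really a little pipeline of its own, which is exactly the trick bins and ghost pads pull off.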

Writing some code

To make this GStreamer goodness happen in a Python script, you only need to know a few core skills to get started. These are:

  • Create a pipeline
  • Create elements
  • Add elements to the pipeline
  • Link elements together
  • Set it off playing

So, let’s get started. We are going to create a program that does the equivalent of this:

$ gst-launch-0.10 audiotestsrc ! alsasink

Here we use the audiotestsrc element, which just outputs an audible tone, and then feed that into an alsasink so we can hear it via the sound card. Create a file called gstreamertutorial-1.py and add the following code:


import pygst
pygst.require("0.10")
import gst
import pygtk
import gtk

class Main:
    def __init__(self):
        self.pipeline = gst.Pipeline("mypipeline")

        self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio")
        self.pipeline.add(self.audiotestsrc)

        self.sink = gst.element_factory_make("alsasink", "sink")
        self.pipeline.add(self.sink)

        self.audiotestsrc.link(self.sink)

        self.pipeline.set_state(gst.STATE_PLAYING)

start = Main()
gtk.main()

Download the code for this script here.

So, let’s explain how this works. First we import some important Python modules:

import pygst
pygst.require("0.10")
import gst
import pygtk
import gtk

Here the GStreamer modules (pygst and gst) are imported, and we also pull in the GTK modules. We use the GTK modules so we can use the GTK main loop. A main loop is the loop that sits waiting for events and dispatching them, keeping the program running; we need some kind of main loop for our pipeline to live inside, so we are using the GTK one.
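If a main loop sounds mysterious, here is a toy sketch of the idea in plain Python (not GTK, just an illustration): the loop repeatedly pulls queued events and dispatches them to handlers until something tells it to stop.

```python
import collections

# Toy main loop, NOT GTK: a queue of pending events and a dispatcher.
events = collections.deque(["play", "stop", "quit"])
handled = []

def dispatch(event):
    # A real main loop would call the right handler for each event;
    # here we just record it. Returning False ends the loop.
    handled.append(event)
    return event != "quit"

running = True
while running and events:
    running = dispatch(events.popleft())

print(handled)  # ['play', 'stop', 'quit']
```

gtk.main() is doing essentially this, except the events come from the window system and your GStreamer pipeline rather than a hard-coded list.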

Now let’s create a Python class and its constructor:

class Main:
    def __init__(self):

Now, to the meat. First create a pipeline:

self.pipeline = gst.Pipeline("mypipeline")

Here you create a pipeline that you can reference in your Python script as self.pipeline. The mypipeline bit in the brackets is a name for that particular instance of a pipeline. This is used in error messages and the debug log (more on the debug log later).

Now let’s create an element:

self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio")

Here you create the audiotestsrc element by using the element_factory_make() method. This method takes two arguments – the name of the element you want to create and, again, a name for that instance of the element. Now let’s add it to the pipeline:

self.pipeline.add(self.audiotestsrc)
Here we use the add() method that is part of the pipeline to add our new element.

Let’s do the same for the alsasink element:

self.sink = gst.element_factory_make("alsasink", "sink")
self.pipeline.add(self.sink)

With our two elements added to the pipeline, let’s now link them:

self.audiotestsrc.link(self.sink)
Here you take the first element (self.audiotestsrc) and use the link() method to link it to the other element (self.sink).

Finally, let’s set the pipeline to play:

self.pipeline.set_state(gst.STATE_PLAYING)
Here we use the set_state() method from the pipeline to set the pipeline to a particular state. There are a bunch of different states, but here we set it to PLAYING which makes the pipeline run. Other pipeline states include NULL, READY and PAUSED.

Lastly, here is the code that creates the Main instance and runs the main loop:

start = Main()
gtk.main()

To run this script, set it to be executable and run it:

$ chmod a+x gstreamertutorial-1.py
$ ./gstreamertutorial-1.py

You should hear the audible tone through your speakers. Press Ctrl-C to cancel it.

Setting properties

Right, let’s now add a line of code to set a property for an element. Underneath the self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio") line add the following line:

self.audiotestsrc.set_property("freq", 200)

This line uses the element’s set_property() method to set a particular property. Here we are setting the freq property and giving it the value of 200. This property specifies what frequency the tone should play at. Add the line of code above (or download an updated file here) and run it. You can then change the value from 200 to 400 and hear the difference in tone. Again, use gst-inspect-0.10 to see which properties are available for that particular element.

You can change properties while the pipeline is playing, which is incredibly useful. As an example, you could have a volume slider that sets the volume property in the volume element to adjust the volume while the audio is being played back. This makes your pipelines really interactive when hooked up to a GUI. 🙂

Hooking everything up to a GUI

Right, so how do we get this lot working inside a GUI? Well, again, it’s fairly simple. This section will make the assumption that you know how to get a Glade GUI working inside your Python program (see this excellent tutorial if you have not done this before).

Now, go and download this glade file and this Python script. The Python script has the following code in it:


import pygst
pygst.require("0.10")
import gst
import pygtk
import gtk
import gtk.glade

class Main:
    def __init__(self):
        # Create GUI bits and bobs
        self.wTree = gtk.glade.XML("gui.glade", "mainwindow")

        signals = {
            "on_play_clicked" : self.OnPlay,
            "on_stop_clicked" : self.OnStop,
            "on_quit_clicked" : self.OnQuit,
        }
        self.wTree.signal_autoconnect(signals)

        # Create GStreamer bits and bobs
        self.pipeline = gst.Pipeline("mypipeline")

        self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio")
        self.audiotestsrc.set_property("freq", 200)
        self.pipeline.add(self.audiotestsrc)

        self.sink = gst.element_factory_make("alsasink", "sink")
        self.pipeline.add(self.sink)

        self.audiotestsrc.link(self.sink)

        self.window = self.wTree.get_widget("mainwindow")
        self.window.show_all()

    def OnPlay(self, widget):
        print "play"
        self.pipeline.set_state(gst.STATE_PLAYING)

    def OnStop(self, widget):
        print "stop"
        self.pipeline.set_state(gst.STATE_READY)

    def OnQuit(self, widget):
        gtk.main_quit()

start = Main()
gtk.main()

In this script you basically create your pipeline in the constructor (as well as the code to present the GUI). We then have a few different class methods for when the user clicks on the different buttons. The Play and Stop buttons in turn execute the class methods which in turn just set the state of the pipeline to either PLAYING (Play button) or READY (Stop button).


Debugging

Debugging when things go wrong is always important. There are two useful techniques that you can use to peek inside what is going on in your pipelines within your GStreamer programs. You should first know how to generate a debug log file from your program. You do so by setting some environment variables before you run your program. As an example, to run the previous program and generate a debug log called log, run the following command:

$ GST_DEBUG=3,python:5,gnl*:5 ./gstreamertutorial.py > log 2>&1

This will generate a file called log that you can have a look into. Included in the file are ANSI codes to colour the log lines to make it easier to find errors, warnings and other information. You can use less to view the file, complete with the colours:

$ less -R log

It will mention it is a binary file and ask if you want to view it. Press y and you can see the debug log. Inside the log it will tell you which elements are created and how they link together.
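If you would rather read the log in an editor that doesn’t understand the colour codes, a small Python helper can strip them out. This is a hedged sketch (the strip_ansi function is my own, not part of GStreamer); it removes the standard ANSI colour escape sequences:

```python
import re

# ANSI colour codes look like ESC [ <numbers and semicolons> m,
# e.g. "\x1b[32m" to turn text green and "\x1b[0m" to reset it.
ANSI_ESCAPE = re.compile(r'\x1b\[[0-9;]*m')

def strip_ansi(text):
    """Remove ANSI colour escape sequences from a string."""
    return ANSI_ESCAPE.sub('', text)

# A made-up example line of the kind a coloured debug log contains:
line = '\x1b[32mgst\x1b[0m: created element "audiotestsrc"'
print(strip_ansi(line))  # gst: created element "audiotestsrc"
```

You could read the whole log file, run each line through strip_ansi(), and write out a plain-text copy.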

Onwards and upwards

So there we have it, a quick introduction to GStreamer with Python. There is of course much more to learn, but this tutorial should get you up and running. Do feel free to use the comments on this blog post to discuss the tutorial, add additional comments and ask questions. I will answer as many questions as I get time for, and other users may answer other questions. Good luck!

…oh and I haven’t forgotten. I want to see everyone writing at least one tutorial like I said at the beginning of this article. 🙂
