[Bf-committers] GStreamer

Roger Wickes rogerwickes at yahoo.com
Tue Aug 25 19:52:28 CEST 2009


I find this thread very timely. Presently I am working a lot with VLC at the command line, and it uses the same philosophy, in that you stream video/images through filters and processors, and then either save the result to a file or direct it to a transport such as udp: or http:, a mobile device, or some other viewer/broadcaster. I find this concept easy to understand since it is, in principle, the same as the channel layering in the sequencer, where you start with channel 1, mix 2, mix 3, etc., and in principle the same as nodes, where you work with streams of video toward a final output solution.

In addition to saving video to a file, would using GStreamer allow us to stream video out to other output solutions, like the VSE does? In the VSE, you select a mux (container) and then the appropriate video and audio codecs, just like FFmpeg. Does GStreamer work the same way?
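For reference, VLC's command-line streaming expresses exactly this chain idea through its --sout option. A minimal sketch (the input name, codecs, and port here are illustrative assumptions, not from this thread):

```
# transcode the input to H.264 video / MP3 audio, mux into MPEG-TS,
# and serve it over HTTP on port 8080
vlc input.avi --sout '#transcode{vcodec=h264,acodec=mp3}:std{access=http,mux=ts,dst=:8080}'
```

The #transcode and std modules play the same roles as filter and sink elements do in a GStreamer pipeline.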





________________________________
From: neXyon <nexyon at gmail.com>
To: bf-blender developers <bf-committers at blender.org>
Sent: Tuesday, August 25, 2009 1:14:13 PM
Subject: [Bf-committers] GStreamer

Greetings!

During the recent discussion about ffmpeg, the question of alternatives
came up as well. I learned that there was a discussion in the past
about using GStreamer, but nobody can remember exactly why we decided
against it in the end, so I started investigating whether we could use
it.

The design of GStreamer is basically that you put so-called elements
into a pipeline, for example: filesource file.ogg -> oggdemuxer ->
vorbisdecoder -> audioconverter -> audioresampler -> alsaoutput. At
first I thought we would have to write our own elements (which should
be easily possible with the plugin system), but in the #gstreamer
channel on IRC they told me that there are App* elements that let
applications feed and pull data without writing any, so the
integration should be straightforward.
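For illustration, the example pipeline above corresponds to a
gst-launch-style pipeline description using GStreamer's standard
element names (file.ogg is just a placeholder):

```
filesrc location=file.ogg ! oggdemux ! vorbisdec ! audioconvert ! audioresample ! alsasink
```

Each "!" links one element's source pad to the next element's sink pad,
which is the same linking an application does programmatically.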

Regarding whether we would lose functionality by using gst instead of
ffmpeg: I found that there is gst-ffmpeg, so if we include that on all
platforms, we lose no supported codecs.

Moreover, gst also supports QT and a whole lot of other libs (ogg,
DirectShow (Windows), ...), so in case we decide to use gst, I would
propose using ffmpeg and QT through it. That would be a big advantage
for the video code in Blender, as we would have less to maintain. For
example, some people have poked me about writing a QT backend for
audaspace; if we had a gst backend, we could drop the ffmpeg and
sndfile backends, and I wouldn't have to write the QT backend either,
since we would then have everything in one place via gst.

I then asked some questions concerning deployment. It's clear that the
Blender zip should still run without requiring any dependencies to be
installed, and according to the people in #gstreamer that's not a
problem; it's even possible to link gst statically (they only noted
that we have to take care about GPL compatibility of the stuff we
link). The size of Blender shouldn't grow too much either, as the core
of gst is pretty small. Beyond that it depends on how many plugins we
ship, and gst-ffmpeg, for example, shouldn't add too much, since we
currently ship the ffmpeg libs anyway.

Let the discussion begin!

Regards,
Jörg
_______________________________________________
Bf-committers mailing list
Bf-committers at blender.org
http://lists.blender.org/mailman/listinfo/bf-committers



      

