[Bf-committers] Video texture for the BGE

Benoit Bolsee benoit.bolsee at online.be
Fri Oct 31 09:56:24 CET 2008


Hi all,

As some of you might know already, I've been working on integrating Ash's
video texture plugin into the BGE. I'm now about to release the code to
trunk and I wanted to inform the devs of the options I've taken before I
break every compilation system down ;-)
Sorry if this email is a bit long, but there is a lot to say.

The plugin is a Python API; there are no logic bricks yet. There are two
APIs in the original plugin, the old one and the new one, and I've only
ported the new API. I will not go over the API now, that's not the purpose
of this mail. You can find Ash's original documentation here:
 http://home.scarlet.be/~tsi46445/blender/blendVideoTex.html
I've kept the same name for the module, blendVideoTex, but I'm not happy
with it: it doesn't fit with the other module names. Maybe VideoTexture is
better. What do you think?
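
To give an idea of what the new API looks like, here is a minimal sketch
of playing a video file onto a material. The names (materialID, Texture,
VideoFFmpeg) are what I understand the new API to expose from Ash's docs
and from my port, so treat this as a sketch rather than a final reference:

    import GameLogic
    import blendVideoTex as vt   # may become "VideoTexture" if we rename it

    contr = GameLogic.getCurrentController()
    obj = contr.getOwner()

    # find the material that will receive the video and attach a Texture to it
    matID = vt.materialID(obj, 'MAVideoMat')   # material name is just an example
    GameLogic.tex = vt.Texture(obj, matID)     # keep a reference on GameLogic

    # use the new VideoFFmpeg class as image source and start playback
    GameLogic.tex.source = vt.VideoFFmpeg('movie.avi')
    GameLogic.tex.source.play()

    # the texture is updated by calling refresh() every frame, e.g. from an
    # Always sensor with pulse mode enabled
    GameLogic.tex.refresh(True)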

My contribution was to port the code to trunk and implement a
VideoFFmpeg Image class. The features that are currently supported are
listed below (a short example combining two of them follows the lists):
- Video file to texture
- Video http streaming to texture
- Video capture to texture (via VideoForWindows/VideoForLinux/DV1394)
- Viewport to texture 
- Buffer to texture
- Mixer 
- All the filters: bluescreen, gray, color matrix, color level, normal map
The features that are not yet supported:
- Render to texture
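
As a quick illustration of how the image sources and filters combine (same
caveat as above: the names come from Ash's docs and my port), grabbing the
viewport and running it through the gray filter looks roughly like this:

    # viewport to texture with a gray filter applied
    viewport = vt.ImageViewport()
    viewport.whole = True              # grab the whole viewport
    viewport.filter = vt.FilterGray()  # any of the filters above can be plugged in
    GameLogic.tex.source = viewport
    GameLogic.tex.refresh(True)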

I've done all my tests on Windows. I'll try to test on Linux but I mainly
count on the community to compile and test there. I've updated the MSVC
project files and also the SConscript/CMake/Makefile, but without testing
them.

To support all these features I had to tweak ffmpeg a bit:
- register capture devices in do_init_ffmpeg()
- enable network support in ffmpeg by removing --disable-network
configure option when compiling ffmpeg
I've recompiled the ffmpeg Windows DLLs and I will update them in SVN, but
the same should be done in the Linux repository of ffmpeg.
Maybe it's not acceptable to enable network support in ffmpeg (for
security reasons?). If so, let me know, but it would be a shame to drop
support for http streaming as ffmpeg supports it.
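
With network support enabled, http streaming needs nothing special on the
plugin side: the same VideoFFmpeg class simply takes a URL instead of a
file path (the URL below is of course just a placeholder):

    # same class as for video files, just pointed at a stream
    GameLogic.tex.source = vt.VideoFFmpeg('http://example.com/stream.avi')
    GameLogic.tex.source.play()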

The plugin depends on the Python library numpy
(http://numpy.scipy.org/). As far as the plugin is concerned, the C side
of numpy is just a bunch of headers, which I've added in
lib/windows/python/include/python2.5/numpy. The same should be done for
all other Python versions and all other OSes. Only the
numpy.core.multiarray package is used by the plugin, and only if you load
an image to user memory. I'm not sure how to deal with this dependency. It
seems that the user will need to install the numpy package, or maybe we
can put the multiarray.pyd file in the Blender distribution. What do you
think?
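
For reference, the only code path that pulls in numpy is copying an image
into user memory, roughly along these lines (the imageToArray name is the
one I remember from Ash's docs, so it may need double checking):

    # copy the current image into user memory; this is the only place
    # where numpy.core.multiarray is needed
    arr = vt.imageToArray(GameLogic.tex.source)   # name assumed from Ash's docs
    print arr.shape   # something like (height, width, 4) for an RGBA image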

What next?
Once the commit is in and all compilation problems are sorted out, I'll
add support for render to texture. Then some nice features can be added
rather easily:
- Viewport to video file
- Render to video file
- Viewport to video streaming
- Video logic bricks
- Drawing API

I'll commit today or this weekend unless there is a strong reason not to
do so. Let me know.

/benoit


