[Bf-committers] Thoughts on Plugins...

Jonathan Merritt j.merritt at pgrad.unimelb.edu.au
Mon Jun 20 16:10:26 CEST 2005

Austin Benesh wrote:

> The plugin system could use some new methods. One proposal that I have
> been giving some thought to is the use of the plugin system for shade
> models.

I think this is entirely possible.  However, thought should be given to:
    - User interface.
    - Versatility of the shader API.
    - How the system would interface with the current Blender renderer
(mostly in rendercore.c).
    - Animation of shader parameters.
    - Scripting of shaders.
    - etc?

Whenever I have considered this problem, I always end up with a system
similar to RenderMan shaders (maybe I'm just biased).

An important question to ask, though, is whether it's best to try to
build a system like this into Blender itself, or to incorporate external
tools that already have programmable shading.

Also consider that many of the effects you might want to achieve can
probably be implemented most efficiently (and with the most control) as
multiple rendering passes.  Examples I can think of that I have
personally used are:
    1. Baking of an ambient occlusion pass into static texture maps. 
(It didn't work 100% correctly for the project I used it in, but the
theory was great! :-)
    2. Edge detection for toon shading using "region flags".
    3. Post-processed glow, again using shader output to cue the glow.
    4. Post-processed custom lenses (eg: an angular fisheye lens) using
environment map look-ups.  (see http://grub.homelinux.org/afel )
    5. Interaction between animated objects and surfaces, cued by data
from a separate render pass (eg: tracks left by a tank in a sandy
surface.  Also, the "cutting through steel" Blender tutorial uses this
very technique! :-)
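
As a concrete illustration of effect (3), here is a minimal sketch of
compositing a glow pass over a beauty pass.  It assumes two rendered
images are already available as float arrays in [0, 1] — a beauty pass
and a glow-mask pass (e.g. written out by a shader flag); the function
names and the simple box blur are my own invention, not any existing
Blender API.

```python
import numpy as np

def box_blur(img, radius):
    """Separable box blur -- a crude stand-in for a proper Gaussian."""
    size = 2 * radius + 1
    kernel = np.ones(size) / size
    for axis in (0, 1):
        img = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), axis, img)
    return img

def composite_glow(beauty, mask, radius=2, strength=0.5):
    """Add a blurred, scaled copy of the mask pass onto the beauty pass."""
    glow = box_blur(mask.astype(float), radius) * strength
    return np.clip(beauty + glow, 0.0, 1.0)

beauty = np.zeros((8, 8))       # stand-in for the rendered beauty pass
mask = np.zeros((8, 8))
mask[4, 4] = 1.0                # one glowing pixel in the mask pass
result = composite_glow(beauty, mask)
```

The point is that the glow never touches the renderer itself: it is a
separate pass combined afterwards, which is exactly what makes it
controllable.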

My personal feeling is that our effort is best spent extending Blender
for these more professional scenarios, which often require at least some
custom setup.  My current plan is to extend the Python
RenderMan wrapper classes that I've been using for my own projects, and
then Open Source them.  I'm not against trying to implement a shader
plugin system, but I think it will still be limited by the
"click-to-render" philosophy.  I'd say, if we want the best results,
why don't we concentrate on frameworks to enable the "render pipeline"
type of philosophy instead?
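
To give a flavour of the wrapper classes I mean: something that emits
RIB from plain Python objects.  This is a toy sketch — the class and
method names are invented for illustration, not the actual classes I've
been using — but the emitted RIB syntax is standard.

```python
class RibWriter:
    """Accumulates RenderMan Interface Bytestream (RIB) statements."""

    def __init__(self):
        self.lines = []

    def surface(self, shader, **params):
        # RIB parameter lists are written as "name" [value]
        args = " ".join(f'"{k}" [{v}]' for k, v in params.items())
        self.lines.append(f'Surface "{shader}" {args}')

    def sphere(self, radius):
        # Sphere radius zmin zmax thetamax -- a full sphere
        self.lines.append(f"Sphere {radius} {-radius} {radius} 360")

    def rib(self):
        return "\n".join(self.lines)

w = RibWriter()
w.surface("plastic", Ks=0.8)
w.sphere(1.0)
```

A scene graph exported this way can be handed to any RenderMan-compliant
renderer, which is where the "plug-out" flexibility comes from.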

There are numerous advantages to this mindset beyond simple shading
effects.  It could, for example:
    - improve the "modularity" of Blender, allowing it to work more
easily alongside Maya, Lightwave, etc., on different projects
    - make it much easier to use procedural elements (particle systems,
professional grass and hair, asteroid fields, you name it...)

What I'm thinking of is more of a "plug-out" system than a "plug-in"
system. :-)  What's needed is a way to extend Blender in a fully
customizable fashion, while retaining the ability to fast-preview your
work and quickly edit things in OpenGL.

My own "dream item" is the concept of a "Blender Python Object".  Such
an object would have:
    - the power to render itself in 3D views
    - the ability to define editmode-type interactions
    - its own actions, connected to button panels and menus
    - the ability to define itself in terms of a subset of built-in
primitives for rendering
    - any additional functions that the developer may devise (eg:
RenderMan output functionality)
But... maybe this sounds too much like "Blender3"? :-)
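
In Python terms, the "dream item" above might look something like the
following.  Every name here is hypothetical — a sketch of the shape of
the interface, not anything that exists in Blender:

```python
from abc import ABC, abstractmethod

class BlenderPythonObject(ABC):
    """Hypothetical interface for a scriptable scene object."""

    @abstractmethod
    def draw(self, view):
        """Render itself in a 3D view (e.g. emit OpenGL-style primitives)."""

    @abstractmethod
    def edit(self, event):
        """Handle an editmode-type interaction event."""

    @abstractmethod
    def to_primitives(self):
        """Describe itself as a subset of built-in render primitives."""

class PointCloud(BlenderPythonObject):
    """Toy implementation: a cloud of points drawn as tiny spheres."""

    def __init__(self, points):
        self.points = points

    def draw(self, view):
        return [("point", p) for p in self.points]

    def edit(self, event):
        pass  # e.g. grab/move the nearest point

    def to_primitives(self):
        return [("Sphere", p, 0.01) for p in self.points]
```

Extra methods (say, RenderMan output) would just be added alongside the
required ones, which is what keeps the object fully customizable.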

Jonathan Merritt.
