[Bf-committers] Render API Design

Bobby Parker sh0rtwave at gmail.com
Sun Jun 3 14:58:56 CEST 2007

> On 3 Jun, 2007, at 1:55, Bobby Parker wrote:
> > When exporting data to renderman, I fetch all the objects in the
> > scene first
>
> This is from the start already the wrong approach. You should not want to
> fetch objects yourself, but instead ask Blender to prepare renderable
> data in a way that:

No argument! Believe me, I'm eager to change that. It's that way out of
necessity, and how I hate it. The way I envision the Render API doing it (and
how I would build the subsequent exporter) is that Blender would iterate over
the objects in the scene for me, and provide a hook or callback through which
my exporter could control how each object's data is exported.

When the render button is pushed, the Render API would invoke a "preamble"
function in the exporter that sets up the initial scene state, such as render
options (output location, format, initial world transform, etc.). That would
be the point to tell Blender how to prepare the data for you (i.e. which
modifiers to disable, and what kind of access to use for certain objects,
e.g. metaballs, where you either access the metaball data directly or get
the converted mesh for the whole object).
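To make the preamble idea concrete, here is a minimal sketch of what an exporter-side preamble might look like. Everything here is an assumption for illustration: the `ExporterPreferences` class, the `preamble` signature, and the fields on `scene_settings` are invented names, not an existing Blender or Render API.

```python
# Hypothetical sketch -- "ExporterPreferences" and the preamble()
# signature are assumptions, not a real Blender API.

class ExporterPreferences:
    """Tells Blender how to prepare data before export begins."""
    def __init__(self):
        self.disable_modifiers = []      # modifier types to leave unapplied
        self.metaballs_as_mesh = False   # True: converted mesh, False: raw data

def preamble(scene_settings):
    """Called once when the render button is pushed.

    scene_settings is assumed to carry the output location, format,
    and initial world transform mentioned above.
    """
    prefs = ExporterPreferences()
    # An exporter targeting a renderer without Blobby support would
    # ask Blender for the converted mesh instead of raw metaball data.
    prefs.metaballs_as_mesh = True
    return prefs
```

The point of returning a preferences object is that Blender, not the exporter, then does the data preparation, which matches the "pull" model discussed below.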

After all that initial setup, I would then expect the API to get to the
business of providing objects to the renderer. From an organizational
standpoint, I'd provide lights first, then worry about all of the other
objects, since the general convention in renderman is to setup the scene
lighting first, and then specify objects.

So the general flow would be simple (this is biased towards RenderMan):

Render button pushed ->
      scene preamble: Blender supplies the camera transform and anything else
that's needed to set up the scene.
      lights: for each light in the scene, Blender invokes a certain call in
the exporter.
      object iteration: for each object in the scene, Blender invokes a
certain call in the exporter to give it the data for that particular object.
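The flow above can be sketched as a short driver loop. This is a toy model of the proposed control flow only; the `render` function, the exporter method names, and the scene dictionary are all illustrative assumptions, not real Blender internals.

```python
# Sketch of the proposed flow: Blender drives the exporter through
# callbacks at defined stages. All names here are assumptions.

def render(scene, exporter):
    # 1. Scene preamble: camera transform and global setup.
    exporter.begin_scene(scene["camera"])
    # 2. Lights first, per the usual RenderMan convention.
    for light in scene["lights"]:
        exporter.export_light(light)
    # 3. Then every other object in the scene.
    for obj in scene["objects"]:
        exporter.export_object(obj)
    exporter.end_scene()

class RibExporter:
    """Toy exporter that just records the calls it receives, in order."""
    def __init__(self):
        self.calls = []
    def begin_scene(self, camera):
        self.calls.append(("begin", camera))
    def export_light(self, light):
        self.calls.append(("light", light))
    def export_object(self, obj):
        self.calls.append(("object", obj))
    def end_scene(self):
        self.calls.append(("end", None))
```

A real exporter would emit RIB in each callback instead of recording tuples; the structure of the loop is what matters here.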

I view the Render API not so much as an interface for an exporter to talk to
Blender, but rather the other way around: a way for Blender to actually
control an exporter, and provide it data tailored to how that exporter wants
it. Effectively it's a series of callbacks that happen at defined stages of
the rendering process.

> This first requirement is crucial to get right. Remember: we want to
> ensure other renderers support Blender, not the other way around.

Of course. I think the RenderMan case is especially well suited to this
because of the wide range of features that the various renderers support
(or don't support, as the case may be).

> Simple picture: if we rip apart the current render module in two
> sections (representing the two requirements above), you are close to
> having a render API.
> That is why the current yafray code is interesting to look at.

No argument. I looked at that myself.

> Also note that for Blender's own render future I'd like to get bucketing
> to work, and ways to create detailed geometry on the fly during rendering,
> and free it. That's why an API based on "pulling object data" is
> interesting. It can do this by default on several levels:

You might have a look at the RenderMan method of using procedural and
predefined geometry via ReadArchive.
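For readers unfamiliar with it: ReadArchive lets a RIB stream reference geometry stored in a separate archive file, which the renderer can load (and free) on demand, which is exactly the bucketing/on-the-fly use case mentioned above. A small sketch of an exporter emitting such a reference, where the helper name and file layout are assumptions for illustration:

```python
# Sketch: emitting a RenderMan ReadArchive call so heavy geometry lives
# in a separate RIB archive, loaded by the renderer only when needed.
# The helper name and paths are illustrative assumptions.

def emit_object(out, name, archive_path):
    """Append RIB lines that reference geometry kept in an archive."""
    out.append('AttributeBegin')
    out.append('Attribute "identifier" "name" ["%s"]' % name)
    # The renderer reads the archived geometry on demand, so the main
    # RIB stream stays small and geometry can be freed after use.
    out.append('ReadArchive "%s"' % archive_path)
    out.append('AttributeEnd')

rib = []
emit_object(rib, "Suzanne", "geometry/suzanne.rib")
```

A `Procedural` call goes one step further and generates the geometry programmatically at render time, but the archive form is the simpler starting point.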

> 2- Minimal geometry-specific data description (i.e. Mesh cage, Nurbs
> points)

YES! Precisely! This is what I mean when I'm babbling about metaballs. Of
course, some of those renderers don't support Blobbies (the RenderMan
"metaball"), so I'd have to resort to getting the converted mesh.

Probably as a first stab, the approach should be to figure out, in a general
sense, what the interface to the exporters would look like.
