[Bf-committers] Renderman

Ton Roosendaal ton at blender.org
Fri Nov 11 14:02:28 CET 2005

Hi Jon,

Thanks for the very good summary below. :)
Also, as Robert Wenzlaff replied, the proper solution might be to
refactor the entire convertBlenderScene.c (which generates Blender's
render data) and turn it into a more structured approach with hooks for
exporting built in. I wouldn't advise using Python for this; render data
is not by definition accessible from Python. This goes for metaballs,
hair strands, particles, modifiers, and so on. It is also very
important that a framework is created that puts all objects in place
100% correctly, with the right timing and with the correct
transformations and deformations set.

Related to this topic: Blender itself could use object-level
caching (bucketing) for rendering as well... meaning it would only
generate the renderfaces when a specific object is accessed (visible or
being traced). I would love to see such a structure, but it would mean a
giant cleanup of Blender's render system too.
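To make the idea concrete, here is a minimal sketch of that object-level caching pattern: renderfaces are generated only when an object is first accessed, then reused. All names here (RenderObject, renderfaces, the list-of-faces mesh data) are illustrative stand-ins, not Blender's actual internals.

```python
class RenderObject:
    """Hypothetical render-time wrapper around a scene object."""

    def __init__(self, name, mesh_data):
        self.name = name
        self.mesh_data = mesh_data
        self._renderfaces = None   # not generated yet

    def renderfaces(self):
        # Generate on first access, then return the cached result.
        if self._renderfaces is None:
            self._renderfaces = self._generate()
        return self._renderfaces

    def _generate(self):
        # Stand-in for the expensive tessellation / modifier
        # evaluation that convertBlenderScene.c performs eagerly.
        return [tuple(face) for face in self.mesh_data]


scene = [RenderObject("Cube", [[0, 1, 2, 3]]),
         RenderObject("Suzanne", [[0, 1, 2], [2, 3, 0]])]

# Only objects that are actually accessed (visible or traced)
# pay the generation cost; "Suzanne" stays untessellated here.
faces = scene[0].renderfaces()
```

The point is purely the access pattern: the renderer pulls geometry per object on demand, instead of the converter pushing everything up front.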

So; here we get stuck a bit between quality demands and available
developers... for a new developer I wouldn't really advise doing this,
although I'd be happy to review design proposals for a refactored
convertBlenderScene.c. I also might tackle a couple of the
aforementioned issues for Orange, though at this time that's unknown...


On 4 Nov, 2005, at 2:12, Jonathan Merritt wrote:

> Hi Everyone,
> Some of my core notes on this topic that have accrued over time... :-)
> 1. Before getting started, be clear on what the goals of "the  
> exporter" are.  For example, typically, you'd want to use RenderMan  
> instead of (say) the internal renderer for:
>    - the RenderMan shading engine (custom shaders, micropolygon  
> displacement, etc.)
>    - greater control over your render pipeline (eg: custom render  
> passes for AO, etc.)
>    - greater control over render parameters (shading rate, sampling,  
> etc.)
>    - camera effects (motion blur, DOF, etc.)
> In short, most of the RenderMan advantages are related to its  
> versatility.  See Paul's presentation from the Blender conference:
>    http://www.blender.org/cms/Paul_Gregory.689.0.html
> 2. The requirements of a RenderMan exporter are different from (in  
> many ways a *superset* of) the requirements of Yafray.
>    - RenderMan is best given access to underlying primitives to be  
> rendered, NOT the VlakRen ("render faces") that Yafray uses.  I CAN'T  
> emphasise that enough!!!!!!  The reason is that for primitives like  
> subdiv surfaces or NURBS, a RenderMan renderer will, overall, be able  
> to make better decisions about the level-of-detail to use in dicing  
> out these primitives than Blender (or the user) can.
>    - For motion blur, there are two types: transformation motion blur  
> and object motion blur.  Transformation motion blur is relatively  
> simple, but object motion blur (in which the individual vertices of  
> the object can move) demands consistent topology within all motion  
> sub-frames.  There needs to be a way to guarantee consistent sub-frame  
> topology *within* Blender (anything done to check it on the exporter  
> side can only ever be a rough and nasty kludge).
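The sub-frame topology problem can be illustrated with a small check like the one below. Object motion blur needs the same vertex count and the same face structure at every motion sample; a check like this on the exporter side is exactly the "rough and nasty kludge" being described, since the real guarantee has to come from inside Blender. The data layout is hypothetical.

```python
def consistent_topology(samples):
    """samples: list of (vertices, faces) pairs, one per motion
    sub-frame. Returns True if every sample has the same vertex
    count and the same per-face vertex counts as the first."""
    first_verts, first_faces = samples[0]
    for verts, faces in samples[1:]:
        if len(verts) != len(first_verts):
            return False
        if [len(f) for f in faces] != [len(f) for f in first_faces]:
            return False
    return True


# Same topology, vertices merely moved: fine for motion blur.
frame_a = ([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
frame_b = ([(0, 0, 1), (1, 0, 1), (0, 1, 1)], [(0, 1, 2)])

# Topology changed between sub-frames: motion blur would break.
frame_c = ([(0, 0, 0), (1, 0, 0)], [(0, 1)])
```

A vertex that appears or disappears mid-frame has no counterpart to interpolate against, which is why the renderer rejects (or mangles) such motion samples.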
>    - For the shading engine, you need to consider more complex cases  
> from the outset, especially if you are trying to "simulate" Blender's  
> internal shading engine.  Consider the case of multiple applied  
> textures, and figure out a way of handling the data that needs to be  
> made available from the start.  In summary, although writing a one-off  
> shader to simulate literally any one Blender material is uber-simple,  
> writing a generic shader to simulate all Blender materials is  
> unbelievably difficult.  I'd recommend not trying to simulate  
> Blender's internal shaders (except for lights and maybe a very simple  
> surface shader), but instead allow custom shaders to be applied.  For  
> custom shaders, though, you need to consider how surface parameters  
> (vertex/varying colors, uv coords, etc.) are going to be named and  
> passed.
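For the parameter-naming question, one possible convention (purely illustrative; neither the names nor the mapping come from any existing exporter) is to derive a predictable primitive-variable declaration from each Blender data layer, so custom shaders can rely on stable parameter names:

```python
def declare_primvars(uv_layers, vcol_layers):
    """Map Blender-style layer names to RenderMan-style primitive
    variable declarations. The 'uv_'/'vcol_' prefixes and the
    storage classes chosen here are assumptions, not a standard."""
    decls = {}
    for name in uv_layers:
        decls["uv_" + name] = "facevarying float[2]"
    for name in vcol_layers:
        decls["vcol_" + name] = "facevarying color"
    return decls


decls = declare_primvars(["UVMap"], ["Col"])
```

Whatever the exact scheme, the important part is that it is documented and deterministic, so a shader author knows what `uv_UVMap` or `vcol_Col` will contain without inspecting the generated RIB.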
> 3. Methodology.  Two main options have come up over time:
>    - Expand BPython to make it better suited for render export (eg:  
> add things to query consistency of mesh topology, etc.) and write a  
> Python exporter.  In this case, a "one-click" exporter should be  
> written on top of a well thought-out base API (written in Python).   
> The base API could then be used by anyone needing more advanced  
> features (custom render passes, etc.).  The lack of "underlying  
> infrastructure" is my main criticism of the existing Python exporters.  
>  Also, use something like cgkit:
>    http://cgkit.sourceforge.net/
> to make the RenderMan API calls from Python.  DON'T DON'T DON'T  
> hard-code the API calls as strings to generate a RIB file!  Tempting  
> as this may be, it'll bite you hard one day when you have some  
> enormous file that you want to encode as a binary RIB, or when you  
> want to add debug hooks to the API calls themselves.
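A toy stand-in shows why API calls beat hand-built RIB strings: if every request goes through a single emit point, you can later swap in binary RIB encoding or attach debug hooks without touching the exporter code. (cgkit provides a real RenderMan binding; this class is just a sketch of the shape of the idea, not its API.)

```python
class RiStream:
    """Minimal illustrative RI-style call recorder. One central
    _emit() means encoding and debugging stay in one place."""

    def __init__(self):
        self.lines = []    # ASCII RIB output (could become binary)
        self.hooks = []    # debug hooks: loggers, validators, ...

    def _emit(self, request, *args):
        for hook in self.hooks:
            hook(request, args)
        self.lines.append(" ".join([request] + [repr(a) for a in args]))

    def WorldBegin(self):
        self._emit("WorldBegin")

    def Sphere(self, radius, zmin, zmax, thetamax):
        self._emit("Sphere", radius, zmin, zmax, thetamax)

    def WorldEnd(self):
        self._emit("WorldEnd")


ri = RiStream()
calls = []
ri.hooks.append(lambda req, args: calls.append(req))  # a debug hook
ri.WorldBegin()
ri.Sphere(1.0, -1.0, 1.0, 360.0)
ri.WorldEnd()
```

With string concatenation scattered through the exporter, neither the binary-RIB switch nor the hook would be possible without a rewrite.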
>    - Write a renderer exporter API (ie: not just for RenderMan, but  
> for any external renderer).  In this case, the API needs to be well  
> thought-out.  Simply providing (for example) an API that gives Yafray  
> everything it currently requires would *NOT* solve any of the  
> important problems for a RenderMan exporter.  Instead, any API should  
> be designed as very generic from the ground up.
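The "generic renderer API" option might take a shape like the following: renderers implement a common interface and Blender pushes scene data through it. Every name here is hypothetical, invented for illustration; it is not an actual or proposed Blender API.

```python
from abc import ABC, abstractmethod


class RenderExporter(ABC):
    """Hypothetical interface any external-renderer backend
    (RenderMan, Yafray, ...) would implement."""

    @abstractmethod
    def begin_frame(self, frame): ...

    @abstractmethod
    def export_mesh(self, name, verts, faces): ...

    @abstractmethod
    def end_frame(self): ...


class DummyExporter(RenderExporter):
    """Test backend that just records what it was asked to export."""

    def __init__(self):
        self.events = []

    def begin_frame(self, frame):
        self.events.append(("begin", frame))

    def export_mesh(self, name, verts, faces):
        self.events.append(("mesh", name, len(verts), len(faces)))

    def end_frame(self):
        self.events.append(("end",))


exp = DummyExporter()
exp.begin_frame(1)
exp.export_mesh("Cube", [(0, 0, 0)] * 8, [(0, 1, 2, 3)] * 6)
exp.end_frame()
```

The design risk Jonathan identifies lives in the signatures: if `export_mesh` only ever hands over flat triangles, the subdiv/NURBS and motion-blur requirements above can never be met, no matter how many backends implement it.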
> Whew!  There's loads of stuff I haven't mentioned.  The above just  
> outlines the major problems.  Naturally, a large part of the problem  
> of exporting to a RenderMan renderer is very easy and poses no  
> problems at all. :-)
> Jonathan Merritt.
> _______________________________________________
> Bf-committers mailing list
> Bf-committers at projects.blender.org
> http://projects.blender.org/mailman/listinfo/bf-committers
Ton Roosendaal  Blender Foundation ton at blender.org  
