[Bf-committers] How about adding boost uBLAS librarytoblender/extern?

Benjamin Tolputt btolputt at bigpond.net.au
Tue Jan 27 03:13:09 CET 2009

Yves Poissant wrote:
> <preamble>
> *snip history*
> In 2009, things are changing. Single-core computers are on the way out and 
> multicore computers are in. It is now possible to do realtime, or very near 
> realtime, GI of very good quality.
> </preamble>
> Legacy renderers are outdated and need to be replaced by more modern 
> renderers. It is impossible to get good surface renders with legacy material 
> properties and single-sample illumination. It takes considerable time to 
> tweak legacy material properties to get a realistic render, and it takes even 
> more time to do that for animation, because legacy material property tweaks 
> are view dependent. The modern way to describe materials is based on the 
> BRDF concept. The shading algorithm needs to be totally GI integrated. GI 
> meaning Global Illumination, the whole rendering equation should be 
> integrated in one renderer instead of adding up the results of legacy CG 
> tricks. New renderers such as LuxRender, Indigo, YafAray, Maxwell, 
> FryRender, etc., and to some extent V-Ray and such, are all based on this 
> new and physically plausible (if not physically accurate) description of 
> materials and light. What is more annoying is that it is impossible to map 
> legacy material properties to physically plausible material properties.
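
[For context, and assuming the quote refers to the standard formulation from
Kajiya (1986), the rendering equation being discussed is commonly written as:

    L_o(x, w_o) = L_e(x, w_o) + \int_\Omega f_r(x, w_i, w_o) L_i(x, w_i) (w_i . n) dw_i

where L_o is outgoing radiance at point x in direction w_o, L_e is emitted
radiance, f_r is the BRDF mentioned above, L_i is incoming radiance from
direction w_i, and the integral runs over the hemisphere \Omega around the
surface normal n. A "totally GI integrated" renderer is one that estimates
this integral directly rather than approximating its terms with separate
tricks.]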

While I generally stay out of discussions like this, I think this is
something I can talk about without sounding like a complete dullard.

Simply put, while I like GI algorithms and new shading
frameworks/concepts, the "legacy" systems (as defined in your preamble) are
not as legacy and outdated as you make out. RenderMan and its clones are
all based around these "legacy" concepts and are pretty much the standard
against which GI frameworks/algorithms are tested... and then
generally integrated in some fashion.

That is also ignoring the "old school" (read: tried and tested) pipeline,
where various features/properties made accessible via the scanline
rendering frameworks are used in post-processing to make the
resulting images more appealing to the client/viewer. It's nice to
talk about the next generation of rendering algorithms and how Blender
can be a part of said revolution, but replacing the scanline/shader
approach is (in my opinion) madness.

I have many an axe to grind over some Blender decisions (most of which are
actually being addressed in the coming 2.5 revision), but focusing its
rendering architecture on the scanline/shader concept (like most
professional pipelines I have used) is the way to go.


Benjamin Tolputt
Analyst Programmer

