[Bf-committers] How about adding boost uBLAS library to blender/extern?

Yves Poissant ypoissant2 at videotron.ca
Wed Jan 28 04:54:39 CET 2009


From: "Brecht Van Lommel" <brecht at blender.org>
Sent: Tuesday, January 27, 2009 9:22 PM


> On Tue, 2009-01-27 at 19:16 -0500, Yves Poissant wrote:
>> At work, I implemented a production ray-tracer that can render complex
>> scenes, including indirect illumination, in less than 10 seconds per
>> 800x450 5x-AA frame on a single core, without even using SSE, the GPU,
>> or even Boost (or Eigen).
>
> I'm a bit skeptical about what you mean by 'complex' here :). Complex
> like architectural scenes with a few 100k polygons, or 25 million
> polygons as in a BBB scene? Never mind the complexity of a 4k picture
> with displacement mapping everywhere... The number of operations scales
> logarithmically, but memory access gets slower too, and it's even
> worse when the scene does not fit in main memory.

The scenes I render are interior designs with between 20k and 500k polys. 
The rooms are fully furnished, and all the furniture geometry is there, down 
to the internal construction pieces. I'm waiting for one of our customers to 
deploy the web service our renderer is to be used for, so I can give you a 
URL to see it. We released in December, so it won't be for a while though.

I agree that 25M polys is huge. Some people do render such huge scenes with 
ray-tracers. But personally, I believe that rendering high-poly scenes 
composed mainly of tiny polys, such as for hair and vegetation, should be 
handled the way Pixar does with their "Stochastic Simplification of 
Aggregate Detail". Hair and high particle counts have always been a problem, 
and I'm convinced they should be handled in a probabilistic way rather than 
an individualistic way.

Basically, I'm far from against using all the approximation, stochastic and 
probabilistic tricks available. In fact, the reason I can render my scenes 
in less than 10s is exactly because I use several approximation tricks. I 
don't pretend I do physically accurate renders. I don't even pretend I do 
GI, but I do indirect illumination and it is pretty good (and it is not 
screen space). But one thing I don't do is z-buffer shading. And my renderer 
is built on physically plausible material and light descriptions. This 
alone, together with the generalization of the BRDF concept, immensely 
simplified the whole shading code. Now I have a good basis for improving, 
further optimizing, and adding more advanced and physically plausible 
shading processing. And because I use the BRDF concept, I know that whatever 
shading algorithm I throw at the materials, I will always get consistent 
renders from one technique to another. And I know that I can evolve my 
renderer without fearing that the material descriptions will eventually fail 
to follow.
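Roughly, the BRDF abstraction I'm talking about looks like the sketch 
below (a simplified illustration, not my actual code; the interface, the 
Lambert example and shadeDirect are made up for the purpose). Every 
material answers one question, how much light from one direction scatters 
toward another, and every shading algorithm is written against that 
interface, which is why the results stay consistent across techniques:

#include <algorithm>

struct Vec3 { float x, y, z; };

static Vec3 operator*(const Vec3 &a, float s) { return {a.x*s, a.y*s, a.z*s}; }
static Vec3 mul(const Vec3 &a, const Vec3 &b) { return {a.x*b.x, a.y*b.y, a.z*b.z}; }
static float dot(const Vec3 &a, const Vec3 &b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Every material answers the same question: how much light arriving
// from wi leaves toward wo? Shaders only ever talk to this interface.
class BRDF {
public:
    virtual ~BRDF() {}
    virtual Vec3 eval(const Vec3 &wi, const Vec3 &wo, const Vec3 &n) const = 0;
};

// Simplest physically plausible material: ideal diffuse (Lambert).
class Lambert : public BRDF {
    Vec3 albedo;
public:
    explicit Lambert(const Vec3 &a) : albedo(a) {}
    Vec3 eval(const Vec3 &, const Vec3 &, const Vec3 &) const override {
        return albedo * (1.0f / 3.14159265f);  // rho / pi
    }
};

// Any shading technique written against the interface -- here a single
// direct light, but a path tracer would make the same eval() call --
// sees every material the same way, so renders stay consistent from
// one technique to another.
Vec3 shadeDirect(const BRDF &brdf, const Vec3 &wi, const Vec3 &wo,
                 const Vec3 &n, const Vec3 &lightRadiance)
{
    float cosTheta = std::max(0.0f, dot(n, wi));
    return mul(brdf.eval(wi, wo, n), lightRadiance) * cosTheta;
}

Swap Lambert for any other physically plausible BRDF and neither 
shadeDirect nor a Monte Carlo integrator built on the same interface needs 
to change.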

Yves 


