[Bf-committers] From Farsthary another announcement

Yves Poissant ypoissant2 at videotron.ca
Thu Feb 5 01:26:01 CET 2009


Hey Brecht

From: "Brecht Van Lommel" <brecht at blender.org>
Sent: Wednesday, February 04, 2009 10:22 AM


> I understand not all BRDFs can be split into these things. If such
> BRDFs get implemented then you do indeed get into trouble (you could
> categorize such a BRDF as being purely specular for example but then not
> all algorithms will deal well with them). However it is possible to add
> these "unphysical" parts together. That doesn't give you a physically
> correct BRDF but that's not the point, you can treat it as if it was
> one.

I'm mainly concerned with making sure we don't put in all this effort only 
to end up painting the renderer into a corner.

- Some empirical BRDF models can be broken into diffuse, specular and mirror 
components, but most cannot. This is especially true of the measured BRDFs 
that one day you will be able to download from the web.

- Path tracing, MLT, Photon Mapping, and the like rely on being able to 
sample random rays from a BRDF (a distribution function). If the BRDF API 
(so to speak) is designed to sample the three components separately, 
correctly combining the probabilities becomes harder and more error prone. 
You will also have a much harder time integrating non-separable BRDFs into 
such a pipeline.

- BRDFs should really be thought of as black boxes that do their thing: 
compute and provide directional or hemispherical reflectance and/or 
transmittance, or directions and weights for sampling rays (see the sketch 
after this list).
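
To make that black-box idea concrete, here is roughly the shape of the 
interface I have in mind. This is only a sketch; all the names (Spectrum, 
Vec3, brdf_eval, brdf_sample, brdf_pdf) are made up, not an actual Blender 
API. The point is that the integrator never needs to know how the BRDF is 
built internally:

  /* Hypothetical black-box BRDF interface; the names are made up
   * and only illustrate the shape of the API. */

  typedef struct { float r, g, b; } Spectrum;
  typedef struct { float x, y, z; } Vec3;
  typedef struct BRDF BRDF;  /* opaque: any components stay inside */

  /* Evaluate f_r(wi, wo) for a given pair of directions. */
  Spectrum brdf_eval(const BRDF *brdf, Vec3 wi, Vec3 wo);

  /* Importance-sample an incoming direction for a given outgoing
   * one.  Writes the sampled direction and its pdf, and returns the
   * weight f_r(wi, wo) * cos(theta_i) / pdf.  How lobes are mixed
   * and how their probabilities are combined is entirely the BRDF's
   * business. */
  Spectrum brdf_sample(const BRDF *brdf, Vec3 wo,
                       float u1, float u2,  /* uniform randoms */
                       Vec3 *wi, float *pdf);

  /* Pdf of sampling wi given wo, needed to combine BRDF sampling
   * with light sampling (multiple importance sampling). */
  float brdf_pdf(const BRDF *brdf, Vec3 wi, Vec3 wo);

With an interface like this, a measured BRDF and an analytic one are 
interchangeable as far as the path tracer is concerned.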

> Also it is useful to path trace only parts of the full BRDF, since
> photon maps only simulate parts, you want to be able to compare if just
> that part is implemented correctly.

Yes, I see what you mean here. However, in this context, my suggestion would 
be to have the BRDF itself provide such services. After all, the BRDF is the 
best place to put the specifics of how to expose only its diffuse part for a 
particular application. Although it is generally said that the shooting 
phase of photon mapping should only process the purely diffuse component, my 
experience is that practice is not so orthodox. Of course specular 
reflections are harder to process, but where and how do you draw the line 
between purely diffuse, partially diffuse, and somewhat glossy?
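
For instance, and this is again only a sketch with made-up names, the 
sampling entry point could take a lobe mask so the photon shooting pass can 
ask the BRDF for its diffuse behaviour, and the BRDF itself decides what 
"diffuse" means for its own model:

  typedef enum {
      BRDF_LOBE_DIFFUSE  = 1 << 0,
      BRDF_LOBE_GLOSSY   = 1 << 1,
      BRDF_LOBE_SPECULAR = 1 << 2,
      BRDF_LOBE_ALL      = 0x7,
  } BRDFLobeMask;

  /* Same contract as brdf_sample() above but restricted to the
   * requested lobes.  A measured, non-separable BRDF can return a
   * zero weight for lobes it cannot isolate. */
  Spectrum brdf_sample_lobes(const BRDF *brdf, Vec3 wo,
                             BRDFLobeMask lobes,
                             float u1, float u2,
                             Vec3 *wi, float *pdf);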

That said, the simpler and more reliable way to test whether the diffuse 
reflections of two renderers are similar is to set all the surfaces in the 
scene to perfectly rough. If you try to test this by tweaking the BRDF 
components individually, you will only get yourself into more trouble. It is 
better to just set every BRDF's roughness to 100% for the duration of the 
test.

> Many algorithms rely on splitting up the rendering equation and that's
> not going to disappear anytime soon for a lot of applications. For
> example photon mapping is often split up into emit, standard direct
> diffuse and specular with lights, two photon maps for indirect diffuse
> and caustics, and raytracing for indirect specular. All these things
> added together can still give you the complete rendering equation.

I think of Photon Mapping as a form of bidirectional path tracing. In the 
first phase, you trace from the lights into the scene, using the material 
BRDFs to bounce rays around, and store irradiances. The caustics map does 
the same thing but concentrates on transmissive materials (BTDFs), simply 
because that is more efficient: caustics require many more photons than 
indirect illumination. The second phase traces from the camera into the 
scene, here again using the material BRDFs, either to sample for final 
gathering or to read the photon map directly for caustics. The raytracing 
part is not inconsistent with any of this and can also use the BRDFs for 
soft reflections.
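
To show how little the shooting phase needs to know about the material, 
here is a rough sketch built on the black-box interface from earlier in 
this mail. Everything here (Scene, PhotonMap, the helpers) is hypothetical, 
not existing code:

  typedef struct { Vec3 origin, dir; } Ray;
  typedef struct { Ray ray; Spectrum power; } Photon;
  typedef struct { Vec3 position; Vec3 wo; const BRDF *brdf; } Hit;
  typedef struct Scene Scene;          /* opaque, made up */
  typedef struct PhotonMap PhotonMap;  /* opaque, made up */

  /* Hypothetical helpers: intersection, storage, luminance and a
   * uniform random number in [0, 1). */
  int   intersect(const Scene *scene, Ray ray, Hit *hit);
  void  store_photon(PhotonMap *map, Vec3 pos, Spectrum power);
  float luminance(Spectrum s);
  float rand01(void);

  #define MAX_PHOTON_DEPTH 8

  void shoot_photon(const Scene *scene, PhotonMap *map,
                    Photon p, int depth)
  {
      Hit hit;
      if (depth > MAX_PHOTON_DEPTH || !intersect(scene, p.ray, &hit))
          return;

      /* Store indirect-illumination photons at the hit point. */
      if (depth > 0)
          store_photon(map, hit.position, p.power);

      /* Ask the BRDF for the next bounce; the returned weight
       * already folds in the pdf. */
      Vec3 wi;
      float pdf;
      Spectrum w = brdf_sample(hit.brdf, hit.wo,
                               rand01(), rand01(), &wi, &pdf);

      /* Russian roulette keeps photon power roughly constant. */
      float survive = luminance(w);
      if (survive > 1.0f) survive = 1.0f;
      if (rand01() < survive) {
          p.ray.origin = hit.position;
          p.ray.dir = wi;
          p.power.r *= w.r / survive;
          p.power.g *= w.g / survive;
          p.power.b *= w.b / survive;
          shoot_photon(scene, map, p, depth + 1);
      }
  }

The caustics pass would be the same loop, just shooting more photons and 
storing only paths that went through specular or transmissive bounces.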

> Yes, it's not as clean as if you would not split things but in my
> opinion limiting Blender's rendering engine to physically based BRDFs
> also throws a lot of efficiency out of the window with it.

The way I see it, it might be necessary to keep the current render engine 
and build a new one that lives alongside it and can be switched by the user. 
If the new renderer is enabled, the material specifications would be 
compatible with the BRDF concept; otherwise, the current set of material 
parameters would still be available with all the current shaders. It is 
possible to translate most BRDFs into legacy CG parameters, but it would not 
be reliable to translate the old legacy CG parameters into physically 
plausible BRDFs. As anyone who has tried to translate Blender scenes to 
VRay, Indigo, and the like knows, that is life in the world of physically 
plausible renderers.
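
To illustrate why the translation only works one way, here is a 
deliberately crude sketch. The fields and the mapping are made up, but the 
asymmetry is the point: the physical description carries enough information 
to derive legacy parameters, while the legacy parameters are not energy 
conserving, so any mapping back is a guess:

  typedef struct {
      Spectrum albedo;   /* diffuse reflectance, energy conserving */
      float roughness;   /* microfacet roughness in [0, 1] */
  } PhysicalMaterial;

  typedef struct {
      Spectrum diffuse_color;
      float spec_intensity;
      float spec_hardness;
  } LegacyMaterial;

  /* Physical -> legacy: a usable approximation exists. */
  LegacyMaterial to_legacy(const PhysicalMaterial *m)
  {
      LegacyMaterial out;
      out.diffuse_color = m->albedo;
      out.spec_intensity = 1.0f - m->roughness;
      /* Crude: sharper highlights for smoother surfaces. */
      out.spec_hardness = 2.0f / (m->roughness * m->roughness + 1e-4f);
      return out;
  }

  /* Legacy -> physical has no reliable inverse: nothing constrains
   * diffuse_color + spec_intensity to conserve energy, so a
   * physically plausible BRDF can only be guessed from them. */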

Yves 


