[Bf-committers] From Farsthary another anouncement

Brecht Van Lommel brecht at blender.org
Wed Feb 4 16:22:41 CET 2009


Hi Yves,

On Wed, 2009-02-04 at 09:07 -0500, Yves Poissant wrote:
> That would be a very difficult requirement to comply with, and if done, it 
> would go against the fundamental assumptions of the rendering equation.
> 
> The mentioned "rendering equation" is not of the typical legacy form
> I = Ia*ka + Id*kd + Is*ks + Ir*kr + It*kt
> that we see in several old textbooks. It is the Kajiya rendering equation 
> which is the integral of the BRDF over the hemisphere above the surface 
> normal. The diffuse and specular components are visual characteristics that 
> are the intrinsic result of evaluating the BRDF in the rendering equation. 
> While some BRDF models, such as the already old Schlick BRDF, do separate 
> the diffuse, specular, and reflection components, this is done inside the 
> BRDF black box, and those components are not available outside the BRDF 
> black box. Most BRDF models do not make any distinction between those 
> different visual characteristics of reflection that we call diffuse, 
> specular reflectivity, and ambience. A BRDF is just a pattern of reflection 
> distribution over the hemisphere. Humans may interpret the resulting surface 
> render as more or less diffuse, more or less specular, etc., but there is 
> nothing in the BRDF that explicitly controls each of those visual 
> characteristics the way we do with the coefficient sliders found in legacy 
> material specifications.
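
For reference, the "integral of the BRDF over the hemisphere above the
surface normal" mentioned above is the Kajiya rendering equation; a standard
formulation (my notation, not quoted from the message) is:

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\,
    (\omega_i \cdot n)\, d\omega_i
```

Here $f_r$ is the BRDF, $L_i$ the incoming radiance, $L_e$ the emitted
radiance, and $\Omega$ the hemisphere above the normal $n$.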

> The two requirements:
> a) Implement a simple but physically correct path tracer.
> b) Separate the diffuse, specular, ambience, reflectivity, and 
> transmissivity components
> 
> To me, these seem contradictory and impossible to meet.

I understand that not all BRDFs can be split into these things. If such
BRDFs get implemented then you do indeed get into trouble (you could
categorize such a BRDF as being purely specular, for example, but then not
all algorithms will deal well with it). However, it is possible to add
these "unphysical" parts together. That doesn't give you a physically
correct BRDF, but that's not the point: you can treat it as if it were
one.
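
The idea of adding the parts back together can be sketched like this (a
minimal illustration, not Blender code; the lobes and the kd/ks weights are
hypothetical stand-ins for the separate material components):

```python
import math

def lambert_lobe():
    """Diffuse (Lambertian) lobe: constant over the hemisphere."""
    return 1.0 / math.pi

def phong_lobe(r_dot_wo, exponent=32):
    """A simple glossy lobe (normalized Phong), standing in for 'specular'."""
    return (exponent + 2) / (2 * math.pi) * max(r_dot_wo, 0.0) ** exponent

def combined_brdf(r_dot_wo, kd=0.7, ks=0.3):
    """Sum of separately controllable lobes, evaluated as one BRDF.

    The sum is not guaranteed to be physically correct (e.g. energy
    conserving for every kd/ks choice), but a path tracer can evaluate
    and sample it exactly as if it were a single BRDF.
    """
    return kd * lambert_lobe() + ks * phong_lobe(r_dot_wo)
```

The point is only that each "unphysical" slider keeps its meaning while the
renderer sees a single function to integrate.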

Also, it is useful to path trace only part of the full BRDF: since photon
maps simulate only parts of it, you want to be able to check whether just
that part is implemented correctly.

Many algorithms rely on splitting up the rendering equation and that's
not going to disappear anytime soon for a lot of applications. For
example photon mapping is often split up into emit, standard direct
diffuse and specular with lights, two photon maps for indirect diffuse
and caustics, and raytracing for indirect specular. All these things
added together can still give you the complete rendering equation.
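
Schematically, the split described above looks like this (hypothetical
component names and values; the only point is that the separately computed
parts sum to the complete solution):

```python
def full_radiance(components):
    """Photon-mapping style decomposition: each term is estimated by a
    different technique, and their sum approximates the complete
    rendering equation."""
    return (components["emit"]
            + components["direct_diffuse"]      # standard direct lighting
            + components["direct_specular"]     # standard direct lighting
            + components["indirect_diffuse"]    # global photon map
            + components["caustics"]            # caustics photon map
            + components["indirect_specular"])  # raytracing

# Example (made-up) per-pixel contributions:
parts = {"emit": 0.0, "direct_diffuse": 0.4, "direct_specular": 0.1,
         "indirect_diffuse": 0.15, "caustics": 0.05,
         "indirect_specular": 0.1}
```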

Yes, it's not as clean as not splitting things at all, but in my opinion
limiting Blender's rendering engine to physically based BRDFs would also
throw a lot of efficiency out the window.

Brecht.



