[Bf-committers] From Farsthary another anouncement
Brecht Van Lommel
brecht at blender.org
Thu Feb 5 08:03:09 CET 2009
I agree with most of what has been said here, including making
different material types for ease of use, so I'll just reply to the
things I disagree with :).
Yves Poissant wrote:
>> I understand not all BRDF's can be split into these things. If such
>> BRDF's get implemented then you do indeed get into trouble (you could
>> categorize such a BRDF as being purely specular for example but then not
>> all algorithms will deal well with them). However it is possible to add
>> these "unphysical" parts together. That doesn't give you a physically
>> correct BRDF but that's not the point, you can treat it as if it were one.
> I'm mainly concerned about making sure we don't put in all this effort and
> end up painting the renderer into a corner.
> - Some empirical BRDF models can be broken into diffuse, specular and mirror
> components. But most can't. Especially the measured BRDFs that one day you
> will be able to download from the web.
> - Path tracing, MLT, Photon Mapping, and such rely on the possibility to get
> random rays from a BRDF (Distribution Function). If the BRDF API (so to
> speak) is designed to sample the 3 different components separately, you make
> it harder to correctly combine the probabilities and more error prone. Also,
> you will have a much harder time integrating non-separable BRDFs into such a
> design.
> - BRDFs should really be thought as black boxes that do their thing, that is
> compute and provide directional or hemispherical reflectance and/or
> transmittance or direction and weights for sampling rays.
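The black-box interface described above could be sketched roughly as follows. This is a minimal Python illustration, not Blender code; the class names, the Lambertian lobe, and the mixture weights are all assumptions made for the example. The key point is the last one Yves raises: when sampling picks one component, the reported pdf must be the pdf of the whole mixture, otherwise the probabilities are combined incorrectly.

```python
import math
import random

class LambertDiffuse:
    """Illustrative Lambertian lobe: constant BRDF value,
    cosine-weighted sampling. Directions are in a local frame
    with the surface normal along +z."""
    def __init__(self, albedo):
        self.albedo = albedo

    def eval(self, wo, wi):
        # f = albedo / pi, independent of directions
        return self.albedo / math.pi

    def pdf(self, wo, wi):
        # cosine-weighted hemisphere pdf: cos(theta) / pi
        return max(wi[2], 0.0) / math.pi

    def sample(self, wo):
        # cosine-weighted direction from two uniform random numbers
        u1, u2 = random.random(), random.random()
        r = math.sqrt(u1)
        phi = 2.0 * math.pi * u2
        wi = (r * math.cos(phi), r * math.sin(phi),
              math.sqrt(max(1.0 - u1, 0.0)))
        return wi, self.pdf(wo, wi)

class MixtureBRDF:
    """Black box over several lobes: eval() sums the components;
    sample() picks one lobe, but reports the pdf of the full
    mixture so the estimator stays consistent."""
    def __init__(self, lobes, weights):
        total = sum(weights)
        self.lobes = lobes
        self.weights = [w / total for w in weights]

    def eval(self, wo, wi):
        return sum(lobe.eval(wo, wi) for lobe in self.lobes)

    def pdf(self, wo, wi):
        return sum(w * lobe.pdf(wo, wi)
                   for w, lobe in zip(self.weights, self.lobes))

    def sample(self, wo):
        # pick a lobe proportional to its weight...
        u = random.random()
        acc = 0.0
        for w, lobe in zip(self.weights, self.lobes):
            acc += w
            if u < acc:
                wi, _ = lobe.sample(wo)
                # ...but return the mixture pdf, not the lobe's own pdf
                return wi, self.pdf(wo, wi)
        wi, _ = self.lobes[-1].sample(wo)
        return wi, self.pdf(wo, wi)
```

A non-separable measured BRDF would implement the same eval/pdf/sample interface directly, which is why the black-box view generalizes where the split one does not.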
Mainly my point here was that if you are implementing photon mapping
or something similar in Blender with the existing shaders, it is easy
to make a black box out of them for path tracing. If you don't bother
with things like importance sampling, getting the full BRDF value is
really just a matter of adding the results from the individual
components together, as far as I understand. I'm not saying that in a
physically based renderer this will give great results, but it should
be the same as when you use those functions in another algorithm if
that algorithm is implemented correctly, so it's a useful sanity check.
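Adding the individual results together can be sketched like this. A hedged Python illustration with made-up lobe functions (a Lambertian diffuse term plus a normalized Phong-style specular term), not Blender's actual shader code; the full black-box value is simply the sum of the separately implemented lobes.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert_diffuse(kd, wi):
    """Diffuse lobe: kd / pi, constant over the hemisphere."""
    return kd / math.pi

def phong_specular(ks, exponent, n, wo, wi):
    """Normalized Phong-style specular lobe around the mirror
    direction of wo about the normal n."""
    d = dot(wo, n)
    r = tuple(2.0 * d * nc - wc for nc, wc in zip(n, wo))
    cos_a = max(dot(r, wi), 0.0)
    return ks * (exponent + 2.0) / (2.0 * math.pi) * cos_a ** exponent

def full_brdf(kd, ks, exponent, n, wo, wi):
    # the "black box" value is just the sum of the component lobes
    return lambert_diffuse(kd, wi) + phong_specular(ks, exponent, n, wo, wi)
```

If a path tracer using full_brdf and a splitting algorithm using the two lobes separately disagree on the same scene, one of the implementations has a bug, which is the sanity check mentioned above.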
If you're implementing algorithms like MLT then I agree it would be
good to look at adding physical BRDF's along with it.
>> Yes, it's not as clean as if you would not split things but in my
>> opinion limiting Blender's rendering engine to physically based BRDF's
>> also throws a lot of efficiency out of the window with it.
> The way I see it, it might be necessary to keep the current render engine
> and build a new one that would live along the current one and be switchable
> by the user. If the new renderer is enabled, then the materials
> specifications would be compatible with the BRDF concept. Otherwise, the
> current set of material parameters would still be available with all the
> current set of shaders. It is possible to translate most BRDF into legacy CG
> parameters but it would not be reliable to try to translate the old legacy
> CG parameters into physically plausible BRDFs. But as anyone who have tried
> to translate Blender scenes to VRay, Indigo, and such knows, that is life in
> the physically plausible renderers world.
Heh, the thing I thought might be difficult is turning the physical
ones into legacy ones, so they would still give reasonable results
when using algorithms that rely on splitting things up (for example
irradiance caching, SSS, baking GI for games, ...). I have never tried
this though, so I wouldn't know.
The other way around seems quite doable to me _if_ you don't expect
those things to give physically correct results, which I think is
acceptable.
Further, I think that BRDF's and cousins are only a part of the full
render engine so I would not call this a "new render engine". I think
the existing ones in Blender could be put into black boxes without too
much trouble, after which the system can be extended.