[Bf-committers] From Farsthary another anouncement

joe joeedh at gmail.com
Tue Feb 10 08:10:38 CET 2009


On Mon, Feb 9, 2009 at 5:43 PM, Yves Poissant <ypoissant2 at videotron.ca> wrote:
> From: "joe" <joeedh at gmail.com>
> Sent: Friday, February 06, 2009 3:05 PM
>
>
>> Except I thought those renderers don't allow custom shaders,
>
> They do. But they would not be called shaders, I guess. Rather, they would
> be called custom BRDFs.
>
>> while we
>> obviously have access to the legacy shading code in Blender.  It seems
>> that if you hacked together BRDF black boxes like Brecht said, you'd at
>> least get a better result than the sort of conversions done for those
>> renderers.
>
> Of course, you can work on doing that blackboxing of legacy shaders into
> BRDF black boxes. For some shaders, the way of doing that is already known
> and documented in papers. Some are actually quite trivial. For others,
> someone would have to hand-derive the distribution functions for numerical
> integration and the sample-distribution warping function for importance
> sampling. The mathematical technique for doing that is also documented in a
> Graphics Gems article (I don't have them with me so I cannot point to the
> specific one, but the word "warping" is in the article title). There are
> also other techniques. IMO, it is not worth the time because they will
> all give more or less the same material look anyway, but one could do it.
> If someone can do that successfully, then, indeed, you wouldn't need to
> convert the shader parameters to BRDF equivalents.

This answers a question in another email I just sent, cool (should've
read this one first).
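
To make the warping part concrete: for a plain diffuse legacy shader, the
warp you'd hand-derive is just the standard cosine-weighted hemisphere
mapping. Something like this rough C sketch (made-up names); the fancier
legacy shaders would each need their own derived warp and matching pdf:

    #include <math.h>

    typedef struct { float x, y, z; } Vec3;

    /* Warp two uniform random numbers (u1, u2) in [0,1) into a direction
     * around the local +Z normal, distributed proportionally to cos(theta).
     * The pdf (cos(theta)/pi) is returned too, since the integrator needs
     * it to weight the sample. */
    static Vec3 sample_cosine_hemisphere(float u1, float u2, float *pdf)
    {
        float r = sqrtf(u1);                 /* radius on the unit disk */
        float phi = 2.0f * (float)M_PI * u2;
        Vec3 d;
        d.x = r * cosf(phi);
        d.y = r * sinf(phi);
        d.z = sqrtf(1.0f - u1);              /* lift disk point onto hemisphere */
        *pdf = d.z / (float)M_PI;            /* pdf = cos(theta) / pi */
        return d;
    }

The hand derivation Yves describes is coming up with the equivalent of this
(plus the pdf) for each legacy shader's highlight shape.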

>
>> Ah so your saying a shading language would be used to define BRDF's?
>
> That is quite possible if the shading language API provides all the hooks
> and power for doing that.
>
>> What sort of constraints would it have?
>
> A BRDF must provide many more services than a shader. A shader basically
> receives a couple of vectors and returns a color. A BRDF must be able to do
> that too. But it must also be able to provide the services outlined by Raul
> in his latest post here under the heading "The main functions are". Those
> other services are the ones that are going to be hard to implement in the
> black box wrappers discussed above.
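
Just to sketch the shape of those extra services (made-up names in C; the
actual list is whatever Raul posted):

    typedef struct { float x, y, z; } Vec3;
    typedef struct { float r, g, b; } Color;

    typedef struct BRDF {
        /* What a legacy shader already does: turn a pair of directions
         * at a shading point into a color. */
        Color (*eval)(const struct BRDF *self, Vec3 wi, Vec3 wo, Vec3 n);

        /* The extra services an integrator needs: pick an outgoing
         * direction by importance sampling, ... */
        Vec3 (*sample)(const struct BRDF *self, Vec3 wi, Vec3 n,
                       float u1, float u2, float *pdf);
        /* ... and report the pdf of any given direction under that
         * sampling scheme, so different techniques can be combined. */
        float (*pdf)(const struct BRDF *self, Vec3 wi, Vec3 wo, Vec3 n);

        void *params;  /* per-material parameters (color, roughness, ...) */
    } BRDF;

It's the sample() and pdf() pair that the black-box wrappers would struggle
with, since the legacy shaders were never written with those in mind.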
>
>> I'm just really curious, since
>> I think a shading language would be an important part of a redesigned
>> shading pipeline (at the very least, nodes would compile to it).
>
> The "shading pipeline" is a first generation renderer paradigm. If you try
> to fit a physically based renderer into that strict paradigm, you will
> quickly bump into problems.

The problem is you need a fairly flexible system.  There has to be
some way to do everything from physically-correct shaders to shaders
that ignore incoming light entirely.

It seems to me that you need flexibility in a shading language: the
ability to go outside the physical and mathematical bounds of a BRDF.
People don't always want a physically-plausible result.  So I'm
thinking there could be two different types of shaders: BRDFs, and a
more flexible kind.  The more flexible kind wouldn't be meant for GI
at all (which, after all, is meant to produce physically-correct
scenes); we could either try to hack the flexible kind into GI (even
if it sucked), or simply not allow those shaders in it (until better
research on how to do that eventually comes out).
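
Roughly what I have in mind (pure sketch, invented names):

    typedef enum {
        SHADER_BRDF,      /* physically plausible, safe to use in GI */
        SHADER_FLEXIBLE   /* may ignore incoming light: ramps, toon, ... */
    } ShaderKind;

    typedef struct {
        ShaderKind kind;
        void *impl;       /* BRDF closure or free-form shader callback */
    } ShaderSlot;

    /* The GI integrator checks this before bouncing light off a surface;
     * flexible shaders are either rejected or handled by some crude
     * fallback, whichever of the two options above we pick. */
    int shader_usable_for_gi(const ShaderSlot *s)
    {
        return s->kind == SHADER_BRDF;
    }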

For that matter, it may be a good idea to support a
non-physically-based GI algorithm in addition to the physical one,
perhaps just simple color bleeding.  This could probably even be done
in a way that always gives smooth results (like approximate AO), which
would be useful for animations (I bet Pixar's GI is like this; I
vaguely remember their paper wasn't the most physically-correct
algorithm in the world, so I should look it up again).
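
By simple color bleeding I mean nothing fancier than a single,
non-recursive gather, roughly like the sketch below (trace() and the
hemisphere helper are stand-ins for whatever the renderer provides, and
I'm not claiming this is what Pixar actually does):

    typedef struct { float x, y, z; } Vec3;
    typedef struct { float r, g, b; } Color;
    typedef struct { int hit; Color albedo; Color direct; } Hit;

    /* Stand-ins for renderer services. */
    Hit  trace(Vec3 origin, Vec3 dir);
    Vec3 random_hemisphere_dir(Vec3 normal);

    /* One bounce only: average the lit diffuse color of whatever is
     * visible over the hemisphere above the shading point.  Smooth,
     * cheap, and not physically correct. */
    Color gather_color_bleed(Vec3 p, Vec3 n, int nsamples)
    {
        Color sum = {0.0f, 0.0f, 0.0f};
        for (int i = 0; i < nsamples; i++) {
            Hit h = trace(p, random_hemisphere_dir(n));
            if (h.hit) {
                sum.r += h.albedo.r * h.direct.r;
                sum.g += h.albedo.g * h.direct.g;
                sum.b += h.albedo.b * h.direct.b;
            }
        }
        sum.r /= nsamples;
        sum.g /= nsamples;
        sum.b /= nsamples;
        return sum;
    }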

Oftentimes, artists prefer solutions that aren't as physically
correct but are easier to use, for various reasons (speed, smoothness
of result, more tweakability, a better match for a cartoony look, etc.).
Big Buck Bunny, for example, used approximate AO (I can't remember the
non-Blender term for this for the life of me; it was in GPU Gems 1,
one of the sample chapters I think), because it produced a smooth
result, which is necessary for animation.

Joe

