[Bf-committers] From Farsthary another anouncement
ypoissant2 at videotron.ca
Wed Feb 11 02:27:39 CET 2009
From: "joe" <joeedh at gmail.com>
Sent: Tuesday, February 10, 2009 2:10 AM
>> The "shading pipeline" is a first generation renderer paradigm. If you
>> try to fit a physically based renderer into that strict paradigm, you
>> will quickly bump into problems.
> The problem is you need a fairly flexible system. There has to be
> some way to do everything from physically-correct shaders to shaders
> that ignore incoming light entirely.
Personally, I don't think trying to fit every rendering approach into one
single model will work. It would require a lot of work and in the end would
be too clumsy and confusing. Flexible, yes. But not to the point that it
includes everything and becomes a mess to navigate.
> It seems to me that you need flexibility in a shading language, the
> ability to go outside the physical and mathematical bounds of a BRDF.
> People don't always want a physically-plausible result. So I'm
> thinking there could be two different types of shaders: BRDF's, and a
> more flexible kind. The more flexible kind won't be meant for GI at
> all (which after all is meant to produce physically-correct scenes);
> we could try to hack the flexible kind to work (even if it sucked)
> into GI, or we could not allow using those shaders in it (and better
> research in how to do that eventually comes out).
That is why I think a better approach, and one easier to design and code, is
to have different rendering systems. For instance, I don't see how one could
possibly put a shader designed for toon shading into a physically plausible
renderer and expect good results. So why give the end user the possibility
of doing that? Put another way, why give the end user the possibility to
make something fundamentally broken and then leave them to figure out why it
went so wrong?
> For that matter, it may be a good idea to support a
> non-physically-based GI algorithm in addition to the physical one,
> perhaps just simple color bleeding.
Yup. That is what I think too. IMO, the solution is to have three rendering
*environments*: a legacy render environment, a toon (or NPR) render
environment, and a physical render environment. When end users work in the
physical render environment, they set up their lights and materials using
physical descriptions. When working in the NPR environment, all NPR lighting
and material features are available, and so on. This would make selecting
materials much simpler and much less confusing, and would prevent mixing
setups that are not designed to work properly together anyway.
That said, there are already several people working on render engines of all
kinds, from NPR to OpenGL emulation to physically plausible to unbiased
renderers. So why replicate all this effort in Blender? In the end, if
Blender had a complete API for plugging in those external renderers, then
development effort could be spent on aspects for which there is little
outside effort, such as the whole animation support system.
> This probably could even be done
> in a way to always get smooth results (like approximate AO), which
> would be useful for animations
The way I see it, there will always be people wanting to experiment with
rendering engines. The legacy renderer provides just that: a lot of knobs
that can be tweaked, and enough rope to hang oneself.
> Often times, artists prefer solutions that aren't as physically
> correct, but are easier to use, for various reasons (speed, smoothness
> of result, more tweak ability, better matches a cartoony look, etc).
> Big Buck Bunny, for example, used approximate AO (I can't remember the
> non-blender term for this for the life of me, it was in gpu gems 1,
> one of the sample chapters I think), because it produced a smooth
> result, necessary for animation.
There has always been a tradeoff between the time artists spend setting up
materials and lights (as opposed to more fun stuff like animating, posing,
or creatively placing lights in a scene) and render time.
A lot of users just want a render. Period. They are not at a point where
they can appreciate realism and its subtleties. They would like to render a
full movie in an evening. As long as it moves and there are lights and
shadows, all is fine.
As users get more experienced, they want good-looking lights and materials.
And this requires a lot of time spent setting up the myriad of lights and
dialing in the materials. And once this is done, continuously returning to
it, tweaking the materials to correct idiosyncratic behavior when the
lighting environment changes. Unfortunately, this can suck up a lot of time.
Time that would be better spent on truly creative work like posing the
characters, choosing dramatic light placements, finding the proper camera
POV, animating, facial expressions, etc.
We are getting to a point where CPU time is cheaper than human time for that
sort of chore. Instead of spending days fine-tuning the multitude of
material parameters, just use some physically designed presets, tweak a
little according to some well-defined and easy-to-understand physical rules,
and let the computer spend the time rendering that stuff. And then, while
the computer renders a sequence, the artist can enjoy tweaking the
subtleties of a character's expression, gesture, and timing.
Personally, I find the artistic race for the photorealism grail a little
futile because at some point, when all models and renders look photoreal,
what is left to distinguish one artist's style from another's? And frankly,
I'm kind of tired of seeing arch-vis after arch-vis of photoreal interior
designs done with VRay and the like. It really gets boring. What I really
appreciate, though, are artists who have developed a very personal style.
But there is no denying that photorealism is everywhere and unavoidable.
There is no denying that the push toward photorealism will continue and that
every 3D application out there ought to support that rendering paradigm.
I don't believe, though, that physically based material definition goes
against the grain of artistic style development. I do not equate physically
based materials and lights with photorealism. They just don't equate.
Physically based material definition already allows for a lot of freedom and
creativity in designing new materials. But the style is mainly in the
design: the character designs, their environment design, the situations they
have to deal with, their reactions, etc.
Physically based material definition makes the artist's life easier because
the materials respond to well-known physical laws. We already know,
intuitively, what to expect from this or that material. Our problem, right
now, is that the legacy CG material model is completely ad hoc, and artists
have to learn a new, unintuitive mental model of its behavior. A mental
model that may break down as soon as several features are brought together.
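To make the contrast concrete, here is a small sketch (plain Python, my own
names, not Blender code) that integrates reflected energy over the
hemisphere for a physically plausible Lambertian BRDF and for an ad-hoc
Phong-style highlight whose intensity and hardness are free knobs. The
physical model cannot reflect more energy than it receives; the legacy-style
one easily can, which is exactly the kind of idiosyncratic behavior that
surfaces when the lighting changes.

```python
import math

def lambert_brdf(albedo):
    """Physically plausible diffuse BRDF: albedo / pi.
    Energy conservation requires albedo <= 1, so total reflected
    energy never exceeds the incident energy."""
    return albedo / math.pi

def legacy_specular(intensity, hardness, cos_angle):
    """Ad-hoc Phong-style highlight as in legacy shading models.
    'intensity' and 'hardness' are free knobs with no physical
    meaning, so nothing bounds the reflected energy."""
    return intensity * (cos_angle ** hardness)

def reflected_energy(brdf_fn, steps=1000):
    """Crude midpoint-rule integral of brdf * cos(theta) over the
    hemisphere, for light arriving along the surface normal."""
    total = 0.0
    d_theta = (math.pi / 2) / steps
    for i in range(steps):
        theta = (i + 0.5) * d_theta
        total += brdf_fn(theta) * math.cos(theta) * math.sin(theta)
    return total * d_theta * 2 * math.pi  # integrate phi analytically

diffuse = reflected_energy(lambda t: lambert_brdf(0.8))
phong = reflected_energy(lambda t: legacy_specular(2.0, 5, math.cos(t)))

print(f"Lambert reflects {diffuse:.2f} of incident energy")  # bounded by 1
print(f"Ad-hoc Phong reflects {phong:.2f}")                  # can exceed 1
```

With these (arbitrary) knob settings the Phong surface reflects more energy
than arrives, which no real material does; that is what "physically
plausible" rules out by construction.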
I went a little too deep into the technical side of this BRDF-vs-legacy
discussion. I'm not even sure there are plans to refactor the renderer. I
mainly wanted to draw the developers' attention to the field of physically
based rendering. It is an important ongoing development and should not be
overlooked. And even if there are no plans (or nobody available) for
refactoring the renderer, it may be a good idea to be aware of this field
and keep it in mind when implementing new features related to materials and
lights.