[Bf-committers] Shading System Proposals

Brecht Van Lommel brecht at blender.org
Tue Dec 1 19:13:33 CET 2009


Hi,

On Tue, Dec 1, 2009 at 5:55 PM, Raul Fernandez Hernandez
<raulf at info.upr.edu.cu> wrote:
>  +1 from me. My early experiments in unbiasing Render Internal with
> path tracing, and also final gathering, clearly show the limitations
> of the current shading system, which prevents advanced rendering
> techniques and scalability without a bunch of hacks.
>  With the new proposal everything will be relatively trivial to implement.
>
>  The PBRT book and the Luxrender project are excellent references,
> along with Mantra.
>  I have one particular question: I haven't seen a Sampler structure
> in the proposals, which should be the heart of the new pipeline. It
> provides the foundation for every aspect of it, from BxDFs (even for
> evaluating multilayered/compound BxDFs) to integrators, lights,
> texturing, cameras, etc.
>
>  Keeping the sampler independent from those components will allow
> the use of efficient/smart sampling strategies throughout Render
> Internal.

I think this is a fair point, though a sampler won't necessarily be
used by all algorithms, so I don't see it as the foundation. The way
I see it, interfaces like bxdf, camera and light have an eval() and a
sample() function. Stupid algorithms only call eval(), or locally
create a sampler and call sample(). So we do indeed need to design
what the Sampler interface looks like for the sample() function, but
I prefer not to spend much time on this: keep the sampler stupid, and
leave improvements to someone who wants to implement a better
integrator.

>  Also, as the renderer gains complex and physically based features,
> render times could increase, and the cost of tweaking materials will
> increase with them. For that reason I think the system should
> support a flexible Deep Render Buffer (outputting and storing not
> just RGBA values but arbitrary data structures/values). That way,
> for example, storing the whole ray tree with surface normal/color
> values for each pixel would allow interactively changing the
> material/texture properties at postprocessing time without
> re-rendering the scene.
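
If I understand the proposal correctly, a deep buffer record would
store something like this per pixel (a hypothetical layout, just to
make sure we're talking about the same thing):

    /* Hypothetical per-pixel record for a deep render buffer; a
       sketch of the idea, not actual code. */
    struct DeepSample {
        float normal[3];    /* surface normal at the hit */
        float albedo[3];    /* texture/material color, before lighting */
        int material_id;    /* which material was hit, so it can be
                               swapped out at postprocessing time */
        struct DeepSample *children; /* secondary rays (reflection,
                                        refraction, ...) */
        int num_children;
    };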

I'm not sure this is something we should try to do. Personally I
think OpenCL/GPU acceleration is a better bet, even though I don't
expect that to happen anytime soon for Blender. From what I've seen,
this kind of re-rendering brings quite some complexity and
limitations with it, and we may be better off skipping it. Perhaps
I've missed a simple way to do it, but all approaches I've seen are
either complicated or limited to certain effects, and usually both :).

Brecht.
