[Bf-committers] Cycles shader interaction

Knapp magick.crow at gmail.com
Wed Mar 21 14:35:07 CET 2012


On Wed, Mar 21, 2012 at 2:14 PM, Brecht Van Lommel
<brechtvanlommel at pandora.be> wrote:
> Hi,
>
> On Wed, Mar 21, 2012 at 1:42 PM, Tobias Oelgarte
> <tobias.oelgarte at googlemail.com> wrote:
>> I could imagine two points in the pipeline which could be relatively
>> easily extended to do some more fancy stuff.
>>
>> The first point would be the transmission to the film. The color (if we
>> may call it that) of a sample's result could be adjusted, much like the
>> ramp functions for diffuse and specular in BI, before it is combined
>> with the film itself. That way you could create materials that look
>> different to the camera than the way they influence their environment.
>> This is partially possible with the help of the "Light Path" node, but
>> very limited at the same time.
>>
>> It would allow a kind of postprocessing coupled to the material itself.
>> That way you would not need masking or indexing and a compositing setup
>> that quickly becomes complex. A simple use case that comes to my mind is
>> toon shading.
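
For reference, the kind of camera-ray trick the Light Path node already
allows can be set up in a few lines of Python. This is only a rough sketch
against the Blender Python API (the material name "ToonMat" is made up, and
the node/socket identifiers are assumed from current builds), mixing a flat
emission over a diffuse BSDF for camera rays only:

    import bpy

    # Assumes an existing Cycles material named "ToonMat" with nodes enabled.
    mat = bpy.data.materials["ToonMat"]
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    light_path = nodes.new('ShaderNodeLightPath')     # exposes "Is Camera Ray"
    diffuse    = nodes.new('ShaderNodeBsdfDiffuse')
    emission   = nodes.new('ShaderNodeEmission')      # flat "toon" colour
    mix        = nodes.new('ShaderNodeMixShader')
    output     = nodes.new('ShaderNodeOutputMaterial')

    # Camera rays see the flat emission, secondary (GI) rays see the plain
    # diffuse, so bounced light stays physically sane.
    links.new(light_path.outputs['Is Camera Ray'], mix.inputs['Fac'])
    links.new(diffuse.outputs['BSDF'], mix.inputs[1])
    links.new(emission.outputs['Emission'], mix.inputs[2])
    links.new(mix.outputs['Shader'], output.inputs['Surface'])
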
>
> I don't think this would work as you might expect. Each individual
> sample may be very different from the final averaged color, and in
> fact depends on the particular sampling algorithm used. If, for
> example, we add an optimization that lets us use a single sample with
> value 0.5 instead of two samples with values 0.0 and 1.0, a
> modification applied to these per-sample values would give quite
> different results.
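
To make that concrete: a non-linear per-sample modification no longer
commutes with the averaging onto the film, so the visible result depends on
how the sampler happens to split the estimate. A tiny standalone Python
sketch (nothing Cycles-specific; the step function is just a stand-in for a
ramp):

    def ramp(x):
        # Toy non-linear "ramp"/toon step applied to a sample value.
        return 1.0 if x >= 0.5 else 0.0

    # The same pixel estimated two ways: one optimized sample of 0.5, or
    # two samples of 0.0 and 1.0 that average to the same 0.5.
    single = [0.5]
    split  = [0.0, 1.0]

    def shade(samples):
        # Apply the per-sample modification, then average onto the "film".
        return sum(ramp(s) for s in samples) / len(samples)

    print(shade(single))  # 1.0  (ramp applied to 0.5)
    print(shade(split))   # 0.5  (average of ramp(0.0)=0 and ramp(1.0)=1)
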
>
>> The second point would be the addition of the following inputs. The
>> first input would deliver the angle between the surface and the current
>> light ray (or, even better, both vectors), and the second input the
>> intensity/color of the light arriving along that ray. That way the
>> shader would have great control over how light affects the surface and
>> vice versa.
>>
>> It would allow unrealistic materials, for example a material that is
>> basically diffuse but becomes glossy at high light intensities. It
>> could shift its color based on the angle, and so on.
>
> The shader is executed before lighting is evaluated, so there is no
> light vector available at that time. For multiple importance sampling,
> or just generally for GI bounces, the result of the shader is used to
> sample a direction to send a ray in, which may or may not hit a light
> after one or more bounces. The idea that there is a light vector
> available works well for direct lighting from lamps, but once you get
> to indirect lighting it doesn't fit anymore.
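
Roughly, the control flow looks like the sketch below (toy Python, not
actual Cycles internals): the shader only returns a closure describing the
BSDF, and the integrator later draws an outgoing direction from that
closure, so there simply is no light vector at the time the shader runs:

    import math, random

    def shader(hit_point, normal, uv):
        # Runs before any lighting is evaluated: it may read geometry and
        # textures, but there is no light vector to branch on.  All it
        # produces is a closure describing the BSDF.
        return {"type": "diffuse", "albedo": 0.8}

    def sample_direction(closure):
        # The integrator, not the shader, chooses where the next ray goes
        # (toy cosine-weighted sampling around a +Z normal for brevity).
        u1, u2 = random.random(), random.random()
        r, phi = math.sqrt(u1), 2.0 * math.pi * u2
        return (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))

    def trace(depth=0, max_depth=4):
        closure = shader(hit_point=None, normal=(0.0, 0.0, 1.0), uv=None)
        if depth >= max_depth:
            return 0.0
        direction = sample_direction(closure)
        # ... intersect the scene along `direction`; it may or may not reach
        # a light after more bounces, which is exactly why a "light vector"
        # input to the shader does not fit indirect lighting ...
        return closure["albedo"]   # placeholder contribution

    print(trace())
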
>
> This is quite different from Renderman; it is inspired by the OSL
> design, which is used successfully in production. I understand this has
> limitations. There is a balance to find here between more user control
> in the shader and a design where global illumination and more
> intelligent sampling algorithms fit naturally without the user having
> to care about them, and I believe the latter is the right choice for
> this type of render engine.
>
> Brecht.

Seems to me he should be using the compositing nodes, not the texture
nodes, to achieve effects like this.
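
For per-material post effects such as toon looks, the usual compositing
route is a material index pass plus an ID Mask node. A rough bpy sketch of
that setup (the material name "ToonMat" is made up, and property/socket
names such as use_pass_material_index and IndexMA are assumptions that may
vary by Blender version and depend on the render engine exposing the pass):

    import bpy

    scene = bpy.context.scene

    # Tag the material and enable the material index pass on the render
    # layer (property names assumed; they have moved between versions).
    bpy.data.materials["ToonMat"].pass_index = 1
    scene.render.layers[0].use_pass_material_index = True

    # Minimal compositing setup that isolates that material as a mask.
    scene.use_nodes = True
    tree = scene.node_tree
    render_layers = tree.nodes.new('CompositorNodeRLayers')
    id_mask = tree.nodes.new('CompositorNodeIDMask')
    id_mask.index = 1
    tree.links.new(render_layers.outputs['IndexMA'], id_mask.inputs['ID value'])
    # The mask can now drive colour ramps, mixes or any other per-material
    # post effect without touching the sampling in the render itself.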

-- 
Douglas E Knapp

Creative Commons Film Group, Helping people make open source movies
with open source software!
http://douglas.bespin.org/CommonsFilmGroup/phpBB3/index.php

Massage in Gelsenkirchen-Buer:
http://douglas.bespin.org/tcm/ztab1.htm
Please link to me and trade links with me!

Open Source Sci-Fi mmoRPG Game project.
http://sf-journey-creations.wikispot.org/Front_Page
http://code.google.com/p/perspectiveproject/

