[Bf-committers] Shading System Proposals
ypoissant2 at videotron.ca
Wed Dec 16 03:19:24 CET 2009
From: "Brecht Van Lommel" <brecht at blender.org>
> To get some discussion started, here's two things I'm still unsure
> about, extracting a BXDF from node trees and the future role of the
> texture stack.
Sorry to come so late in the discussion. Reading all the posts, here and on the
project wiki pages, as well as several pages of the Houdini documentation,
takes time. And I found myself having difficulties getting on the same page
as the discussion. In retrospect, as I form a better mental model of the
issues at stake here, I think my difficulties come from different
assumptions behind some terms and concepts.
But I want to participate in the discussion, so here is a start.
> Matt proposes a Sample_F output here but it's not entirely clear to me
> how this works, I also couldn't understand from the Houdini
> documentation for example if their BSDF F value is just a
> vector/scalar value ...
From what I read in the Houdini documentation, the VEX node system related
to shading, even PBR, is pretty basic and I would even say old-school. To
answer the above question, on the Houdini documentation page "Understanding
Mantra rendering" there are two snippets of code that read:
F = texture(map) * diffuse() + 0.5 * phong(20);
F = diffuse() * (1.0, 0, 0)
That definitely looks like a standard legacy CG shading function. And it is
interesting to read their use of "BSDF" in this context. Anyway, it is clear
that the F type is an RGB vector.
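To make that concrete, here is a minimal sketch (in Python, not VEX; the lobe values and the texture result are stand-ins I made up) of what those snippets compute: F is just an RGB triple assembled from weighted lobe values.

```python
# Sketch only: stand-in lobe values, not real shading math.

def diffuse():
    # placeholder for a diffuse lobe value
    return (1.0, 1.0, 1.0)

def phong(exponent):
    # placeholder for a Phong lobe value with the given exponent
    return (0.6, 0.6, 0.6)

def mul(a, b):
    return tuple(x * y for x, y in zip(a, b))

def scale(c, s):
    return tuple(s * x for x in c)

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

tex = (0.8, 0.4, 0.2)  # pretend result of texture(map)

# F = texture(map) * diffuse() + 0.5 * phong(20)
F = add(mul(tex, diffuse()), scale(phong(20), 0.5))
```

Nothing here needs to know how to sample anything; F is an ordinary color value.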
> ... or also contains information on how to sample it.
I don't know of any BxDF representation that could provide an external
function with some information on how to sample it. At least not in any efficient
manner. This is something I've been thinking about for quite a while and I
think that this cannot be done. The best a BxDF can provide to help in this
regard is a warping distribution function that can map a 2D sample point to
a direction. Given the state of knowledge about BxDFs, that is currently as
far as we can go. And this can be implemented in so many ways that it is not
practical to standardize on a single representation.
That said, I may misunderstand the sentence because it looks like a mix of
concepts. From an implementation POV, we can view a BxDF as a black box
that takes some inputs and provides some outputs. That would fit with the
idea of a BxDF as a node. Some of the services a BxDF can provide are:
1) Given a hit point, an incident vector and an exitant vector, provide the
corresponding reflectance.
2) Given a hit point, an incident vector and a uv point, provide the
direction (from a uv-to-direction mapping), the probability of this direction
and the corresponding reflectance.
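As a sketch of what those two services could look like in code (a minimal Python mock-up of my own, using a Lambertian BxDF with the standard cosine-weighted warp; none of these names come from Blender or Houdini):

```python
import math

class LambertBxDF:
    """Toy Lambertian BxDF exposing the two services described above."""

    def __init__(self, albedo):
        self.albedo = albedo  # RGB reflectance

    # Service 1: incident + exitant vectors -> reflectance.
    def eval(self, wi, wo):
        # a Lambertian BRDF is constant: albedo / pi
        return tuple(a / math.pi for a in self.albedo)

    # Service 2: a 2D sample point -> direction, its probability, reflectance.
    def sample(self, wi, u1, u2):
        # cosine-weighted hemisphere warp, in a local frame with z as the normal
        r, phi = math.sqrt(u1), 2.0 * math.pi * u2
        wo = (r * math.cos(phi), r * math.sin(phi),
              math.sqrt(max(0.0, 1.0 - u1)))
        pdf = wo[2] / math.pi  # cos(theta) / pi
        return wo, pdf, self.eval(wi, wo)

bxdf = LambertBxDF((0.8, 0.4, 0.2))
wo, pdf, f = bxdf.sample(None, 0.5, 0.25)
```

The point is that both services share the same internal distribution; only the entry point differs between shading pipelines.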
I will not go into much more detail about that here right now because it
needs to be elaborated much more, but the idea is that an implementation of a
BxDF needs to provide a few services and that those services are all related,
although they may be called in different circumstances from different
shading pipelines. Those services are required by a practical implementation
but are usually not discussed in the BxDF literature, and I think it is those
services that are of interest here. Am I right?
> I can think of a few ways to do this using a node tree, doing the
> computation F by evaluating nodes with some entirely or partially
> excluded,
This is a place in the discussion where I have problems following. Can you
provide use cases where nodes are entirely excluded and where some are
partially excluded?
> and doing sample distribution by picking randomly from
> distributions provided by nodes that contain a BXDF.
"Nodes that contain a BxDF" - I believe you see the BxDF as a data type
here. Is that right?
If we are dealing with multiple distributions and want to combine them, then
a way of doing that is through Russian roulette.
> possibility would be to pass along a BXDF type through nodes, then
> for example a mix node could create a new BXDF that can do more clever
> importance sampling.
Mixing BxDFs can be done in different ways depending on the service provided.
Mixing the F values requires a simple vector mix, but mixing the
distributions requires Russian roulette. This is all doable. But I can't
see a way, from those mixes, to do more clever importance sampling.
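To illustrate what I mean (a Python sketch of my own devising; the interface is an assumption, not an existing Blender or Houdini API): a mix node can produce a new BxDF whose eval blends the F values linearly, while its sampling service picks one child by Russian roulette and combines the pdfs.

```python
import math

class Lambert:
    """Toy Lambertian child BxDF."""
    def __init__(self, albedo):
        self.albedo = albedo
    def eval(self, wi, wo):
        return tuple(a / math.pi for a in self.albedo)
    def pdf(self, wo):
        return max(wo[2], 0.0) / math.pi  # cosine-weighted pdf
    def sample(self, wi, u1, u2):
        r, phi = math.sqrt(u1), 2.0 * math.pi * u2
        wo = (r * math.cos(phi), r * math.sin(phi),
              math.sqrt(max(0.0, 1.0 - u1)))
        return wo, self.pdf(wo), self.eval(wi, wo)

class MixBxDF:
    """Mix-node output as a new BxDF: blended eval, roulette-driven sample."""
    def __init__(self, a, b, w):
        self.a, self.b, self.w = a, b, w  # w = weight of a, in [0, 1]

    def eval(self, wi, wo):
        fa, fb = self.a.eval(wi, wo), self.b.eval(wi, wo)
        return tuple(self.w * x + (1.0 - self.w) * y for x, y in zip(fa, fb))

    def sample(self, wi, u1, u2, u3):
        # Russian roulette: choose a child in proportion to its weight ...
        child = self.a if u3 < self.w else self.b
        wo, _, _ = child.sample(wi, u1, u2)
        # ... but the pdf of the mix is the weighted sum of both children's pdfs
        pdf = self.w * self.a.pdf(wo) + (1.0 - self.w) * self.b.pdf(wo)
        return wo, pdf, self.eval(wi, wo)

mix = MixBxDF(Lambert((0.8, 0.0, 0.0)), Lambert((0.0, 0.8, 0.0)), 0.25)
f = mix.eval(None, (0.0, 0.0, 1.0))
```

Note that this only forwards the children's distributions; it does not give the mix any smarter importance sampling than its parts already had.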
> Neither seems particularly elegant or transparent
> to the user to me, but I can't think of good alternatives at the
> moment.
What is the problem that you are trying to solve with this BxDF mixing here?
> The texture stack is obviously limited in what it can texture. Nodes
> provide a solution here, but the texture stack is still convenient. So
> what is the role of the texture stack, do we try to fix it, and if so
> do we make it more flexible like allowing to texture nearly
> everything, or maybe we just leave it as is mostly and do a few small
> tweaks?
This is a part that is too implementation-related for me to really
understand and help with. But as a general rule, I would say that textures should
be usable for anything. Not only the traditional shader channels but any
user-defined channels too. For instance, there is this group (the name
McCool comes to mind) that designed, a few years ago, a way to separate BRDF
parameters into a set of bitmaps and drive BRDFs on the GPU. Those bitmaps
are only usable if interpreted as BRDF parameters. Another usage is if we
want to have spatially changing BxDFs. There already exist spatially varying
BRDFs around. This is nice for decorative objects with mixed painted materials,
fabrics with mixed thread types, or any mixed-type surfaces.
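A tiny sketch of the bitmap-driven idea (my own illustration, not the McCool et al. method itself; the map names and contents are made up): the texels store BRDF parameters instead of colors, and the shader rebuilds the BxDF parameters per surface point, giving a spatially varying BRDF.

```python
# Made-up 2x2 parameter "bitmaps": texels hold BRDF parameters, not colors.
albedo_map = [[(0.8, 0.2, 0.2), (0.2, 0.8, 0.2)],
              [(0.2, 0.2, 0.8), (0.8, 0.8, 0.2)]]
exponent_map = [[10.0, 50.0],
                [100.0, 5.0]]

def lookup(tex, u, v):
    # nearest-neighbour lookup; a real implementation would filter/interpolate
    rows, cols = len(tex), len(tex[0])
    return tex[min(int(v * rows), rows - 1)][min(int(u * cols), cols - 1)]

def brdf_params_at(u, v):
    # assemble the spatially varying BRDF parameters for this uv point
    return {"albedo": lookup(albedo_map, u, v),
            "exponent": lookup(exponent_map, u, v)}

params = brdf_params_at(0.7, 0.2)
```

The same lookup mechanism works for any user-defined channel, which is why I would not restrict textures to the traditional shader channels.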