[Bf-committers] Proposal for unifying nodes

Robin Allen roblovski at gmail.com
Tue Jun 16 19:40:22 CEST 2009


Hi Brecht, some great points. Let me answer them one by one...

Brecht Van Lommel wrote:
> First is the case where the node tree is not an actual tree but a graph.
> If a node output is used by two or more nodes, it would be good to not
> execute that node twice. As I understand it, the texture nodes
> implementation currently executes it twice, but I may be wrong here. It
> is of course fixable (though with functions the result of that first
> node are not necessarily the same each time, so need to do it only when
> possible).

You're right, this is a missed optimization opportunity in the current
texture nodes.  In the new system, though, with functions or without, the
output of a node can only change when its inputs change, so it will be
perfectly possible to cache intermediate results in the evaluation context
(the NodeStack in current terminology).
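
To sketch what I mean (all names here are made up for illustration, not
the actual Blender API; a real implementation would hang this off the
NodeStack):

#include <string.h>

/* Minimal stand-in node type; the real tree would use bNode and
   bNodeStack, which are elided here. */
typedef struct Node Node;
struct Node {
    void (*eval)(Node *self, float out[4]); /* compute this node's output */
    float cached[4];                        /* memoized result */
    int cache_valid;                        /* set once evaluated */
};

/* A node's output can only change when its inputs change, so a node
   feeding two or more downstream nodes is evaluated once per pass and
   its cached result reused for every other user. */
static void node_output(Node *n, float out[4])
{
    if (!n->cache_valid) {
        n->eval(n, n->cached);
        n->cache_valid = 1;
    }
    memcpy(out, n->cached, sizeof(n->cached));
}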


> Tree types is something you can't avoid in my opinion. To me the purpose
> of unification would be to share nodes between tree types, not to allow
> all nodes in all tree types.
>
> A shader Geometry or Lighting node makes no sense without a shading
> context, nor can you place a modifier node in a shader or particle node
> tree. This kind of restriction you have to deal with in any design.

Look for Nathan Vegdahl's reply to this; he is at least partially right.

I wouldn't say that a shader Geometry or Lighting node "makes no sense
without a shading context", if by "shading context" you mean a special
property of the tree or the context in which the tree is being evaluated.

If by "shading context" you simply mean geometry and normal data, and all
the data contained within a ShadeInput struct, then you're right, those
nodes don't make much sense without that data, but there's no reason
you can't provide that data from some other nodes in your generic,
usage-agnostic node tree.

This is a major point of my system, so let me just put that another way:

The shader Geometry and Lighting nodes will output Shader types, and they
don't need any context to do so. Evaluating those shader types, however,
requires data which you might call a "context". The shader object should
not care where this data comes from. If you've selected your tree in
the Material panel and Blender is using it to render a scene, then Blender
is receiving the Shader object and evaluating it. But you could also be
"manually" evaluating it with other nodes and using the results in some
creative way.

(Note that the Shader object is a function in the same way a Texture is.)
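
In code, the idea is roughly this (a sketch only; the Shader layout and
names are illustrative, not what would actually land):

/* A shader delegate: the Geometry and Lighting nodes can build and
   output one of these with no context at all. Only *calling* it needs
   the per-sample shading data. */
typedef struct ShadeInput ShadeInput;  /* per-sample data, from the renderer */

typedef struct Shader Shader;
struct Shader {
    void (*eval)(Shader *self, ShadeInput *shi, float col[4]);
    void *data;                        /* node-specific parameters */
};

/* Blender's renderer would do roughly this when it holds a ShadeInput;
   other nodes could just as well call it "manually". */
static void shade_sample(Shader *sh, ShadeInput *shi, float col[4])
{
    sh->eval(sh, shi, col);
}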


> Being placed before the thing it modifies I guess is a matter of
> terminology, but let me be more clear on what I mean by differences in
> ordering. As I understand the texture nodes code, currently the texture
> manipulation runs in reverse order compared to shading nodes (while
> color manipulation runs in the same order). Example that at first sight
> seems to give the same result, but is actually different:
>
> shading nodes: geom orco -> rotate X -> rotate Y -> texture voronoi
> evaluated as: voronoi(rY(rX(orco)))
>
> texture nodes: voronoi -> rotate X -> rotate Y (mapped to orco)
> evaluated as: voronoi(rX(rY(orco)))
>
> The shading nodes example may not be a great one because you want to add
> ShaderCallData there too, but consider for example two rotate nodes
> manipulating a vector which is then input in a modifier node.

> The order of texture nodes can however be reversed, if you go over the
> nodes in two passes.

If I understand you correctly (and I'm not sure I do, so please correct me
if this makes no sense), your examples are misleading. In your shading nodes
example, when you say:

voronoi(rY(rX(orco)))

... you're showing how the nodes are chained together, whereas in your
texture nodes example:

voronoi(rX(rY(orco)))

... you're not showing how the nodes are connected, but showing how the
returned texture objects (tex_delegates) call each other.

You should think (or, I should say, the user should think) of texture
nodes as operating on textures, not on colors, in much the same way that
compositor nodes operate on bitmaps, not individual pixels. The fact that
textures are implemented using functions needn't be apparent to the user.
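
To make the mechanics concrete, here is roughly how a Rotate texture
node produces that "reversed" call order (a sketch; rotate_z and the
Tex layout are illustrative, not the actual tex_delegate code):

#include <math.h>

typedef struct Tex Tex;  /* stand-in for a tex_delegate */
struct Tex {
    void (*eval)(Tex *self, const float co[3], float col[4]);
    Tex *input;   /* the upstream texture this node wraps */
    float angle;  /* rotation parameter */
};

/* Simple rotation about Z, for illustration only. */
static void rotate_z(float out[3], const float in[3], float angle)
{
    const float c = cosf(angle), s = sinf(angle);
    out[0] = c * in[0] - s * in[1];
    out[1] = s * in[0] + c * in[1];
    out[2] = in[2];
}

/* A Rotate node's delegate transforms the coordinate, then asks the
   texture it wraps for a color at the new coordinate. So wiring
   voronoi -> rotate X -> rotate Y hands the caller rY's delegate,
   and the call unwinds as voronoi(rX(rY(co))). */
static void rotate_eval(Tex *self, const float co[3], float col[4])
{
    float rotated[3];
    rotate_z(rotated, co, self->angle);
    self->input->eval(self->input, rotated, col);
}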


> OK, implicit conversions go a long way to unifying such nodes, I agree.
>
> How would you drive for example specularity with a texture in the
> shading nodes (or velocity in particle nodes)? Can you link up those two
> directly, using perhaps orco texture coordinates by default? Or do you
> add a node inbetween which takes that texture function + coordinate to
> do the conversion?

This is I think an appropriate juncture for mspaint:

http://img190.imageshack.us/img190/859/specshade.png

Note that it would be entirely possible to define an implicit conversion from
TexCallData to ShadeCallData using, as you suggest, orco by default, and then
the user would be able to connect "Texture" directly to "Amount". This may be
a good idea.
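
For concreteness, such a conversion might look something like this
(hypothetical names throughout; the real ShadeInput stores the orco
coordinate differently):

typedef struct Tex Tex;  /* texture delegate, as in the sketch above */
struct Tex {
    void (*eval)(Tex *self, const float co[3], float col[4]);
};

typedef struct {
    float orco[3];       /* stand-in for the coordinate in ShadeInput */
} ShadeSample;

/* Implicit TexCallData -> ShadeCallData conversion: evaluate the
   texture at the sample's orco and feed the result to "Amount". */
static float tex_to_amount(Tex *tex, const ShadeSample *s)
{
    float col[4];
    tex->eval(tex, s->orco, col);
    return col[0];  /* a real conversion might use an intensity channel */
}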


> Another thing that is not clear to me is ShaderCallData. With texture
> nodes you're passing a function which takes a coordinate as input, what
> does the shader function take as input?

ShaderCallData is I believe already defined in the current shader nodes. It's
basically a ShadeInput* with some extra stuff.
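
If memory serves, the current definition is roughly:

typedef struct ShadeInput ShadeInput;    /* renderer's per-sample input */
typedef struct ShadeResult ShadeResult;  /* renderer's shading output */

typedef struct ShaderCallData {
    ShadeInput *shi;     /* from the render pipeline */
    ShadeResult *shr;    /* result being accumulated */
} ShaderCallData;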

> It doesn't make much sense to
> rotate a shader result I guess, what does that rotate, the point,
> normal, tangent?

Nope, doesn't make sense at all. You can only rotate a texture because it's a
function of a vector, and you can rotate a vector.

> So you don't pass along shader functions, and it stays
> basically the same?

I'm afraid you've lost me.


> I'm not proposing to keep the nodes separated per tree type, only a
> subset of nodes would be tied to tree types, Math nodes would not be one
> of those.

As I've said, I think it's very important that we do away with tree types.
The user should not have to decide where their tree will be used while
they are building it.


> Anyways, I can see that it would be cool to mix texture nodes with
> modifier nodes for displacement for example, and that this is only
> possible when passing functions. I'm just not sure I like all the
> consequences.

I hope I've allayed at least some of your fears :)

-Rob

