[Bf-committers] Proposal for unifying nodes

Brecht Van Lommel brecht at blender.org
Tue Jun 16 20:36:39 CEST 2009

Hi Robin,

On Tue, 2009-06-16 at 18:40 +0100, Robin Allen wrote:
> > Tree types is something you can't avoid in my opinion. To me the purpose
> > of unification would be to share nodes between tree types, not to allow
> > all nodes in all tree types.
> >
> > A shader Geometry or Lighting node makes no sense without a shading
> > context, nor can you place a modifier node in a shader or particle node
> > tree. This kind of restriction you have to deal with in any design.
> Look for Nathan Vegdahl's reply to this; he is at least partially right.
> I wouldn't say that a shader Geometry or Lighting node "makes no sense
> without a shading context", if by "shading context" you mean a special
> property of the tree or the context in which the tree is being evaluated.
> If by "shading context" you simply mean geometry and normal data, and
> all the data contained within a ShadeInput struct, then you're right,
> those nodes don't make much sense without that data, but there's no
> reason you can't provide that data from some other nodes in your
> generic, usage-agnostic node tree.

OK, my point is that these nodes themselves can only be used in shader
or texture node trees. So somehow there is a restriction on where you can
use a Geometry or Lighting node. You could still allow them everywhere
and just disable them where they make no sense; as long as you
communicate the restriction to the user, I don't have a problem with
that.

> This is a major point of my system, so let me just put that another way:
> The shader Geometry and Lighting nodes will output Shader types, and they
> don't need any context to do so. Evaluating those shader types, however,
> requires data which you might call a "context". The shader object should
> not care where this data comes from. If you've selected your tree in
> the Material panel and Blender is using it to render a scene, then Blender
> is receiving the Shader object and evaluating it. But you could also be
> "manually" evaluating it with other nodes and using the results in some
> creative way.
> (Note that the Shader object is a function in the same way a Texture is.)
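As an illustrative aside, the "Shader object is a function" idea can be sketched in Python. This is not Blender code; the node name, the closure shape, and the ShadeInput fields are all stand-ins for the sake of the example.

```python
# Hypothetical sketch: a node outputs a "Shader object", i.e. a function
# that waits for context data before producing a value. The field name
# "normal_dot_light" is an invented stand-in for ShadeInput data.

def make_lighting_shader(intensity):
    # The node's output: a closure over the node's own parameters.
    def shader(shade_input):
        # shade_input plays the role of Blender's ShadeInput struct here.
        return intensity * shade_input["normal_dot_light"]
    return shader

# Blender (or another node) later evaluates the shader with context:
shader = make_lighting_shader(0.8)
result = shader({"normal_dot_light": 0.5})  # -> 0.4
```

The point of the sketch is only that the shader itself carries no context; whoever evaluates it supplies the data.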

OK, I think I understand the concept of a Shader object now. Still, I
prefer to just use materials for that. Being able to create a shader
object in the middle of any node tree, instead of having to make a
material, is neat, but it's really such a corner use case that I don't
think it's worth it.

> If I understand you correctly (and I'm not sure I do, so please correct me
> if this makes no sense), your examples are misleading. In your shading nodes
> example, when you say:
> voronoi(rY(rX(orco)))
> ... you're showing how the nodes are chained together, whereas in your
> texture nodes example:
> voronoi(rX(rY(orco)))
> ... you're not showing how the nodes are connected, but showing how the
> returned texture objects (tex_delegates) call each other.
> You should think (or, I should say, the user should think) of texture
> nodes as operating on textures, not on colors, in much the same way that
> compositor nodes operate on bitmaps, not individual pixels. The fact that
> textures are implemented using functions needn't be apparent to the user.

I tried to show that the texture coordinates are transformed in a
different order in these two setups, which may not be what you expect
since the nodes are chained in the same order in the node tree. Of
course they manipulate different data types, which is why the result is
different.

But that's exactly what I think is confusing: the fact that you have to
think about these different data types, and that sometimes the node tree
works in a "functional" way and sometimes not. It's unified in that you
can link up all nodes directly, but it's more complicated because you
have to think about colors, vectors and functions. It's quite powerful,
but I just prefer it to be simpler even if that limits things a bit.
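For contrast, the simpler value-passing model can be sketched the same way: each node consumes concrete data and emits concrete data, so evaluation order matches the visual chain. Again, the node names are illustrative stand-ins.

```python
# Sketch of the "values only" model: no functions flow through links,
# each node transforms concrete data and passes the result on.

def rX_node(coord):
    return ("rotX", coord)

def rY_node(coord):
    return ("rotY", coord)

def voronoi_node(coord):
    return ("voronoi", coord)

# Links evaluated left to right match the node order exactly:
out = voronoi_node(rY_node(rX_node("orco")))
# -> ('voronoi', ('rotY', ('rotX', 'orco')))
```

Here rX is applied first because it really is evaluated first; there is only one order to reason about, at the cost of needing a texture/material datablock for anything function-like.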

The only limitation would be that you have to create a texture/material
to do certain things, instead of being able to have everything in a
single node tree.

Further, I don't think users will really understand this distinction
well, and it will cause more confusion than it opens possibilities. It
took me a while to understand, and I'm a developer, though admittedly
not one who naturally thinks in terms of functional programming.

