[Bf-committers] Proposal for unifying nodes

Robin Allen roblovski at gmail.com
Wed Jun 17 15:48:51 CEST 2009


2009/6/17 Nathan Vegdahl <cessen at cessen.com>

> >> Again, textures are not data, textures are dynamic functionality.
> >
> > They're equivalent. You can see them as either. Personally I think users
> > would find "textures as things" to be a nicer abstraction.
>
>    The thing is, I really *want* to agree with you on this point.  The
> more I've been thinking about it, the more I've felt like, "Yeah,
> procedurals are basically just 3d, infinite size, infinite resolution
> bitmaps."  It's a really appealing thought.  Really appealing.  And it
> would make things like blur nodes possible.
>
>   But... arg.  Then I think, "But what if I want to plug part of the
> texture coordinates into the scale parameter instead of the y
> parameter?  Or what if I want the <x,y> to be constant, and drive the
> texture entirely by plugging the coords into scale and turbulence?
>

http://img146.imageshack.us/img146/2084/withtexnodesyoucan.png

> And why shouldn't the user be able to blur across the scale parameter
> too?"


Huh, I think I finally see what everyone's getting at. My texture nodes do
treat the coordinates differently from the other inputs. My mental model of
"textures" and "texture generators" is a valid interpretation of the
problem, but the nodes force that interpretation on everyone by providing
the functional abstraction for the coordinates and not for the other inputs.
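To make the two interpretations concrete, here's a minimal Python sketch (all names are illustrative, not actual Blender code): under the current texture-node model, a texture is a function of its coordinates only, while every other input is baked into the texture object as a setting.

```python
from dataclasses import dataclass
import math

@dataclass
class CloudsTexture:
    """A toy procedural texture: non-coordinate inputs are fixed settings."""
    scale: float = 1.0
    turbulence: float = 0.0

    def sample(self, x: float, y: float, z: float) -> float:
        # Coordinates are the only "varying" inputs; scale and
        # turbulence are frozen into the object, so they cannot be
        # driven per-sample without building a new texture.
        n = math.sin(self.scale * x) * math.cos(self.scale * y)
        return 0.5 + 0.5 * n + self.turbulence * math.sin(z)
```

Seen this way, `CloudsTexture(scale=2.0)` behaves like an infinite, resolution-independent 3D image you can sample anywhere, which is the appealing "textures as things" view; the cost is exactly the one under discussion, that `scale` and `turbulence` are settings rather than parameters you can vary or blur across.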

You're right, you can't blur across the scale parameter. I mean, it's better
than what we have now, where you can't blur across anything, but maybe it's
not ideal. I hope you realise I'll have to come up with some harebrained way
to abstract all the parameters now, which will confuse everybody even more.


>   In treating procedurals as images, you're choosing for the user
> which inputs are "settings" and which are varying parameters.
>   If we were going to do things as you propose, then I'd say *all* of
> the texture's parameters should be treated as part of the texture
> object's parameters, not the node's.  But then the parameters are
> different between texture types, and things get messy.  Plus it's much
> harder for the user to simply set the texture parameters.


Right, it's much harder, *unless* the node still offers them as inputs, but
you can leave them indeterminate, in which case they become arguments to the
output type. Like currying (I think it's currying?) in functional languages.
It's possible that we could come up with a decent UI for that, though maybe
not. If not, the non-ideal solution I originally proposed would at least
eliminate tree types and contextual data, and let the user do more to
textures. It's better than what we have now.
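The "leave inputs indeterminate" idea is essentially partial application. A hedged sketch of it in Python (the texture function and `bind` helper are hypothetical, not proposed API): a texture is a plain function of all its inputs, a node binds whichever inputs the user sets, and the unbound ones remain arguments of the resulting texture.

```python
import functools
import math

def clouds(x, y, z, scale, turbulence):
    """A toy procedural texture as a plain function of every input."""
    n = math.sin(scale * x) * math.cos(scale * y)
    return 0.5 + 0.5 * n + turbulence * math.sin(z)

def bind(tex_fn, **settings):
    """Curry a texture: fix some inputs, leave the rest as arguments."""
    return functools.partial(tex_fn, **settings)

# Fix scale and turbulence -> a function of coordinates (the usual view).
tex = bind(clouds, scale=2.0, turbulence=0.1)
color = tex(0.1, 0.2, 0.0)

# Fix the coordinates instead -> a texture driven entirely by scale
# and turbulence, which the coordinates-only abstraction can't express.
knob = bind(clouds, x=0.0, y=0.0, z=0.0)
other = knob(scale=3.0, turbulence=0.5)
```

In this view there's nothing special about coordinates; they're just the inputs that usually happen to be left unbound, which is why a UI for it is the hard part rather than the semantics.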

Come to think of it, that's probably why I only thought of the coordinates
as the parameters to be abstracted: it's the coordinates which are currently
the "magic contextual data" which need to be removed. Coordinates are the
only parameter we *need* to treat like this in order to unify node trees.

-Rob
