[Bf-committers] Proposal for unifying nodes

Aurel W. aurel.w at gmail.com
Mon Jun 15 09:32:54 CEST 2009


Hi Rob,

But if you start to pass R,R->R functions around in texture nodes, don't
you run into exactly the problems described (no convolution, etc.)? A
texture node should be able to do Image->Image operations, not just
[Pixel]->Pixel. The way I understood it, you want to end up with a
single R,R->Pixel function for the entire node tree, so that the
texture can be evaluated at any coordinate.
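
To make sure we mean the same thing, here is a rough C sketch of the
two kinds of signatures I am talking about (the type names are made up
for illustration, this is not actual Blender code):

  /* Hypothetical types, just to make the distinction explicit. */
  typedef struct { float r, g, b, a; } Pixel;

  typedef struct {
      int width, height;
      Pixel *pixels;            /* width * height evaluated samples */
  } Image;

  /* R,R -> Pixel: evaluate a texture at a single coordinate. */
  typedef Pixel (*TexEvalFn)(float u, float v);

  /* Image -> Image: an operation that needs the whole evaluated
   * input, e.g. a convolution that reads neighbouring pixels. */
  typedef void (*ImageOpFn)(const Image *in, Image *out);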

The current f-curve modifiers have the same problem: the datatype
processed there is a single float, not an array of float samples. So
just make sure that the 'biggest' datatype in Texture Nodes (and other
Nodes) always remains the entire 'thing' you want to evaluate. For
compositing that is an Image, not a Pixel; for textures it is an
Image too, ...

So procedural nodes, which basically define a function generating a
signal that is infinite in one or two dimensions (procedural textures
like clouds, voronoi, ...), have to be evaluated at some point. If such
a node feeds into a node that needs to operate on Images rather than
single Pixels, the receiving node would have to evaluate the function
passed to it over the entire Image. Wouldn't it be easier to do the
evaluation in the first node and just pass on an Image? And how do you
want to do unification if, for example, a Filter Node sometimes gets a
function and sometimes an evaluated Image as input?
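
If both representations were allowed on a socket, every Image-level
node would first have to check what it actually received, roughly like
this (again purely illustrative, reusing the Pixel/Image/TexEvalFn
types from the sketch above):

  /* A socket value that could carry either representation. */
  typedef enum { SOCK_FUNCTION, SOCK_IMAGE } SockKind;

  typedef struct {
      SockKind kind;
      union {
          TexEvalFn fn;         /* lazy: evaluate on demand */
          const Image *image;   /* eager: already sampled   */
      } data;
  } SockValue;

  /* A convolution/filter node would have to force the lazy case
   * into a full Image before it could read any neighbourhood. */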

But if you represent all input and output data as functions in every
node of a tree and then start evaluating at the output node, pixel by
pixel, this would work, yet the complexity grows exponentially with the
depth of the tree, and even a handful of nodes would take far too long
to compute. A filter node that uses a 3*3 kernel has to evaluate its
input texture 9 times per output sample. Chain three such filter nodes
and you evaluate the original input texture 9^3 = 729 times per sample.
Sounds slow to me.
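
Here is a tiny standalone C example of why this blows up. It only
counts how often the base procedural texture gets evaluated for a
single output sample when three lazy 3*3 filters are chained (nothing
here is Blender code, just the counting argument):

  #include <stdio.h>

  /* Counts evaluations of the base procedural texture. */
  static long base_evals = 0;

  typedef float (*TexFn)(float u, float v);

  static float base_texture(float u, float v)
  {
      base_evals++;
      return u * v;             /* stand-in for clouds/voronoi/... */
  }

  /* A lazy 3*3 box filter: every sample of its output needs
   * nine samples of its input. */
  static float box3x3(TexFn input, float u, float v, float step)
  {
      float sum = 0.0f;
      for (int i = -1; i <= 1; i++)
          for (int j = -1; j <= 1; j++)
              sum += input(u + i * step, v + j * step);
      return sum / 9.0f;
  }

  /* Three filters chained by hand, each sampling the one below. */
  static float filter1(float u, float v)
  {
      return box3x3(base_texture, u, v, 0.01f);
  }
  static float filter2(float u, float v)
  {
      return box3x3(filter1, u, v, 0.01f);
  }
  static float filter3(float u, float v)
  {
      return box3x3(filter2, u, v, 0.01f);
  }

  int main(void)
  {
      (void)filter3(0.5f, 0.5f);        /* one output sample */
      printf("base texture evaluated %ld times\n", base_evals); /* 729 */
      return 0;
  }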

Aurel

2009/6/15 Robin Allen <roblovski at gmail.com>:
> Hi Aurel, thanks for your comments.
>
>> Those are pretty detailed points already, but I would rather start
>> with a more basic design. (It would make sense to think about
>> designing a stream-oriented programming language from the ground up.)
>
>> I would rather stick to a data-stream model; functions as a data
>> type don't seem like a good idea.
>
> The reason I'm thinking functions, at least for textures and shaders,
> is that both of these -- well, they pretty much *are* functions. In a
> texture, you pass in the same coordinates, you get out the same color,
> and you can pass in whatever coordinates you want. That fits the
> mathematical definition of a function perfectly. Similarly, for
> shaders, you pass in the same ShaderInfo, you get out the same color.
> There is a phrase in multithreaded programming that a problem is
> 'embarrassingly parallel', i.e. it fits the description of a
> multithreaded task so well that you'd be a fool not to use threads. It
> seems to me that manipulating textures and shaders is embarrassingly
> functional.
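
To restate that in C, a texture and a shader under this proposal are
pure functions of their call data, roughly like the sketch below. Only
the TexCallData/ShaderCallData names come from the proposal; the struct
members are placeholders I made up:

  /* Illustrative only: a texture and a shader as pure functions. */
  typedef struct { float r, g, b, a; } Color;

  typedef struct { float co[3]; } TexCallData;       /* coordinate, ... */
  typedef struct { float normal[3], view[3]; } ShaderCallData; /* etc. */

  /* Same call data in, same color out, no hidden state. */
  typedef Color (*Texture)(const TexCallData *call);
  typedef Color (*Shader)(const ShaderCallData *call);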
>
> Now, note that I'm not suggesting that the new node system dictate
> that all data types be functions, just that that would be the best
> representation of textures and shaders (and probably compositors). If
> a problem arises which is better solved by streams (audio nodes? Who
> knows) then a stream data type can be used for that. The unified
> system wouldn't care whether the data represented functions or
> streams, it would just return an instance of it and then you can
> evaluate it until the cows come home. The important thing is that all
> data types can coexist within any tree.
>
>> Correct me if I am wrong, because I don't know for sure how the
>> current Texture Nodes do evaluation, but isn't there a problem
>> already?
>> Nodes there define a single function R,R->R which can evaluate the
>> texture node on a certain pixel. The entire output/final Texture is
>> then evaluated Pixel, by Pixel. This can also be described as, Texture
>> Nodes pass Pixels from node to node, while Composite Nodes pass whole
>> Images. But it won't be enough to just process single Pixels in the
>> pipeline, you would need to make the entire Image/Evaluated-texture
>> available for each node before it can determine any of its output
>> Pixels. Otherwise stuff like convolution, etc. isn't possible.
>
> That's actually not how texture nodes work at the moment -- the
> current texture nodes use a struct called tex_delegate and are really
> represented more as functions. The problem you describe about making
> the entire texture available was the reason I changed to the current
> system, because otherwise rotation and scale nodes (and, as you said,
> convolution) simply aren't possible. So, if I understand you
> correctly, that particular problem is already solved.
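
If I understand the tex_delegate approach correctly, the delegate is
essentially a function bundled with the node context needed to evaluate
it, stored on the socket so the texture can be sampled later at any
coordinate. Very roughly, and with made-up names rather than the actual
tex_delegate fields:

  /* Rough illustration of a delegate-style lazy texture value;
   * the type and field names here are invented, not the real struct. */
  struct bNode;

  typedef struct {
      struct bNode *node;                /* which node produced it     */
      void (*eval)(struct bNode *node,   /* how to evaluate it ...     */
                   const float coord[3], /* ... at a given coordinate  */
                   float result[4]);     /* ... into an RGBA result    */
  } LazyTexValue;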
>
> -Rob
>
> 2009/6/14 Aurel W. <aurel.w at gmail.com>:
>> Hi,
>>
>> I also think that Nodes need some redesign and unification, also to
>> share functionality and to be able to bring more things into Nodes (I
>> am especially thinking of animation and logic here). A new branch
>> would be great, imho.
>>
>>>> Main points
>> Those are pretty detailed points already, but I would rather start
>> with a more basic design. (It would make sense to think about
>> designing a stream-oriented programming language from the ground up.)
>>
>> You mentioned,...
>>>> * Expand nodes' data types from (float, vector, color) to include
>>>> functions and other types
>>>> * Define a shader to be a function of a ShaderCallData
>>>> * Define a texture to be a function of a TexCallData
>> I would rather stick to a data-stream model; functions as a data
>> type don't seem like a good idea. Correct me if I am wrong, because I
>> don't know for sure how the current Texture Nodes do evaluation, but
>> isn't there a problem already?
>> Nodes there define a single function R,R->R which can evaluate the
>> texture node on a certain pixel. The entire output/final Texture is
>> then evaluated Pixel, by Pixel. This can also be described as, Texture
>> Nodes pass Pixels from node to node, while Composite Nodes pass whole
>> Images. But it won't be enough to just process single Pixels in the
>> pipeline, you would need to make the entire Image/Evaluated-texture
>> available for each node before it can determine any of its output
>> Pixels. Otherwise stuff like convolution, etc. isn't possible.
>>
>> I have some ideas on nodes to share too, but it's late here, so I
>> will get back to them later ;)
>>
>> Aurel
>>
>>
>> 2009/6/14 Thomas Dinges <dingto at gmx.de>:
>>> +1
>>> That is definitely a step in the right direction!
>>>
>>> Robin Allen schrieb:
>>>> Hi all, I hope this is the right list.
>>>>
>>>> After hearing Ton say that nodes might see a recode, and knowing that
>>>> users are sometimes frustrated by Blender's strict separation of tree
>>>> types, I thought about ways to change how nodes are evaluated to let
>>>> users use any nodes in any tree. I've put my ideas up at
>>>> http://wiki.blender.org/index.php/User:Frr/NodeThoughts . I'd be
>>>> willing to take this project on if people feel the design is up to
>>>> scratch, perhaps developing in a branch akin to bmesh.
>>>>
>>>> Main points:
>>>>
>>>> * Expand nodes' data types from (float, vector, color) to include
>>>> functions and other types
>>>> * Define a shader to be a function of a ShaderCallData
>>>> * Define a texture to be a function of a TexCallData
>>>> * Allow the user to specify any nodetree outputting a shader to be
>>>> used as a material tree; any tree outputting a texture to be used as a
>>>> texture tree; etc.
>>>> * Define implicit conversions allowing nodes (e.g. Invert) to be
>>>> defined once to work on colors, and then be automatically converted to
>>>> work on textures and shaders (since both are defined as functions
>>>> returning colors).
>>>> * Results in an extensible node system: instead of defining a new tree
>>>> type, just define a new data type and some nodes that work on it.
>>>> * No more duplication of code with tiny changes (math, image...)
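
The implicit-conversion point is the one that maps most directly onto
code: an operation like Invert, written once for plain colors, gets
lifted so it also applies to anything that is a function returning a
color. A rough standalone C sketch of that idea follows; every name in
it is illustrative, not the proposed implementation:

  /* Rough sketch of lifting a per-color operation to textures. */
  typedef struct { float r, g, b, a; } Color;
  typedef struct { float u, v; } TexCoord;

  /* A texture: a function from coordinates to a color, plus the
   * context it needs (here just its upstream input texture). */
  typedef struct Texture {
      Color (*eval)(const struct Texture *self, TexCoord co);
      const struct Texture *input;       /* upstream texture, if any */
  } Texture;

  /* Invert written once, for plain colors. */
  static Color invert_color(Color c)
  {
      return (Color){1.0f - c.r, 1.0f - c.g, 1.0f - c.b, c.a};
  }

  /* The automatic conversion: apply the color operation to whatever
   * color the upstream texture produces at the same coordinate. */
  static Color invert_tex_eval(const Texture *self, TexCoord co)
  {
      return invert_color(self->input->eval(self->input, co));
  }

  static Texture lift_invert(const Texture *input)
  {
      return (Texture){ .eval = invert_tex_eval, .input = input };
  }

The open question for me is still what happens when such a lifted node
then needs the whole Image, as in the filter example above.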
>>>>
>>>> I'd like to hear any comments or criticisms you might have.
>>>>
>>>> -Rob

