[Bf-cycles] Feature: Recoloring of lights and materials post-render.

Peter Schmidt-Nielsen schmidtnielsenpeter at gmail.com
Thu Oct 11 23:59:22 CEST 2018


Hi Lukas,

Your suggestion sounds like a great compromise; if I understand it
correctly, the user literally just gets post-multiplication of the entire
BsdfEval by a single post-render tunable parameter. This covers the simple
recoloring cases I was proposing, and it also avoids having to do
complicated analysis of the node graph to determine whether a given use of
a tunable parameter is actually affine. I don't know why I didn't think of
it myself.
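
To spell out how I understand the recombination would work: if pass k
collects the contributions of all paths that crossed the marked BSDF
exactly k times, then retinting to a new factor t is just evaluating a
polynomial in the compositor. A minimal NumPy sketch (hypothetical names,
nothing Cycles-specific):

    import numpy as np

    def recombine(passes, t):
        # passes[k] holds the (H, W, 3) float contributions of all paths
        # that hit the marked BSDF exactly k times; retinting to factor t
        # is just evaluating the polynomial sum_k t**k * passes[k].
        image = np.zeros_like(passes[0])
        for k, term in enumerate(passes):
            image += t ** k * term
        return image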

You write "*If it's fine with you, I'd like to play around with this a bit
today and see if I can get it working.*" <-- I don't know if this is what
you're asking, but of course I would in no way feel "scooped" if you just
went ahead and implemented an idea similar to this. I played a little bit
with adding this feature by expanding the PathRadiance, and making the
throughput track an exponent for each tweakable parameter, and then
modifying the path_radiance_accum_* functions to write into the right
buffer (or "term" in the polynomial), but I'm sure that you folks would be
100x faster at figuring out how to implement this in the existing code
base, and it might not be worth the time explaining the innards of Cycles
to me so I can bumble through it, compared to just doing it yourself.
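
For what it's worth, here is roughly the shape of the bookkeeping I was
attempting, as a hypothetical Python sketch (this is not actual Cycles
code, and the real PathRadiance / path_radiance_accum_* machinery looks
nothing like it; scalar energies stand in for float3 contributions):

    from collections import defaultdict

    class PathState:
        def __init__(self, num_params):
            # One exponent counter per tweakable parameter, carried
            # alongside the usual throughput.
            self.exponents = [0] * num_params

        def hit_tweakable_bsdf(self, i):
            # The unknown factor x_i is deferred to compositing, so we
            # only record that the path picked up one more power of it.
            self.exponents[i] += 1

    def accum(buffers, state, contribution):
        # One buffer per monomial x_0**e0 * x_1**e1 * ...; the exponent
        # tuple names the "term in the polynomial" this path belongs to.
        buffers[tuple(state.exponents)] += contribution

    buffers = defaultdict(float)  # exponent tuple -> accumulated energy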

I'd like to make a defense of the multi-parameter shader relighting case,
insane as it may be. You're right that it's bad in terms of memory: if we
want to recolor n material parameters while tracking a maximum total
degree of d, we need $\binom{n + d}{d}$ render buffers, one per monomial.
However, I think it would be pretty reasonable to let the user tune the
maximum tracked degree. For example, with a maximum degree of 6 the user
can have 4 different tweakable parameters rendering at 1920x1080 with only
4.9 GiB of render results; if the maximum degree is reduced to 3, the user
can have 10 different tweakable parameters with only 6.6 GiB of render
results. That is a lot, but it seems not entirely insane in an era where
11 GiB cards are everywhere. Nvidia explicitly spoke of truncating the
tracked degree (choosing any degree less than the bounce count makes the
result an approximation rather than exact), and I have a hunch that a
degree like 3 would typically be sufficient.
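
The arithmetic above is easy to check; a throwaway script, assuming one
RGB float buffer (12 bytes per pixel) per monomial:

    from math import comb

    def render_results_gib(n, d, width=1920, height=1080, bytes_per_px=12):
        # n tweakable parameters tracked to maximum total degree d need
        # C(n + d, d) buffers, one per monomial of total degree <= d.
        num_buffers = comb(n + d, d)
        return num_buffers * width * height * bytes_per_px / 2**30

    print(render_results_gib(4, 6))   # 210 buffers, ~4.9 GiB
    print(render_results_gib(10, 3))  # 286 buffers, ~6.6 GiB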

I think it's also interesting to imagine not tracking every term in the
multivariate expansion. For example, if an object with parameter x has
very limited mutual visibility with an object with parameter y, then
there's probably no need to track the x^2 y^2 term, even though you might
still want the x^4 and y^4 terms. One could imagine automatically pruning
the space of terms in the polynomial (and thus the number of render
buffers required) by doing a few initial passes at a couple of samples and
a reduced resolution, computing the total energy in each coefficient of
the polynomial, and then dropping the lowest-energy terms. With a
technique like this I think you could have a pretty reasonable user
experience and not run out of VRAM even with a dozen or more relightable
material parameters. I suspect that the vast majority of the energy will
typically be accumulated in just a handful of terms, and if an automatic
pass can identify those terms, relighting a large number of materials
becomes reasonable.
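
Sketched in hypothetical Python (none of this is existing Cycles
machinery), the pruning pass could be as simple as ranking terms by
accumulated energy and keeping the top k:

    import numpy as np

    def prune_terms(prepass_buffers, keep):
        # prepass_buffers maps a monomial exponent tuple to its
        # low-resolution buffer from the cheap pre-pass.
        energy = {term: float(np.sum(buf))
                  for term, buf in prepass_buffers.items()}
        ranked = sorted(energy, key=energy.get, reverse=True)
        # Only these terms get full-resolution buffers in the real
        # render; dropping the rest makes the result an approximation.
        return set(ranked[:keep])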

Of course, now I'm adding *even more* complexity, so this is probably not
a reasonable first version of such a feature. :)

I'm also pretty interested in thinking about the case of a "Relightable
RGB" node that can be used anywhere in a material, with a pass that
automatically determines whether the usage is affine and computes the
y-intercept and slope of the BSDF w.r.t. that parameter. The nice thing
about this is that it unifies the user experience of the light relighting
and material relighting features: in both cases the user just plugs in a
"Relightable RGB" node where they would otherwise have used a regular RGB
node.
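
One cheap way I could imagine implementing the affine check is
numerically: probe the shader at a few values of the relightable
parameter and test that the second difference vanishes (only a heuristic;
a proper implementation would presumably analyze the node graph
symbolically). A sketch, with a hypothetical eval_bsdf callable standing
in for shader evaluation:

    def affine_decompose(eval_bsdf, eps=1e-6):
        # An affine BSDF(x) = a + b*x has zero second difference at
        # every triple of points; checking one triple is a heuristic.
        f0, f1, f2 = eval_bsdf(0.0), eval_bsdf(1.0), eval_bsdf(2.0)
        if abs(f2 - 2.0 * f1 + f0) > eps:
            raise ValueError("parameter appears to be used non-affinely")
        intercept, slope = f0, f1 - f0  # BSDF(x) = intercept + slope * x
        return intercept, slope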

If you do get something working I'd love an explanation of how you did it
(if you have time), maybe some pointers on where I could start poking to
implement some of the stretch-goal features (tuning multiple materials,
arbitrary affine materials, automatic term pruning, etc.), and your
thoughts on which stretch goals are even reasonable to implement.

Thanks,

-Peter Schmidt-Nielsen

On Thu, Oct 11, 2018 at 11:55 AM Lukas Stockner <lukas.stockner at freenet.de>
wrote:

> Quick followup:
>
> I just read the linked document and I understand the code complexity
> concern a bit better now. While it would be neat to have support for 100%
> perfect recoloring for crazy shader networks and arbitrary parameters, I
> agree that the solution suggested in the document is a bit much, at least
> for an initial patch.
>
> For now, I'd suggest a simplified version: The user can select a BSDF
> node (or maybe a list of them) and will later be able to tweak a parameter
> and get a result as if each selected BSDF had been multiplied by that
> parameter. That still lets you tweak textured objects and is
> straightforward to implement (a single bit per closure; maybe just add a
> second SVM closure instruction to mark it).
>
> - Lukas
>
>
> On 10/11/18 8:29 PM, Lukas Stockner wrote:
>
> Hi,
>
> I think this is definitely an interesting feature. I do think it's worth
> the effort to look into this, since:
>
> - The coding effort shouldn't be too high (see below)
>
> - While it might not be the best choice for live material tweaking (I
> agree that fast viewport rendering is the way to go there), the
> applications in design visualization are great - if you need renders of 20
> different color variations of some furniture for a catalog, just render
> once and do it in the compositor. It could even be interesting as a data
> source for e.g. online configuration tools - render in Cycles, export the
> layers, mix in realtime in OpenGL.
>
> For some more discussion of this, check out this BlenderArtists thread:
> https://blenderartists.org/t/changing-colors-in-the-scene-without-re-rendering/1117699
>
>
> In terms of implementation, as I said, it shouldn't be too hard. Cycles
> generally has normalized builtin closures that get multiplied with a weight
> to color them, so there is an obvious starting point for the
> implementation. I'd have to go through the math to be sure, but I think
> just writing one float3 per bounce into the render buffer would let us
> implement that without even increasing the size of the PathRadiance
> (actually, a similar approach should work for the lightgroup patch now that
> I think about it). One important limitation I'd like to point out is that
> supporting recoloring based on multiple tweakable parameters is not
> realistic - afaics, the number of terms needed is O(d^n) where d is the ray
> depth and n the number of objects.
>
> On top of the Cycles changes, you'd need a compositor node to handle it.
> One usability consideration is that you'd have to plug N passes from the
> RenderLayer node to the Recolor node (for depth N). That is also a problem
> with Cryptomatte, we might want to implement some sort of "bundled" socket
> connection type for that kind of stuff in the Compositor, but that's kind
> of off-topic.
>
>
> If it's fine with you, I'd like to play around with this a bit today and
> see if I can get it working.
>
> - Lukas
> On 10/11/18 6:26 PM, Peter Schmidt-Nielsen wrote:
>
> Hi Brecht,
>
> Thanks for the very quick response.
>
>> For light groups, there is a patch here which we should get committed at
>> some point:
>> https://developer.blender.org/D3607
>>
>
> Very cool to see; I clearly haven't been keeping up. I don't think this
> was there yet when I was last asking around about this in #blendercoders
> in August 2017.
>
>> For changing shader colors, personally I'm not entirely convinced it's
>> worth the code complexity. The trend is towards interactive previews and
>> realtime raytracing, and with a finite amount of development time it
>> seems more useful to me to improve that workflow. The other reservation I
>> have about this is whether it really works in the more complex scenes
>> where almost every surface is textured. You might have hair, SSS,
>> volumes, transparency, and it just gets harder and harder to do a
>> meaningful approximation. But it's the more complex scenes where
>> relighting is most useful.
>>
>
> I agree that the question of code complexity and implementation effort is
> pretty important, especially if the general trend is towards enabling an
> artistic workflow of rapidly tweaking material colors simply by having
> very fast previews. In a world where, as with Eevee, most scenes can
> render a reasonable approximation in the viewport nearly instantly, this
> sort of feature is much less compelling, as Zauber points out. This seems
> like a very valid criticism, especially given limited development time.
>
> However, I think it is possible to do this sort of shader relighting even
> in arbitrarily complex scenes, for the simple algebraic reason that a
> path's throughput is a product of inverse PDFs and BsdfEvals that it
> accumulates, and therefore if we can write the BsdfEvals as affine
> functions of the (unknown at render time) material parameters, then the
> final PathRadiance will necessarily be a polynomial in these unknown
> parameters, regardless of how complicated any other BSDFs are, or if other
> complicated phenomena are modeled (SSS, volumetrics, etc.).
>
> To put it another way, we are quite limited in the sorts of material
> parameters we can relight (only those with an affine effect on the
> material's BsdfEvals and inverse PDFs), but we are unconstrained in what
> phenomena occur in *other* materials elsewhere in the scene. For example,
> one can't relight the color of a volumetric scatter, or change a glossy
> BSDF's roughness, or change an SSS color, because these produce non-affine
> effects on the path throughput. However, one *can* relight an object
> (whose material admits relighting) even in the presence of these fancier
> materials on *other* objects. This is not an approximation: done with LPEs
> in Iray, the relighting is exact, even with all these fancier materials
> elsewhere in the scene (just not on the recolored object).
>
> My suggested implementation even allows recoloring of textured objects
> that use normal maps, etc.; it's all fine so long as the recoloring is
> applied multiplicatively (in scene-referred terms; see the example complex
> material in my write-up PDF). (Another valid criticism of my suggested
> feature is that it's unclear whether multiplicatively recoloring an object
> in scene-referred terms is an artistically useful thing to be able to do
> efficiently post-render.)
>
> Apologies for the long-winded explanation. All of this may already have
> been obvious to you, but I figured I'd spell it out exactly, because it
> was not obvious to me just how powerful this technique is (and that it's
> generally exact, not an approximation, even with complicated phenomena
> elsewhere in the scene) until I read the Nvidia paper and thought about it
> for a while.
>
>> It really depends what this is intended to be used for, though. For
>> example, I can imagine shader relighting being great for e.g. an interior
>> design application where you want the user to be able to pick a custom
>> color, or where you want to save render time when rendering many
>> variations (maybe that's the kind of thing Iray had in mind?). It's less
>> obvious to me how it fits into an artistic workflow, where you are
>> generally changing many settings at will, or navigating the viewport, and
>> it's hard to pick a few settings in advance that you want a quick preview
>> of later. The intended use cases should then also inform how the UI
>> works.
>>
>
> This also seems like a very valid criticism. My proposed shader
> relighting design would make it prohibitively expensive to recolor more
> than a couple of different materials, and maybe that just doesn't fit well
> into a typical artistic workflow.
>
> Thanks again for your quick reply. I'll probably poke around a little bit
> more, and look at the patch you linked, to see if I could plausibly even
> prototype the shader relighting.
>
> -Peter Schmidt-Nielsen
>
> On Thu, Oct 11, 2018 at 4:34 PM Peter Schmidt-Nielsen <
>> schmidtnielsenpeter at gmail.com> wrote:
>>
>>> Hello folks,
>>>
>>> I'd like to add a feature to Cycles (and Blender) to support post-render
>>> recoloring of lights, and limited recoloring of materials. Post-render
>>> recoloring of lights is a feature implemented in Maxwell, which they call
>>> "multilights". Limited recoloring of materials can be achieved in Iray
>>> using their Light Path Expressions. The idea is to support both kinds of
>>> post-render relighting with a simple design, though without the full
>>> generality and flexibility of Iray's LPEs.
>>>
>>> I think Cycles users would benefit enormously from such a feature. The
>>> end-user experience would be that a user could change color parameters on
>>> specially marked lights and materials after the render, and the image
>>> would update instantly to exactly what the render would have produced
>>> with the new color parameters.
>>>
>>> I have a description of my plan, which I wrote up about a year ago when
>>> I first started poking around Cycles:
>>>
>>> http://web.mit.edu/snp/Public/polynomialrelighting.pdf
>>>
>>> However, I'm not very familiar with the Cycles code base beyond the very
>>> small amount of poking I did a year ago, and would love some advice and
>>> direction on how feasible it is to implement this feature.
>>>
>>> A TL;DR for my write-up linked above:
>>>
>>> 1) One can implement Maxwell-style "multilights" by just having one
>>> render buffer for each relightable light, and writing path contributions
>>> into the appropriate light's buffer.
>>> 2) I think one can implement Iray-style post-render changes to material
>>> colors without implementing a full-on LPE system, if you just track a
>>> little extra data with PathRadiances and throughputs, and have a bunch of
>>> extra render buffers.
>>>
>>> Is anyone else interested in working on such a feature, or at the very
>>> least giving me some advice on this?
>>>
>>> Thanks,
>>>
>>> -Peter Schmidt-Nielsen
>>>
>>> P.S. I mentioned this about a year ago in #blendercoders, but I got busy
>>> and didn't keep working on it. A few months ago someone in #blendercoders
>>> recommended that I ask here.