[Bf-cycles] Varying number of samples per pixel

Jaros Milan milan.jaros at vsb.cz
Sat Jun 24 09:12:24 CEST 2017


Hi,

you can try to look at this patch: https://developer.blender.org/D2662.

For example, I was able to cut the rendering time from 8 hours to 8 minutes at 99% quality.

Best regards
Milan Jaros



On 24 Jun 2017 at 1:44, Escot Lucas <lucas.escot at ens-lyon.fr> wrote:

Thanks to everyone for answering; I'm now looking at Lukas Stockner's work to see what could be used for my needs.

Now regarding the following remarks:

> I would avoid discarding samples, as you will waste compute power; instead, find a way to reduce samples from the start


> It's difficult to tell without implementation details: what is the exact reason samples must be discarded? Transporting rays to another surface sounds pretty similar to BSDF scattering, where we use importance sampling and divide the throughput by the probability density function (pdf). Basically, rather than counting the number of valid samples, we increase the weight of the valid samples to compensate for the invalid ones.

Sadly, I can't delve too much into implementation details (it's part of an internship and I don't own the code), nor can I be too explicit about the transportation work.

But in a more abstract manner, imagine a setting in which paths that intersect an arbitrary object are allowed to exist only if the end of such a path obeys a specific pattern (... -> diffuse -> light, for example). At the intersection with said object, during path construction, I have no option but to keep building the path until I know for sure whether it is valid or not.
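To make that concrete, here is a minimal sketch of the deferred check I mean (hypothetical names and types, nothing from the actual code): the path is built to completion first, and only then tested against the required end pattern.

    #include <vector>

    /* Hypothetical sketch: vertex types recorded while the path is built. */
    enum VertexType { DIFFUSE, GLOSSY, TRANSMISSION, LIGHT };

    /* True only if the finished path ends in ... -> diffuse -> light,
     * which cannot be known before the path is complete. */
    static bool path_matches_end_pattern(const std::vector<VertexType>& path)
    {
        return path.size() >= 2
            && path[path.size() - 2] == DIFFUSE
            && path.back() == LIGHT;
    }

Only once this check fails do I know the whole path has to be thrown away, which is why the discard happens so late.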

The difference with BSDF sampling is that in the latter we try to avoid unlikely paths, yet when they still occur, they are valid, i.e. physically correct. In my (admittedly very specific) case, when we have constructed a path that does not match the end constraint, it is not unlikely per se; it simply shouldn't exist.

Again, sorry for not being able to explain much about the justification for wanting to discard paths.

Lucas Escot.

Le 2017-06-23 20:33, Brecht Van Lommel a écrit :

Storing the number of samples is usually not needed, and often impractical to combine with other sampling decisions taken along the path.

It's difficult to tell without implementation details: what is the exact reason samples must be discarded? Transporting rays to another surface sounds pretty similar to BSDF scattering, where we use importance sampling and divide the throughput by the probability density function (pdf). Basically, rather than counting the number of valid samples, we increase the weight of the valid samples to compensate for the invalid ones.
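As a minimal sketch of that weighting idea (hypothetical names, not the actual kernel code), one standard instance is Russian roulette: a path that survives a probabilistic decision with probability p carries a weight of 1/p, so the estimate stays unbiased while the per-pixel sample count stays constant.

    #include <random>

    /* Sketch: keep a path with probability p_keep and compensate its
     * contribution, instead of discarding it and tracking a count.
     * E[result] = p_keep * (contribution / p_keep) = contribution. */
    float sample_with_compensation(std::mt19937& rng, float p_keep, float contribution)
    {
        std::uniform_real_distribution<float> u(0.0f, 1.0f);
        if (u(rng) >= p_keep)
            return 0.0f;               /* terminated path contributes zero */
        return contribution / p_keep;  /* survivor carries the extra weight */
    }

This only works when the decision is probabilistic with a known pdf, of course, which is why the exact reason for discarding matters.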


On Fri, Jun 23, 2017 at 6:59 PM, Mohamed Sakr <3dsakr at gmail.com> wrote:
Hey,

The easiest approach is to store the number of samples per pixel in an image and divide the final image by that samples image.
I would avoid discarding samples, as you will waste compute power; instead, find a way to reduce samples from the start (this needs a redesign of how the kernel works). Lukas Stockner may have more ideas about this, as I believe he tried to build an adaptive sampler before.
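A minimal sketch of the samples-image idea (hypothetical buffers, not actual Cycles code): accumulate radiance and a per-pixel sample count side by side, then normalize each pixel by its own count.

    #include <vector>

    struct FilmBuffers {
        std::vector<float> radiance;  /* running sum of kept sample values */
        std::vector<float> count;     /* number of kept samples per pixel */
    };

    void add_sample(FilmBuffers& film, int pixel, float value, bool kept)
    {
        if (!kept)
            return;                    /* discarded: neither sum nor count grows */
        film.radiance[pixel] += value;
        film.count[pixel] += 1.0f;
    }

    float resolve_pixel(const FilmBuffers& film, int pixel)
    {
        float n = film.count[pixel];
        return n > 0.0f ? film.radiance[pixel] / n : 0.0f;
    }

(Both vectors are assumed to be pre-sized to the image resolution; a real implementation would fold the count into an extra film pass.)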

cheers,
Mohamed Sakr

On Fri, Jun 23, 2017 at 1:51 PM, Escot Lucas <lucas.escot at ens-lyon.fr> wrote:

Hello everyone,

I originally asked the following question on the #blendercoders IRC but was advised to send it here.

Context: I am currently working on a modified version of Cycles in which the user has control over how light paths behave: they are able to transport/move rays hitting a specific surface onto another one.

Such a modification implies that during the path tracing process, some paths/samples must be discarded (to do that, I simply don't add their contribution to the final image, in kernel_path.h / kernel_path_trace()).

This means some pixels effectively receive fewer samples, since some of their samples may be discarded; however, Cycles still divides each pixel's total contribution by the global sample count, which is assumed to be constant over the entire image.

Hence my question: is there a simple way to make this last operation (averaging pixel samples) rely on a local, per-pixel number of samples rather than a constant number of samples for the entire image?
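As a tiny worked example of what goes wrong (made-up numbers): suppose a pixel receives 100 samples whose kept contributions average 1.0, but 30 of them are discarded.

    float sum_kept      = 70.0f * 1.0f;      /* sum over the 70 kept samples */
    float global_avg    = sum_kept / 100.0f; /* 0.7 -- pixel comes out too dark */
    float per_pixel_avg = sum_kept / 70.0f;  /* 1.0 -- what I'm asking for */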

I hope this is the right place to ask such a question, and that some of you will have an answer.

Regards.

--

Lucas Escot
L3 Informatique Fondamentale
ENS de Lyon
Telephone: 06 88 62 63 86

_______________________________________________
Bf-cycles mailing list
Bf-cycles at blender.org
https://lists.blender.org/mailman/listinfo/bf-cycles
