[Soc-2016-dev] Weekly Report #08 - Cycles Denoising

Lukas Stockner lukas.stockner at freenet.de
Sun Jul 17 04:50:09 CEST 2016


Hi!

First of all, regarding timing: So far, I've usually tried to finish one block of commits before sending the report. Of course, I can just as well send the reports on Fridays, and I'll start doing so next week.

Generally, the next two weeks are exam time at my university, so progress might be a little slow - I'll keep working on GSoC, though.

For the Denoising project, I spent most of the time working on improving detail preservation by using a shadow feature pass, which is generated by tracking the fraction of direct light sampling shadow queries that were successful. The pass doesn't work exactly like a shadow catcher, since it doesn't account for the different brightness of the lamps, but that's not needed for denoising - generally, some correlation between the feature pass and the image is enough.
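To make the idea concrete, here's a minimal sketch of what the tracking boils down to - the names are made up for illustration and this is not the actual Cycles kernel code; "successful" is interpreted as the shadow ray reaching the lamp:

    /* Hypothetical per-pixel accumulator for the shadow feature pass. */
    struct ShadowFeature {
      float num_queries;    /* direct light sampling shadow rays traced */
      float num_unoccluded; /* rays that actually reached the lamp */
    };

    /* Called once per direct light sampling shadow query. */
    void accumulate_shadow_query(ShadowFeature *pixel, bool blocked)
    {
      pixel->num_queries += 1.0f;
      if (!blocked)
        pixel->num_unoccluded += 1.0f;
    }

    /* The feature pass value is simply the unoccluded fraction. */
    float shadow_feature_value(const ShadowFeature *pixel)
    {
      return (pixel->num_queries > 0.0f) ?
             pixel->num_unoccluded / pixel->num_queries : 1.0f;
    }

Since only the fraction matters, the absolute brightness of each lamp never enters the pass - which is exactly why it's cheaper than a real shadow catcher.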
However, the shadow pass (aka the visibility pass in denoising papers) has a big disadvantage compared to the other features: It's very noisy. LWR does account for noise in the feature passes, but only by disregarding them if they're noisy - which kind of makes them useless. If noisy features end up in the final regression step, their noise carries over directly to the filtered image, since the noise in the features is correlated with the noise in the image (that effect can currently be seen in the front window of the BMW scene, for example).
But there is a way around that problem: Prefiltering. Feature passes can be filtered far more aggressively than the final result, since the user never actually sees them. A pretty smart approach to prefiltering is presented in "Nonlinearly Weighted First-order Regression for Denoising Monte Carlo Renderings" (http://benedikt-bitterli.me/nfor/): By splitting the sampling process into even and odd samples and generating two feature passes, one from each group of samples, you get two unbiased estimates of the feature.
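In (hypothetical) code, the split is as simple as it sounds - alternating samples are routed into two separate half buffers:

    /* Illustrative sketch of the even/odd sample split, not actual
     * Cycles code: each half buffer accumulates every other sample,
     * giving two independent, unbiased estimates of the same feature. */
    void accumulate_feature_sample(float *half_even, float *half_odd,
                                   int sample, float value)
    {
      if (sample % 2 == 0)
        *half_even += value;
      else
        *half_odd += value;
    }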
Now, the Non-Local Means filter works by comparing the small region around each neighboring pixel with the region around the current center pixel and calculating weights based on their similarity. Since both feature passes contain the same underlying data, you can use one of the passes to compute the weights for filtering the other one, which leads to great improvements in filter quality.
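A rough sketch of that cross-filtering (single channel, no boundary handling - the caller has to keep the search window and patches inside the image; names and parameters are illustrative, not the actual filter code):

    #include <math.h>

    /* Patch weights are computed from half buffer A but applied to half
     * buffer B, so the noise in the weights is independent of the noise
     * in the data being filtered. */
    float nlm_cross_filter_pixel(const float *A, const float *B, int w,
                                 int px, int py, int radius, int patch,
                                 float k2)
    {
      float sum = 0.0f, weight_sum = 0.0f;
      int patch_size = (2*patch + 1) * (2*patch + 1);
      for (int qy = py - radius; qy <= py + radius; qy++) {
        for (int qx = px - radius; qx <= px + radius; qx++) {
          /* Mean squared difference of the patches around p and q in A. */
          float dist = 0.0f;
          for (int dy = -patch; dy <= patch; dy++)
            for (int dx = -patch; dx <= patch; dx++) {
              float d = A[(py + dy)*w + (px + dx)] - A[(qy + dy)*w + (qx + dx)];
              dist += d*d;
            }
          float weight = expf(-dist / (patch_size * k2));
          /* Apply the weight from A to the other half buffer B. */
          sum += weight * B[qy*w + qx];
          weight_sum += weight;
        }
      }
      return sum / weight_sum;
    }

Filtering the other half the same way (with the roles of A and B swapped), the two filtered halves can then be averaged into the final prefiltered feature.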
But if that approach works so well for decorrelating the filter weights from the filtered data, why not apply it to the actual image as well? Indeed, that's the core of the paper: Splitting the regular image into two groups as well and denoising each of them based on the (prefiltered) feature passes of the other one.
In practice, there are a few problems: First of all, it's significantly slower. Prefiltering the feature passes might also lose small details. Finally, there is a hidden issue with random numbers: While splitting the random samples in half produces two unbiased results in theory, the low-discrepancy sequences like Sobol that are used for rendering aren't actually random and might cause artifacts.
Therefore, my current plan is to implement prefiltering only for the shadow feature for now. Later, it might be interesting to add that cross-filtering for the main image as an option (a quality-vs-speed tradeoff checkbox). The prefiltering code isn't finished yet, but hopefully will be in a few days.

Apart from that, many users have reported large areas of the denoised image turning black. Unfortunately, I haven't been able to reproduce that effect with any of the shared .blend files, but I'll keep trying to fix the problem.
I have been able to reproduce a bug where individual pixels in the feature passes are NaN. However, the bug behaves extremely weirdly - in Debug mode, the code runs fine without triggering any FPEs, and in RelWithDebInfo mode, the reported line numbers don't match the code at all, making it useless for debugging. Most likely, the only remaining option is to comment out parts of the code and see if the issue disappears...


Regarding other Cycles coding: While hunting for denoiser bugs, I found and fixed four numerical issues in the regular Cycles code - one in the Beckmann distribution code, one in a helper math function, one in the Normal Map node and one in the calculation of differentials after refraction.
Also, I wrote a quick implementation of the Cryptomatte system for Cycles (D2106): Its goal is to allow for flexible ID matte pass generation in the compositor. Currently, Cycles just saves the Object and Material ID of the first sample into the respective passes. That's a problem because it doesn't allow for anti-aliased ID mattes and fails when using DoF/MoBlur. With the new system, all IDs encountered in a pixel are tracked along with their weight (essentially, how often they appear). Then, multiple passes are produced: one containing the ID-weight pair with the highest weight, one containing the second-highest, and so on. In the compositor, these passes can then be combined to generate accurate ID mattes for any ID, and those mattes work correctly with transparency, anti-aliasing, DoF and MoBlur. D2106 also contains a compositor node patch.
This code is not meant for master, but rather as a reference to see whether the feature is useful. Personally, I think it's a pretty simple and elegant solution to the ID problem.
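To illustrate the per-pixel tracking behind it, here's a rough sketch (purely illustrative - the names and structure are mine, not the code from D2106):

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    struct IDWeight {
      uint32_t id;
      float weight;
    };

    /* Each sample adds its coverage to the entry for the ID it hit. */
    void add_sample(std::vector<IDWeight> &pixel, uint32_t id, float coverage)
    {
      for (IDWeight &entry : pixel) {
        if (entry.id == id) {
          entry.weight += coverage;
          return;
        }
      }
      pixel.push_back(IDWeight{id, coverage});
    }

    /* After rendering, write the highest-weight pairs into the ranked
     * passes: passes[0] gets the most common ID, passes[1] the second
     * most common one, and so on. */
    void write_passes(std::vector<IDWeight> &pixel, IDWeight *passes,
                      int num_passes)
    {
      std::sort(pixel.begin(), pixel.end(),
                [](const IDWeight &a, const IDWeight &b) {
                  return a.weight > b.weight;
                });
      for (int i = 0; i < num_passes; i++)
        passes[i] = (i < (int)pixel.size()) ? pixel[i] : IDWeight{0, 0.0f};
    }

A compositor matte for some ID then just sums the weights of all pass slots whose ID matches, which is what lets the result stay correct under anti-aliasing, DoF and MoBlur.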

So, that's my report for the 8th week. I'll continue to track down bugs and finish the shadow feature and prefiltering, of course, and will hopefully have some nice denoised shadows in the next report!

Lukas
