[Bf-committers] Does Cycles waste half its possible performance or am I wrong?

Tobias Oelgarte tobias.oelgarte at googlemail.com
Sun Oct 28 12:16:22 CET 2012


On 28.10.2012 03:19, Brecht Van Lommel wrote:
> In fact this method can reduce noise in some situations, since uneven
> sampling of the pixel filter can introduce noise. It also avoids
> rendering padding pixels on tiles which cost render time too.
> http://lgdv.cs.fau.de/publications/publication/Pub.2006.tech.IMMD.IMMD9.filter/
>
That is correct. But some requirements have to be fulfilled to get
better results or to utilize its advantages. The first is that this
filter is optimized for importance sampling, because importance
sampling in combination with a partially negative filter (like
Mitchell) increases noise. But so far Cycles does not use importance
sampling (more samples for unstable/critical regions) and therefore
does not utilize the possible advantage of this sampling method.
Compared to box filtering (1.0x1.0) the Gaussian filter acts like a
sharpening filter in this scenario, because most samples are taken
close to the center of a pixel. But you will also catch data from
outside this pixel's area/frustum, which ideally would belong to
another pixel. Despite having sampled this area, it does not
contribute to the color of the rightful pixel. So it is basically
smoothing at the cost of correctness, without any error reduction
(you still get errors from the samples which don't belong to the
pixel). That is not the same as using a 1x1 box filter and then
blurring the final image, taking the number of samples into account.
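
To illustrate the distinction, here is a minimal Python sketch of
filter importance sampling as I understand the linked paper (the
function names, the sigma and the truncation radius are made up for
illustration and are not Cycles' actual code or settings):

import random

def sample_filter_offset(sigma=0.5, max_radius=1.5):
    # Draw an offset from the pixel center distributed according to a
    # (truncated) Gaussian pixel filter; rejection keeps the offset
    # inside the filter support.
    while True:
        dx = random.gauss(0.0, sigma)
        dy = random.gauss(0.0, sigma)
        if abs(dx) <= max_radius and abs(dy) <= max_radius:
            return dx, dy

def render_pixel(px, py, radiance_at, num_samples=64):
    # Filter importance sampling: the sample *positions* follow the
    # filter distribution, every sample gets the same weight, and each
    # sample contributes only to its own pixel.  radiance_at stands in
    # for tracing a camera ray through that film position.
    total = 0.0
    for _ in range(num_samples):
        dx, dy = sample_filter_offset()
        total += radiance_at(px + 0.5 + dx, py + 0.5 + dy)
    return total / num_samples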

> But regarding the 2x less noise, there's two wrong assumptions. First
> is that you can't just average 1 and 4, it would be closer to
> something like 1.333x I think (the fireflies at the center of the
pixel will still take long to get rid of even if some others get
> easier).

That's right. But you will also include fireflies/errors (more
rarely, of course) which aren't actually meant to be in this pixel.
So you get a slight blur, but without averaging out the error (no
contribution to the rightful pixels). It's more like sampling a 1.333
times bigger image, which is effectively 1.78 (1.333²) times larger
if you count the pixels.
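
Just to spell out the arithmetic behind that number:

linear_factor = 1.333             # estimated per-axis spread factor
area_factor = linear_factor ** 2  # ~1.78: the pixel count grows with
                                  # the square of the linear factor
print(area_factor)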

> Also, the shape of the gaussian function makes it so that for typical
> filter widths most of the samples contribute little to other pixels,
> which lowers the noise reduction further. At the corners they
> contribute evenly yes, but there's no linear transition to the center.

No, it does not. That is no actual noise reduction; you just leave
out parts of the scene that contribute less to the pixels. If you
make the filter very small, then you only sample the center of each
pixel, ignoring the area (the gap between frustums) in between
pixels, which leads to moiré patterns. If you make it large enough to
get an even contribution, then it already wastes samples that could
contribute to other pixels. If you make it very large, then you waste
even more and get blurry results without much noise reduction.
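
To put a number on that tradeoff, here is a rough Python sketch that
estimates how much of a separable Gaussian filter's weight stays
inside the pixel it is centered on, for a given filter width (the
sigma = width/6 mapping, i.e. truncating the kernel near 3 sigma, is
only an assumption for illustration; Cycles may map its filter-width
setting to sigma differently):

import math

def weight_inside_center_pixel(width=1.5, sigma_per_width=1.0 / 6.0):
    # Fraction of a separable 2D Gaussian pixel filter that falls
    # inside the 1x1 pixel it is centered on.
    sigma = width * sigma_per_width
    # Weight within +/- 0.5 px of the center along one axis.
    frac_1d = math.erf(0.5 / (sigma * math.sqrt(2.0)))
    # The filter is separable, so the 2D fraction is the square.
    return frac_1d ** 2

print(weight_inside_center_pixel(1.5))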

Using an area-average filter (the ratio of the areas that are covered
by a circle [r=0.5px] around the ray's hit point on the film) should
not do this. It usually affects four pixels at a time (except in the
exact-center case), cancelling out the errors for each of those
pixels per ray. You effectively act as if you gave the ray some
thickness. Using tiles (to save memory) you would just need to add
RGB on the film and increase the area coverage by the ratio. At the
end you can divide the RGB separately by the area coverage to get the
final result. To avoid overflow you can reset pixels with high values
and merge/divide them into a second RGB/ratio plane. That way you
minimize the needed divisions.
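
A minimal Python sketch of how I picture that accumulation (made-up
names, not an actual Cycles patch; a 1x1 square footprint stands in
for the r=0.5px disk, since the box/pixel overlap area is trivial to
compute while the exact circle/square overlap needs more geometry):

import numpy as np

def splat_sample(film_rgb, film_cov, x, y, rgb):
    # Treat the sample as a unit square centered on its film position
    # and give each of the up to four overlapped pixels the RGB value
    # weighted by the covered area; accumulate the coverage separately.
    rgb = np.asarray(rgb, dtype=float)
    h, w = film_cov.shape
    x0, y0 = int(np.floor(x - 0.5)), int(np.floor(y - 0.5))
    fx, fy = (x - 0.5) - x0, (y - 0.5) - y0
    for dy, wy in ((0, 1.0 - fy), (1, fy)):
        for dx, wx in ((0, 1.0 - fx), (1, fx)):
            px, py = x0 + dx, y0 + dy
            if 0 <= px < w and 0 <= py < h:
                area = wx * wy          # overlap area with this pixel
                film_rgb[py, px] += area * rgb
                film_cov[py, px] += area

def resolve(film_rgb, film_cov):
    # Final image: divide the accumulated RGB by the accumulated
    # coverage, once, at the very end.
    return film_rgb / np.maximum(film_cov, 1e-8)[..., None]

# Example: a 4x4 film (or tile) with a single splatted sample.
film_rgb = np.zeros((4, 4, 3))
film_cov = np.zeros((4, 4))
splat_sample(film_rgb, film_cov, 1.3, 2.7, (0.8, 0.5, 0.2))
image = resolve(film_rgb, film_cov)

With tiles you would keep such a film_rgb/film_cov pair per tile and
only divide once per pixel at the very end (or when resetting a pixel
to avoid overflow), which is how the number of divisions stays small.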

Tobias



