[Bf-committers] Multisample Z-buffer

Joe Eagar joeedh at gmail.com
Sun Jan 21 04:06:00 CET 2007

Reuben Martin wrote:
> Back on Saturday 20 January 2007 17:12, Joe Eagar was like:
>> Reuben Martin wrote:
>>> Right now the trick is to have the Z-combine node do the anti-aliasing
>>> for us. In the documentation Ton writes: "Note that Z values are still
>>> aliased, only 1 sample per pixel for Z is delivered to the compositor, so
>>> the masks can have small artifacts."
>>> My question is this: would it be possible to add a rendering option so
>>> that elements such as the z-buffer are rendered at a higher resolution in
>>> order to provide more than 1 sample per pixel? For instance, you could
>>> have an option to render these elements at a scale factor of 2, 3, 4...
>>> and get 4, 9, 16... z-buffer samples per pixel respectively. My thinking
>>> is that maybe this would result in fewer artifacts and more precision.
>>> (Which would be especially helpful when doing DOF in post.) Is this
>>> thinking correct?
>> The problem here is that AA isn't just rendering pixels of an image
>> 5, 8, 11, 16 times bigger and then resampling down; you actually render
>> each pixel offset by a jitter value.  Really simple downsampling doesn't
>> work nearly as well, especially on near-vertical lines.
> Well, my thinking isn't that this would be a replacement for AA, but rather 
> that a higher resolution z-buffer would provide sub-pixel precision for the 
> AA masking so that when "a post-process function is used which fills in the 
> jaggies with interpolated values" then it could use this sub-pixel 
> information in order to create more accurate interpolation.
> -Reuben
Oh I get you now.  It could even be an irregular Z-buffer, so it'd 
perfectly match the multisamples of the original image.  Though either 
simple or irregular, it'd take a lot of memory of course :)
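For anyone following along, here's a minimal sketch (in Python, not actual
Blender code; the function names are just illustrative) of the supersampled-Z
idea under discussion: render Z at k times the final resolution so each
output pixel keeps k*k depth samples instead of one, then use the fraction
of samples passing a depth test as a soft sub-pixel mask rather than a
binary one.

```python
def downsample_z(z_hires, k):
    """Group a (h*k) x (w*k) Z-buffer into k*k depth samples per
    output pixel, instead of collapsing them to a single Z value."""
    h, w = len(z_hires) // k, len(z_hires[0]) // k
    return [[[z_hires[y * k + j][x * k + i]
              for j in range(k) for i in range(k)]
             for x in range(w)]
            for y in range(h)]

def coverage(samples, z_threshold):
    """Fraction of a pixel's sub-pixel Z samples in front of a
    compositing threshold -- a soft (anti-aliased) mask value."""
    return sum(z < z_threshold for z in samples) / len(samples)

# Example: a 4x4 hi-res Z-buffer, downsampled to 2x2 with k=2.
# Depth 1 is a near object; depth 9 is the far background.
z = [[1, 1, 9, 9],
     [1, 1, 9, 9],
     [1, 9, 9, 9],
     [1, 1, 9, 9]]
pixels = downsample_z(z, 2)
print(coverage(pixels[0][0], 5))  # fully covered pixel -> 1.0
print(coverage(pixels[1][0], 5))  # edge pixel -> 0.75, not jagged 0/1
```

Note this is the "really simple" uniform-grid downsampling; the renderer's
actual multisamples are jittered per pixel, which is why an irregular
Z-buffer matching those sample positions would fit better.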

