[Bf-committers] Blender and QMC sampling

Matt Ebb matt at mke3.net
Sun Jun 13 02:44:29 CEST 2010


Simply put, the QMC stuff currently in Blender's ray shading code is not much more than a fancy random number generator that produces quasi-random samples in the unit square. These sample locations are then transformed into other representations, such as the directions in which to trace rays.
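
To make that concrete, here's a rough, self-contained sketch (an illustration only, not Blender's actual rayshade code) of the idea: a radical-inverse (Halton-style) generator producing quasi-random points in the unit square, and one common way of mapping such a point to a hemisphere direction for ray tracing:

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Radical inverse of index i in the given base: the building block of
 * Halton-style quasi-random sequences. */
static double radical_inverse(unsigned int i, unsigned int base)
{
	double inv_base = 1.0 / (double)base;
	double result = 0.0, f = inv_base;
	while (i > 0) {
		result += (double)(i % base) * f;
		i /= base;
		f *= inv_base;
	}
	return result;
}

/* Map a unit-square sample (u, v) to a cosine-weighted direction on the
 * hemisphere around +Z, as one example of turning sample locations into
 * ray directions. */
static void square_to_hemisphere(double u, double v, double dir[3])
{
	double r = sqrt(u);
	double phi = 2.0 * M_PI * v;
	dir[0] = r * cos(phi);
	dir[1] = r * sin(phi);
	dir[2] = sqrt(1.0 - u);
}

int main(void)
{
	unsigned int i;
	for (i = 1; i <= 4; i++) {
		double u = radical_inverse(i, 2);   /* base 2: van der Corput */
		double v = radical_inverse(i, 3);   /* base 3 */
		double d[3];
		square_to_hemisphere(u, v, d);
		printf("sample %u: (%.3f, %.3f) -> dir (%.3f, %.3f, %.3f)\n",
		       i, u, v, d[0], d[1], d[2]);
	}
	return 0;
}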

It doesn't really mean much relative to the task of implementing stochastic motion blur / depth of field in Blender's rasteriser. For that you're talking about rearranging the deepest guts of the render pipeline, either towards something like REYES, or pure raytracing, or some kind of hybrid rasterisation approach. In any case it would be a very big, complicated job, and one that is not really made any easier by having a slightly better random sample generator available :)
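
For what it's worth, here's why the two things are mostly unrelated: in a pure raytracing approach, stochastic DOF and motion blur just mean giving each camera ray its own lens offset and shutter time, drawn from exactly these kinds of unit-square samples. A very hand-wavy sketch of that (hypothetical names, not existing Blender code):

/* Hand-wavy sketch, hypothetical names only: what a per-ray DOF / motion
 * blur sample would look like in a raytracing-style camera. The point is
 * that the current tile rasteriser has no per-sample hook where this
 * could plug in. */
typedef struct Ray {
	float origin[3];
	float dir[3];
	float time;       /* shutter time in [0, 1] for motion blur */
} Ray;

/* u, v: a QMC unit-square sample reused as a lens offset (depth of field);
 * t: another sample used as the shutter time (motion blur). */
static void camera_ray_with_dof_mb(float u, float v, float t,
                                   float aperture, Ray *r)
{
	/* Jitter the ray origin on the lens (depth of field). */
	r->origin[0] = aperture * (2.0f * u - 1.0f);
	r->origin[1] = aperture * (2.0f * v - 1.0f);
	r->origin[2] = 0.0f;

	/* Give the ray its own time; the scene would have to be evaluated
	 * (objects transformed / deformed) at this time before intersection. */
	r->time = t;

	/* The direction would be recomputed so the ray still passes through
	 * the focal point of its pixel sample; omitted here. */
	r->dir[0] = 0.0f;
	r->dir[1] = 0.0f;
	r->dir[2] = -1.0f;
}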

cheers

Matt



On 13/06/2010, at 10:04, Magnus Löfgren wrote:

> Hi,
> 
> Sorry for the endless curiosity around motion blur and depth of field in
> blender.
> 
> Blender has Adaptive QMC for things like ambient occlusion, glossy
> reflections and area lights.
> What's the showstopper for implementing motion blur and depth of field
> using this technique? Are there any specific limitations here? I know it's
> probably time-consuming, but I'm referring to actual limitations in the
> render architecture.
> 
> Would rendering transform motion blur using instances interpolated with
> Adaptive QMC be impossible with the current implementation? This wouldn't
> account for deformation blur of course, but it's a small step at least.
> I've heard some mention of shading and sampling having to be decoupled;
> is this the issue?
> 
> Thanks
> _______________________________________________
> Bf-committers mailing list
> Bf-committers at blender.org
> http://lists.blender.org/mailman/listinfo/bf-committers


