[Bf-committers] Fwd: Relevant Siggraph paper for Durian Rendering request (true 3d motion blur)

Magnus Löfgren lofgrenmeister at gmail.com
Wed Feb 24 19:54:34 CET 2010


Hi,

First of all, it was the original link which I thought would be most
interesting; it is very in-depth: http://www.cs.columbia.edu/cg/mb/mb09.pdf

> I suspect the kind of jittered sampling used for motion blur in other
> renderers is not possible given Blender's rendering architecture.
> Maybe if blender had an option to render with pure raytracing. But
> even then, there would have to be a lot of work put in to handle
> moving geometry in an optimized way.

If that is the case, what about some kind of hybrid method that
interpolates between the full-scene motion blur samples, with some sort of
spline-based interpolation between the passes?
It could possibly be applied to deep shadow maps as well: render the
actual maps at the specified number of samples over time and then
interpolate between them, for properly motion-blurred shadows.
The same could possibly be applied to reflection maps and animated texture
maps as well.
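To make the idea above concrete, here is a minimal per-pixel sketch (my own illustration, not Blender code; all function names are hypothetical): a few full-scene renders taken at evenly spaced shutter times are treated as spline control points, and cheap Catmull-Rom "virtual" sub-samples are blended in between them before averaging.

```python
# Illustrative sketch of spline interpolation between full-scene
# motion-blur samples. Hypothetical names; not Blender internals.

def catmull_rom(p0, p1, p2, p3, t):
    """Catmull-Rom spline value between p1 and p2 at t in [0, 1]."""
    return 0.5 * ((2 * p1)
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t * t
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t * t * t)

def blend_with_virtual_samples(pixel_over_time, subdiv=4):
    """pixel_over_time: values of one pixel rendered at evenly spaced
    shutter times. Inserts `subdiv` interpolated sub-samples between
    each pair of real renders, then averages everything together."""
    total, count = 0.0, 0
    n = len(pixel_over_time)
    for i in range(n - 1):
        # Clamp neighbour indices so the spline is defined at the ends.
        p0 = pixel_over_time[max(i - 1, 0)]
        p1 = pixel_over_time[i]
        p2 = pixel_over_time[i + 1]
        p3 = pixel_over_time[min(i + 2, n - 1)]
        for s in range(subdiv):
            total += catmull_rom(p0, p1, p2, p3, s / subdiv)
            count += 1
    total += pixel_over_time[-1]  # include the final real sample
    count += 1
    return total / count
```

Of course this only smooths shading and coverage that is already present in the rendered passes; it cannot reconstruct geometry that never appears in any sample, which is exactly the strobing problem described below.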

Not sure how to approach properly motion-blurred specular highlights, though.

> The motion vector method is actually very close to what the vector
> blur node does in Blender. It includes some improvements like smarter
> filling of holes, but indeed also still requires the use of render
> layers to separate background from foreground.

But vector blur simply isn't accurate enough for most situations, except for
objects moving along a relatively linear path with minimal rotation.
Imagine a rapidly spinning object (say, 360+ degrees over one frame):
vector blur doesn't produce even remotely accurate results. Even ignoring
overlapping geometry, shadows and reflections, you just can't tell from
one isolated vector-blurred frame what the motion actually is, unless the
object is rotating slowly and moving along a relatively linear path.

And while the "mblur" technique produces perfect, physically correct motion
blur, its results are limited by the number of samples. For very rapidly
moving or rotating objects (say, an airplane propeller spinning at several
thousand revolutions per minute), even a 32-pass full-scene motion blur
will produce noticeable strobing. And the cost of re-rendering a
frame 32 times is huge.
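Some back-of-the-envelope arithmetic (my own illustrative numbers, not measurements from any renderer) shows why the propeller case strobes: what matters is how many degrees the blade sweeps between consecutive samples while the shutter is open.

```python
# Hypothetical helper: angular gap between adjacent full-scene samples
# for a rotating object, given rpm, frame rate, sample count, and the
# shutter as a fraction of the frame (0.5 ~ a 180-degree shutter).

def degrees_per_sample(rpm, fps, samples, shutter=0.5):
    degrees_per_second = rpm * 360.0 / 60.0       # rotation speed
    degrees_per_frame = degrees_per_second / fps  # sweep over one frame
    exposure_degrees = degrees_per_frame * shutter
    return exposure_degrees / samples

# A 3000 rpm propeller at 24 fps sweeps 750 degrees per frame; with a
# 180-degree shutter and 32 samples the blade images sit ~11.7 degrees
# apart -- distinct ghost copies rather than a smooth blur.
gap = degrees_per_sample(3000, 24, 32)
```

At several thousand rpm the gap stays far above the sub-degree spacing a smooth blur would need, so no affordable pass count fixes it by brute force.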


There are things happening on the horizon as far as 3rd-party renderers go.
Lux has a good motion blur implementation, but it doesn't handle
deformation, and being an unbiased renderer it isn't exactly suited for
most film production scenarios :)
See some of these examples from an artist exporting to Lux from Softimage
XSI:
http://blog.ioxu.com/wp-content/uploads/2009/12/room_39_01_motionblur_03.jpg
http://blog.ioxu.com/wp-content/uploads/2009/12/room_39_01_motionblur_06_bot.jpg
Yafaray might get raytraced motion blur through GSoC, hopefully.
And blender -> renderman is moving ahead slowly for 2.5 at
ribmosaic at sourceforge.net

But it would be a fine thing to have in the blender internal renderer,
especially for projects like Durian, or other future projects which rely on
the blender internal renderer.


Not sure whether a hybrid 2d/3d motion blur would be possible to implement
in blender, but it would be a good middle ground between the two extremes
in regards to accuracy vs. performance.
I wish I had the proper programming understanding and skills so that I
could contribute some of my time.

Best regards
Magnus

---------- Forwarded message ----------
From: Magnus Löfgren <lofgrenmeister at gmail.com>
Date: 2010/1/9
Subject: Relevant Siggraph paper for Durian Rendering request (true 3d
motion blur)
To: bf-committers at blender.org


Hi,

I saw on the Durian requests page that the need for true 3d motion blur (à
la Renderman, Mental Ray, Mantra, Vray, etc.) has come up.

If someone decides to take on the task of implementing better true motion
blur for blender, replacing the old "MBLUR" oversampling method, I've found
a paper of interest from Siggraph 2009 with some smart tricks for reducing
the number of samples needed.

The abstract and a download link to the paper in .pdf format can be found
here: http://www.cs.columbia.edu/cg/mb/
It is worth locating and studying the references in the paper as well, all
the way back to the first Siggraph paper on motion blur in the
mid-eighties.

Best regards
Magnus

