[Bf-committers] Re: Vectorial Motion Blur

Roland Hess me at harkyman.com
Sun Feb 13 18:11:40 CET 2005


Ton said:
> Getting a vector buffer rendered is not so difficult. Most problematic  
> is how to calculate good working filter/masks for reconstructing the  
> new image.
> Or said differently: what is the mapping of the (2D should be OK) vector  
> to a filter? What does it look like: a very long line? Ovals?

GSR said:
> If you mean how things should look, like lines. The code I have
> around, once it has the vector, calculates N static images based on
> that vector distortion, accumulates the results and divides. But I
> guess painting vanishing lines should work too.

Real motion blur shows the object as a streak of roughly uniform opacity. 
Since this process is done on a pixel-by-pixel basis, smearing each pixel 
along its vector as a lowered-opacity streak (opacity being inversely 
proportional to the length of the vector in the view plane) should suffice.
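
In other words, each pixel's streak opacity would be a simple function of 
its 2D vector length. A minimal sketch of that idea in C, assuming the 
vector (vx, vy) is already expressed in pixels in the view plane:

#include <math.h>

/* Sketch only: per-sample opacity for one pixel's streak. The pixel's
 * energy gets spread over the streak, so opacity drops as the vector
 * grows longer. */
static float streak_opacity(float vx, float vy)
{
    float len = sqrtf(vx * vx + vy * vy);
    if (len <= 1.0f)
        return 1.0f;     /* shorter than one pixel: no visible smear */
    return 1.0f / len;   /* inversely proportional to vector length */
}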

My methodology, copied and pasted from the wiki entry:

  2. Using the motion map to create a blurred image from the raw render

A. Once the motion map is generated, a new blank rgb image buffer is 
initialized (called the "blur buffer" here).
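
As a rough sketch of this step, assuming an RGBA float layout for the 
buffers (the actual pixel format is left open here):

#include <stdlib.h>

/* Sketch only: allocate the blank "blur buffer", the same size as the
 * render, zero-initialized (black, fully transparent). */
static float *alloc_blur_buffer(int width, int height)
{
    return calloc((size_t)width * (size_t)height * 4, sizeof(float));
}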

B. Areas indicated as sky by the z-buffer (or the alpha channel of the 
render buffer, whichever is more appropriate) are copied directly from 
the render buffer to the blur buffer, and if desired can be blurred 
based on a function of the camera's motion.
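
A hypothetical sketch of this step, assuming RGBA float buffers and that 
sky is wherever the render buffer's alpha is 0 (a z-buffer test would be 
structured the same way):

/* Copy sky pixels straight across into the blur buffer. Both buffers are
 * assumed to be RGBA float arrays of width*height*4. An optional
 * camera-motion blur pass could be run over these pixels afterwards. */
static void copy_sky(const float *render_buf, float *blur_buf,
                     int width, int height)
{
    for (int i = 0; i < width * height; i++) {
        const float *src = render_buf + 4 * i;
        if (src[3] == 0.0f) {        /* alpha 0: nothing rendered here */
            float *dst = blur_buf + 4 * i;
            dst[0] = src[0];
            dst[1] = src[1];
            dst[2] = src[2];
            dst[3] = src[3];
        }
    }
}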

C. Proceeding forward through each level of the z-buffer, render buffer 
pixels from that level are evaluated one at a time, then added to the 
blur buffer.

    1. The motion vector from the motion buffer is recreated from the 
xyf information.
    2. The base pixel is copied from the render buffer to the blur buffer.
    3. The pixel is mixed into the blur buffer along the trajectory 
indicated by the motion vector, with opacity falling to 0 (transparent) 
at each end of the vector (a rough sketch of this step follows after the 
note below).

     * Note: I think that two things would need to be determined here by 
trial and error, coupled with visual inspection of the output. First, the 
blending method: Blender has several already available, and we should 
let our eye determine which produces the best result. Second, testing 
would be needed to determine the correct falloff function for opacity: 
linear, quadratic, etc.
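
To make step C.3 concrete, here is a rough sketch of smearing a single 
pixel along its vector, again assuming RGBA float buffers and a 2D vector 
in pixels. The simple mix and the linear falloff are only placeholders 
for whatever blending method and falloff function the trial and error 
above settles on:

#include <math.h>

/* Mix one pixel into the blur buffer along its reconstructed 2D vector
 * (vx, vy). Samples run from -v/2 to +v/2 around the pixel; opacity is
 * full at the pixel's own position and falls linearly to 0 at both ends
 * of the vector, scaled by 1/length so the streak keeps roughly uniform
 * energy. */
static void smear_pixel(float *blur_buf, int width, int height,
                        int px, int py, const float *src_rgba,
                        float vx, float vy)
{
    float len = sqrtf(vx * vx + vy * vy);
    int nsamples = (int)ceilf(len);
    if (nsamples < 1)
        nsamples = 1;

    for (int s = 0; s <= nsamples; s++) {
        float t = (float)s / (float)nsamples;        /* 0..1 along streak */
        int x = px + (int)roundf((t - 0.5f) * vx);
        int y = py + (int)roundf((t - 0.5f) * vy);
        if (x < 0 || x >= width || y < 0 || y >= height)
            continue;

        float fac = 1.0f - 2.0f * fabsf(t - 0.5f);   /* 1 at center, 0 at ends */
        if (len > 1.0f)
            fac /= len;                              /* spread energy over streak */

        float *dst = blur_buf + 4 * (y * width + x);
        for (int c = 0; c < 4; c++)
            dst[c] = dst[c] * (1.0f - fac) + src_rgba[c] * fac;
    }
}

Scaling each contribution by 1/length keeps the streak's total energy 
roughly constant, so fast-moving pixels fade into a faint smear instead 
of brightening the image.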

D. The finished blur buffer could be copied back into the render buffer 
for display, posted to the secondary render buffer so the user has 
access to both the original render and the blurred one, or simply kept 
in the blur buffer, with the user given the option to save it in a 
separate step.

Roland Hess - harkyman

