[Bf-vfx] Blender camera tracking approach
Tobias Kummer
supertoilet at gmx.net
Sun May 8 15:49:10 CEST 2011
Hey all!
Here are some of my thoughts on the matchmoving workflow in Blender,
after some discussion in IRC chat. Since I've never used PFMatchit, I
cannot say anything about it, but when SynthEyes was mentioned I looked
into it a little and thought its workflow might be suboptimal.
It re-applies lens distortion as an image post-processing step, which
degrades quality. A better way would be to set Blender's camera
parameters so the rendered output already has matching distortion and
does not have to be altered afterwards (which brings up the topic of
real-world camera settings in Blender - those would be desirable anyway).
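As an aside, the lens distortion usually meant here is the standard radial
(Brown) model; a minimal sketch of applying it to a normalized image
coordinate follows. The k1/k2 coefficients are made-up example values, not
parameters from SynthEyes or Blender:

```python
def distort(x, y, k1=-0.1, k2=0.02):
    """Map an undistorted normalized coordinate to its distorted
    position using the radial (Brown) model: scale by a polynomial
    in the squared radius. k1/k2 are illustrative values only."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# A point at the image center is unaffected; points further out
# are pulled inwards for negative k1 (barrel distortion).
print(distort(0.0, 0.0))   # center stays put
print(distort(0.5, 0.0))   # shifted towards the center
```

Doing this warp on finished pixels means resampling the image, which is
where the quality loss comes from; baking the same model into the render
camera avoids that extra resampling pass.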
See this (crappy) flowchart as illustration:
http://www.pasteall.org/pic/12079
The first step in the proposed workflow accounts for the fact that
Blender cannot draw lens distortion in the 3D viewport. The resulting
undistorted footage is used purely for visualisation, so the artist can
line things up - for rendering/compositing, the original distorted
footage would be used. This way, we skip the step that compromises quality.
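To illustrate why that first undistortion step exists at all: the radial
model has no closed-form inverse, so footage is typically undistorted
numerically, e.g. by fixed-point iteration. A self-contained sketch under
the same generic Brown model as above (coefficients are example values,
not Blender's actual implementation):

```python
def distort(x, y, k1=-0.1, k2=0.02):
    """Forward radial (Brown) model with illustrative coefficients."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

def undistort(xd, yd, k1=-0.1, k2=0.02, iters=10):
    """Invert the radial model by fixed-point iteration: repeatedly
    re-estimate the undistorted point from the current guess. For
    mild distortion this converges in a handful of iterations."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / factor, yd / factor
    return x, y

# Round trip: distort a point, then recover the original.
xd, yd = distort(0.3, 0.2)
print(undistort(xd, yd))  # close to (0.3, 0.2)
```

Since both directions involve resampling when applied to pixels, keeping
the undistorted copy viewport-only and rendering against the original
footage is what avoids the double quality hit.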
As Blender will have combined 3D/tracking functionality, we should
build on that rather than copy the tracking-only workflow of other packages.
Please correct me if there are any major mistakes in my thoughts!
Greets,
Tobias