[Bf-vfx] Blender camera tracking approach

Tom M letterrip at gmail.com
Sun May 8 23:12:07 CEST 2011


And of course I missed a few things:

optional step of providing photos from different perspectives to help
solve camera tracks with very little parallax

optional step of providing hints to the camera solver about lens
behaviour (i.e. a zoom or other changes) in addition to the camera's
physical motion

optional step of providing hints to the camera solver about lighting
behaviour changes (this could also be done as a colour correction
instead of involving the tracker).
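The hint data described above could take a shape like the following sketch. Every field name here is a hypothetical placeholder for illustration, not any real tracker's input format:

```python
# Hypothetical per-shot hint bundle for a camera solver; all key names
# are illustrative placeholders, not a real tracker API.
camera_hints = {
    # extra stills from other viewpoints, for shots with little parallax
    "extra_stills": ["set_left.jpg", "set_right.jpg"],
    # lens behaviour keyframes: (frame, focal length in mm) for a zoom
    "lens_zoom": [(0, 35.0), (120, 50.0)],
    # lighting-change markers the tracker could be told to expect
    "lighting_events": [(80, "flicker_start"), (95, "flicker_end")],
}
```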

On Sun, May 8, 2011 at 1:05 PM, Tom M <letterrip at gmail.com> wrote:
> Tobias,
>
> here is a 'complete' workflow for tracking; it might be good to
> provide details for some of your flow areas off to the side.
>
> 1) load image data - video or sequence of images
> 2a) (optional) correct rolling shutter
> 2b) (optional) input known camera parameters
> 2c) (optional) provide hints to camera lens undistortion solver
> (identify straight lines in the image) to help with undistorting the
> image
> 2d) (optional) undistort the camera lens using a distortion solver
> 3) (optional) create mask for data you don't want tracked (or for
> 'grouping'/separating of tracking, i.e. tracking the motion of a camera
> and tracking the motion of an object)
> 4) (optional) adjust color/contrast/etc. so that feature points to
> track have increased contrast and are thus easier for the tracker to
> find (and easier for the tracker to ignore stuff you don't want
> tracked - i.e. uneven lighting on a green screen creating spurious
> motion)
> 5) (optional) select specific features you want tracked (either by
> mask or placement of tracking markers)
> 5a) if you specified specific features to the tracker you may also
> want to specify how the points should be tracked (a bounding box for the
> tracker to look for the point in; whether the object being tracked is
> 'rigid' or can deform; and knowledge of the types of camera/object
> motion - object translation/rotation; camera translation/rotation)
> 6) send image data, specified track data, camera data, and mask data to the tracker
> 7) tracker library can automatically identify feature points to track
> and/or use user-specified features, ignoring areas that are masked
> 8) do a 'solve' of the camera motion based on the track points,
> including statistics on how 'good' the tracks are in contributing to
> the solve
> 9) return track point data and camera solve to software, including the
> statistical analysis of the track points and their 3d projection
> 10) based on the statistical analysis pick error thresholds for what
> track points to automatically delete (automatic threshold picking or
> manually adjusted)
> 11) (optional) manually delete any track points
> 12) (optional) create a mask to 'hide' unwanted track points
> 12a) (optional) mask can be assigned to follow a set of track points
> to automatically mask a moving object from the tracker/solver
> 12b) (optional) mask can be manually keyframed so that it moves and
> deforms over time to mask a moving object from the tracker/solver
> 12c) (optional) provide a manually created camera curve to 'hint' to the
> tracker/solver what you expect the actual curve to look like
> 13) retrack if additional tracker points are now needed
> 14) pass the tracker points, camera hints, etc. to the camera solver
> 15) return the solved camera track and the 3d location of the track
> points to the software
> 16) visualize the camera motion and track points in the 3d view
> 17) define a ground plane reference to the 3d track points and camera
> 18) define the scene origin relative to the 3d track points and camera
> 19) define the world scale and orientation relative to the 3d track
> points and camera
> 20) add a test object into the 3d view and see if it stays in the
> proper location
> 21) (optional) stabilize the camera view based on the solved camera track
> 22) (optional) smooth the camera track curve
> 23) repeat until the error in the solve is good enough
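The pruning in steps 8-11 (and the repeat in step 23) might look roughly like this. The auto-threshold heuristic (3x the median error) is a toy assumption for illustration, not what any particular solver does:

```python
def filter_tracks(errors, threshold=None):
    """Drop track points whose reprojection error exceeds a threshold.

    If no threshold is given, pick one automatically as 3x the median
    error - a toy heuristic assumed here purely for illustration.
    """
    if threshold is None:
        values = sorted(errors.values())
        mid = len(values) // 2
        median = (values[mid] if len(values) % 2
                  else (values[mid - 1] + values[mid]) / 2)
        threshold = 3 * median
    return {name: e for name, e in errors.items() if e <= threshold}

# Illustrative per-track reprojection errors (in pixels):
errors = {"track01": 0.3, "track02": 0.4, "track03": 5.1, "track04": 0.5}
kept = filter_tracks(errors)  # track03 is rejected as an outlier
```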
>
> On Sun, May 8, 2011 at 12:33 PM, Tobias Kummer <supertoilet at gmx.net> wrote:
>> After some more tinkering, I came up with the following revised proposal:
>> http://www.pasteall.org/pic/12084
>>
>> This takes some of your (Remos) concerns into account.
>>
>> Regards,
>>
>> Tobi
>>
>> On 05/08/2011 10:09 PM, Remo Pini wrote:
>>> Hm... I have never noticed the slight distortions causing issues with
>>> the tracker... might be a question of the implementation, but I don't
>>> see why the material would have to be converted twice, even IF
>>> "undistorting" is required...
>>>
>>> (1) take original footage
>>> (2) undistort to track (temporary copy)
>>> (3) track
>>> (4) distort tracking data according to undistort
>>> (5) use original footage + track data + lensinfo from source (or
>>> undistort function)
>>>
>>> Syntheyes seems to do that under the hood, I guess - I've never seen
>>> tracking issues due to distortions...
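That roundtrip can be sketched with a one-parameter radial model. The model and the coefficient below are illustrative assumptions for the sketch, not what Syntheyes actually does internally:

```python
def distort_point(x, y, k1):
    """Forward radial (Brown) distortion: p * (1 + k1 * r^2).

    x, y are normalized image coordinates relative to the lens centre;
    k1 is the first radial coefficient (negative = barrel distortion).
    """
    r2 = x * x + y * y
    return x * (1 + k1 * r2), y * (1 + k1 * r2)

k1 = -0.05  # mild barrel distortion, purely illustrative
# Step (3) produced a marker position in the undistorted frame:
undistorted_track = (0.40, 0.25)
# Step (4): map the track back onto the original, distorted footage by
# applying the forward distortion to the tracked position:
distorted_track = distort_point(*undistorted_track, k1)
```

The key point of the workflow is that only a temporary copy is undistorted for tracking; the final comp uses the original footage plus the re-distorted track data.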
>>>
>>>
>>>> -----Original Message-----
>>>> From: bf-vfx-bounces at blender.org [mailto:bf-vfx-bounces at blender.org]
>>>> On Behalf Of Tom M
>>>> Sent: Sonntag, 8. Mai 2011 8:20
>>>> To: Blender motion tracking & VFX
>>>> Subject: Re: [Bf-vfx] Blender camera tracking approach
>>>>
>>>> On Sun, May 8, 2011 at 10:06 AM, Remo Pini <remo.pini at avexys.com>
>>>> wrote:
>>>>> I kind of disagree...
>>>>>
>>>>> If you have lens distortion in your footage, this would most likely
>>>>> be bad and you would not want the final result to have that in it
>>>>> anymore, sooo....
>>>> You are thinking of extreme distortion - this is about the very slight
>>>> distortion that comes from all camera lenses due to their lens shape
>>>> (as well as lens imperfections - but the imperfections are mostly
>>>> ignored by most lens solvers).
>>>>
>>>> Our brains ignore the distortion, but it confuses trackers since
>>>> travel in a straight line doesn't appear straight with the distortion.
>>>>
>>>> Undistorting the image reduces the image quality because it shifts
>>>> pixel locations and thus 'blurs' pixels: the undistorted location
>>>> of a pixel will generally not fall exactly on the pixel grid of the
>>>> original image.
>>>>
>>>> Thus undistortion is only used to provide straighter lines to the
>>>> tracker.
>>>> Applying the lens distortion to your rendered image ensures that its
>>>> straight lines are curved similarly to those of the original image.
>>>> Again, this introduces blurring of your render if it is applied as a
>>>> post process - so either you need to render larger or do the
>>>> distortion at render time.  In practice the amount of blurring from
>>>> doing it as a post process is probably not significant enough to be
>>>> noticeable in most situations.
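The resampling cost described above is easy to see numerically: the corrected position of a pixel almost never lands on the integer pixel grid, so its value must be interpolated from neighbours. The radial coefficient below is illustrative, not from any real lens:

```python
def undistorted_position(px, py, cx, cy, k1):
    """Shift a pixel by a first-order radial model around centre (cx, cy).

    k1 is an illustrative radial coefficient, not a measured lens value.
    """
    x, y = px - cx, py - cy
    r2 = x * x + y * y
    return cx + x * (1 + k1 * r2), cy + y * (1 + k1 * r2)

# A pixel near the edge of a 1920x1080 frame:
pos = undistorted_position(1800, 1000, 960, 540, k1=-1e-7)
# pos lands between pixel centres, so resampling must blend neighbours,
# which is the slight softening described above.
```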
>>>>
>>>> LetterRip
>>>> _______________________________________________
>>>> Bf-vfx mailing list
>>>> Bf-vfx at blender.org
>>>> http://lists.blender.org/mailman/listinfo/bf-vfx
>>>
>>
>

