[Bf-committers] GSoC 2017: Camera breathing support
blueprintrandom1 at gmail.com
Fri Mar 24 17:08:34 CET 2017
what about using an object of known scale to calibrate (like a 3d printed
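Whatever the printed reference ends up being, the idea fits in a few lines: once the solver reconstructs the object, the ratio of its known physical size to its reconstructed size fixes the scene scale. A minimal sketch (the function name and units are hypothetical, not an existing Blender API):

```python
def scene_scale_factor(known_size_mm, reconstructed_size_units):
    """Return the factor mapping solver units to millimetres, given a
    reference object of known physical size (e.g. a 3d printed cube)."""
    if reconstructed_size_units <= 0:
        raise ValueError("reconstructed size must be positive")
    return known_size_mm / reconstructed_size_units

# e.g. a 50 mm cube reconstructed as 0.25 solver units
# gives 200 mm per solver unit
```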
On Fri, Mar 24, 2017 at 8:27 AM, Tianwei Shen <shentianweipku at gmail.com>
> Hi Levon,
> Thank you so much for this long reply. First of all, I’ve been looking for
> user tests and suggestions for the multi-view reconstruction project. If
> you have ideas for making it better, just feel free to drop me emails. On
> the other hand, it is still a quite large patch. So we need time to split
> it up and gradually merge it into the master. But hopefully this can be
> integrated well with the camera breathing support project and even
> automatic tracking in the future.
> As for the camera breathing support, I didn’t realize lens distortion
> parameters would also change with the focal lengths. I thought we’d only
> deal with changing focal lengths with the zoom-in/out motions. So things
> become complicated here since it seems to me that focal lengths and
> distortion parameters cannot be estimated on the fly. Users have to first
> calculate this information (focal lengths for each frame and their
> corresponding lens distortion parameters) using some calibration tools. Can
> the solver reliably deal with changing focal lengths and distortions? On
> the other hand, if users have to first calculate focal lengths using some
> tools (if Blender doesn’t have its own), would that impose a burden and
> inconvenience on users?
> > On Mar 24, 2017, at 8:57 PM, Levon <levonh at gmail.com> wrote:
> >> Message: 1
> >> Date: Fri, 24 Mar 2017 02:26:38 +0800
> >> From: Tianwei Shen <shentianweipku at gmail.com>
> >> Subject: [Bf-committers] GSoC 2017: Camera breathing support
> >> To: bf-blender developers <bf-committers at blender.org>
> >> Message-ID: <3B4BA086-116B-417D-A288-C8F1CA7A880F at gmail.com>
> >> Content-Type: text/plain; charset=us-ascii
> >> Hi Everyone,
> >> Last summer I participated in GSoC 2016 and worked on the multi-view
> >> camera reconstruction project. Some of my efforts are summarized in this
> >> blog: http://hlzz.github.io/blender3/.
> >> And this patch (https://developer.blender.org/D2187) is now being
> >> reviewed and revised.
> >> This year I would like to apply again and work on the camera breathing
> >> support, which was already requested by some users during the time I
> >> worked on the motion tracking project. Now I need clarifications for
> >> some problems.
> >> 1. Should we automatically detect changes in focal length, or is it
> >> specified by users as additional input (like the focal length for each
> >> frame)? I know we can read EXIF tags to get focal lengths for photos. Do
> >> we have a similar approach for videos?
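Since video frames usually carry no EXIF, one possible fallback (purely a sketch of a workflow, not an existing feature) is letting the user keyframe the focal length at a few frames and interpolating between them:

```python
def focal_at_frame(keyframes, frame):
    """Linearly interpolate focal length between user-supplied keyframes,
    given as (frame, focal_mm) pairs; clamp outside the keyframed range."""
    pts = sorted(keyframes)
    if frame <= pts[0][0]:
        return pts[0][1]
    if frame >= pts[-1][0]:
        return pts[-1][1]
    for (f0, v0), (f1, v1) in zip(pts, pts[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)

# e.g. a zoom keyframed from 35 mm at frame 1 to 50 mm at frame 101
# yields 42.5 mm at the midpoint, frame 51
```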
> >> 2. Is the current UI able to handle camera breathing, if we need
> >> additional inputs from users?
> >> I think this project also has something to do with my revisions done on
> >> the motion tracking system last summer. Hopefully I should be able to
> >> continue the revisions and move towards the goal of automatic tracking.
> >> Thanks,
> >> Tianwei