[Bf-vfx] Fwd: [Bf-committers] GSoC 2017: Camera breathing support

Dalai Felinto dfelinto at gmail.com
Tue Mar 28 11:01:13 CEST 2017


Hi,
I see this GSoC idea more for the rendering (Cycles) aspects of it -
although integrating it with tracking would be a great bonus.

For regular lenses this may not be remarkable, but anamorphic lenses have
a very distinctive bokeh "signature" when their focus distance changes.

Anamorphic bokeh was implemented for Cosmos Laundromat - Mathieu (Cosmos
director - cc'ed here) was a big fan of realistic cinematography. Although
the bokeh helped with the final look of the film, other aspects of
anamorphic lenses were never tackled, mainly lens breathing and lens flare.

I don't know how (and if) other software programs handle this, but
realistic lens modelling may be able to address these:

* https://graphics.stanford.edu/wikis/cs348b-11/Assignment3
* http://www.cs.virginia.edu/~gfx/courses/2005/ImageSynthesis/assignments/camera.html
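Lens breathing itself has a simple first-order explanation: as a unit-focusing lens refocuses closer, the lens-to-sensor distance grows and the field of view narrows. A minimal thin-lens sketch of that effect (the 50mm focal length and 36mm sensor width are just illustrative numbers, not tied to any Blender code):

```python
import math

def thin_lens_hfov(focal_mm, focus_dist_mm, sensor_width_mm=36.0):
    """Horizontal field of view (degrees) of an ideal unit-focusing thin
    lens focused at focus_dist_mm.  Refocusing closer pushes the lens
    away from the sensor, so the field of view narrows: lens breathing."""
    # thin-lens equation: 1/f = 1/s + 1/i  ->  image distance i
    image_dist = 1.0 / (1.0 / focal_mm - 1.0 / focus_dist_mm)
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * image_dist)))

# the same 50mm lens focused at 10m vs. 0.5m
fov_far = thin_lens_hfov(50.0, 10000.0)   # ~39.5 degrees
fov_near = thin_lens_hfov(50.0, 500.0)    # ~36.0 degrees
```

Real lenses (internal focusing, floating elements, anamorphic groups) deviate from this ideal, which is exactly why measured per-lens profiles would be needed.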

Regards,
Dalai

--
blendernetwork.org/dalai-felinto
www.dalaifelinto.com

2017-03-27 21:41 GMT+02:00 Sean Kennedy <mack_dadd2 at hotmail.com>:

> I myself have never needed this feature. I checked with one of the
> trackers here where I work, and he said he hasn't done much of that
> (tracking the focus of a shot, if I am understanding the documents
> correctly).
>
>
> Things I think would be more useful to the community at large would be:
>
>
> Automatic tracking - While not useful for all shots, for basic shots like
> aerial flyovers or simple handheld shots, this would be a time saver.
>
>
> Easier rebuilding of geometry - I know we have the "3d markers to mesh"
> button, but it simply creates vertices, which then have to be manually
> stitched together to create rough geo. There's gotta be an easier way to
> get rough scene geo. Even if it's only updating that button to create
> vertices for selected tracks only. Or being able to build geometry after
> the camera solve by specifying a few points, then moving to a different
> frame and re-specifying those same points. The solve should be able to
> rebuild that geo correctly from just that small amount of information.
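A "selected tracks only" variant of that button could be as simple as filtering on each track's selection and bundle state before building the mesh. A sketch: the attribute names (`select`, `has_bundle`, `bundle`) mirror Blender's MovieTrackingTrack, but the `Track` namedtuple is only a stand-in so the function runs outside Blender:

```python
from collections import namedtuple

# stand-in for Blender's MovieTrackingTrack; mock exists only so the
# sketch is runnable outside Blender
Track = namedtuple("Track", "select has_bundle bundle")

def selected_bundle_verts(tracks):
    """Vertex positions for selected, solved tracks only -- a filtered
    variant of the current "3D Markers to Mesh" behaviour.  In Blender,
    `tracks` would be clip.tracking.tracks and the result would be
    handed to Mesh.from_pydata(verts, [], [])."""
    return [tuple(t.bundle) for t in tracks if t.select and t.has_bundle]

demo = [
    Track(True, True, (0.0, 1.0, 2.0)),
    Track(False, True, (3.0, 4.0, 5.0)),   # not selected -> skipped
    Track(True, False, (6.0, 7.0, 8.0)),   # no solved bundle -> skipped
]
verts = selected_bundle_verts(demo)        # [(0.0, 1.0, 2.0)]
```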
>
>
> Easier planar tracking - Planar tracking where we can, for example, simply
> draw a grease pencil stroke around a flat, planar area, and have that area
> tracked throughout the shot to easily stick a plane track on to.
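Under the hood, a plane track like this boils down to estimating one homography per frame for the enclosed region. A self-contained sketch of the direct linear transform (DLT) step from four corner correspondences, with a tiny Gaussian-elimination helper (illustrative only, not Blender's actual plane-track solver):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """DLT: 3x3 homography H (with h22 fixed to 1) mapping four src
    corners to four dst corners -- the core of one plane-track frame."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_h(H, x, y):
    """Apply homography H to point (x, y) with perspective division."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# demo: a pure translation of the unit square by (2, 3)
H = homography([(0, 0), (1, 0), (0, 1), (1, 1)],
               [(2, 3), (3, 3), (2, 4), (3, 4)])
center = apply_h(H, 0.5, 0.5)
```

Production trackers refine such an estimate with many correspondences and robust fitting; this only shows the minimal four-point case.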
>
>
> Obviously I can't speak for everyone, but these are the things that would
> be the most helpful for how I use tracking and solving in Blender here at
> work every day.
>
>
> Sean
>
>
> ------------------------------
> From: bf-vfx-bounces at blender.org on behalf of Ton Roosendaal <ton at blender.org>
> Sent: Monday, March 27, 2017 11:12 AM
> To: bf-vfx at blender.org
> Subject: [Bf-vfx] Fwd: [Bf-committers] GSoC 2017: Camera breathing support
>
> Hi,
>
> FYI. A student proposal.
>
> I can't judge this feature well; please give feedback.
> Is it really essential? Are there other ideas he could work on?
>
> -Ton-
>
> --------------------------------------------------------
> Ton Roosendaal  -  ton at blender.org   -   www.blender.org
> Home of the Blender project - Free and Open 3D Creation Software
>
>
> Chairman Blender Foundation, Director Blender Institute
> Entrepotdok 57A, 1018 AD, Amsterdam, the Netherlands
>
>
>
> > Begin forwarded message:
> >
> > From: Tianwei Shen <shentianweipku at gmail.com>
> > Subject: Re: [Bf-committers] GSoC 2017: Camera breathing support
> > Date: 27 March 2017 at 19:06:37 GMT+2
> > To: bf-blender developers <bf-committers at blender.org>
> > Reply-To: bf-blender developers <bf-committers at blender.org>
> >
> > Hi all,
> >
> > FYI, you can check out my proposal draft on this project at
> > https://docs.google.com/document/d/1J0iFVL45Ha_rFcXtO_fGQdP5hqdPKVQH2P-DYFuAXgk/edit,
> > if you are interested.
> >
> >
> > Thanks,
> > Tianwei
> >> On Mar 25, 2017, at 12:08 AM, Jacob Merrill <blueprintrandom1 at gmail.com>
> wrote:
> >>
> >> What about using an object of known scale to calibrate (like a 3D
> >> printed Suzanne?)
> >>
> >>
> >>
> >> On Fri, Mar 24, 2017 at 8:27 AM, Tianwei Shen <shentianweipku at gmail.com>
> >> wrote:
> >>
> >>> Hi Levon,
> >>>
> >>> Thank you so much for this long reply. First of all, I’ve been looking
> >>> for user tests and suggestions for the multi-view reconstruction
> >>> project. If you have ideas for making it better, feel free to drop me
> >>> an email. On the other hand, it is still quite a large patch, so we
> >>> need time to split it up and gradually merge it into master. But
> >>> hopefully this can be integrated well with the camera breathing
> >>> support project and even automatic tracking in the future.
> >>>
> >>> As for the camera breathing support, I didn’t realize lens distortion
> >>> parameters would also change with the focal length. I thought we’d
> >>> only deal with changing focal lengths from zoom-in/out motions. Things
> >>> become complicated here, since it seems to me that focal lengths and
> >>> distortion parameters cannot be estimated on the fly. Users have to
> >>> first calculate this information (the focal length for each frame and
> >>> its corresponding lens distortion parameters) using some calibration
> >>> tools. Can the solver reliably deal with changing focal lengths and
> >>> distortions? On the other hand, if users have to first calculate focal
> >>> lengths using some tools (if Blender doesn’t have its own), would that
> >>> impose a burden and inconvenience on users?
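If per-frame focal lengths and matching distortion coefficients do have to come from an offline calibration pass, the solver side could reduce to a lookup with interpolation in that table. A small sketch (the table values and the single polynomial coefficient k1 are made up for illustration; a real profile would carry the full distortion model):

```python
from bisect import bisect_left

def interp_distortion(calib, focal):
    """Linearly interpolate a distortion coefficient from a small
    pre-computed calibration table, so each frame's distortion is looked
    up from its focal length instead of estimated on the fly.

    `calib` is a sorted list of (focal_length_mm, k1) pairs measured
    offline with a calibration tool; values are clamped at the ends."""
    focals = [f for f, _ in calib]
    i = bisect_left(focals, focal)
    if i == 0:
        return calib[0][1]
    if i == len(calib):
        return calib[-1][1]
    (f0, k0), (f1, k1) = calib[i - 1], calib[i]
    t = (focal - f0) / (f1 - f0)
    return k0 + t * (k1 - k0)

# a hypothetical zoom profiled at three focal lengths
table = [(24.0, -0.10), (35.0, -0.05), (50.0, -0.02)]
```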
> >>>
> >>>
> >>> Thanks,
> >>> Tianwei
> >>>
> >>>> On Mar 24, 2017, at 8:57 PM, Levon <levonh at gmail.com> wrote:
> >>>>
> >>>>>
> >>>>> Message: 1
> >>>>> Date: Fri, 24 Mar 2017 02:26:38 +0800
> >>>>> From: Tianwei Shen <shentianweipku at gmail.com>
> >>>>> Subject: [Bf-committers] GSoC 2017: Camera breathing support
> >>>>> To: bf-blender developers <bf-committers at blender.org>
> >>>>> Message-ID: <3B4BA086-116B-417D-A288-C8F1CA7A880F at gmail.com>
> >>>>> Content-Type: text/plain;       charset=us-ascii
> >>>>>
> >>>>> Hi Everyone,
> >>>>>
> >>>>> Last summer I participated in GSoC 2016 and worked on the multi-view
> >>>>> camera reconstruction project. Some of my efforts are summarized in
> >>>>> this blog: http://hlzz.github.io/blender3/.
> >>>>> And this patch (https://developer.blender.org/D2187) is now being
> >>>>> reviewed and revised.
> >>>>> This year I would like to apply again and work on camera breathing
> >>>>> support, which was already requested by some users during the time I
> >>>>> worked on the motion tracking project. Now I need clarification on
> >>>>> some specific problems.
> >>>>>
> >>>>> 1. Should we automatically detect changes of focal length, or should
> >>>>> it be specified by users as additional input (like the focal length
> >>>>> for each frame)? I know we can read EXIF tags to get focal lengths
> >>>>> for photos. Do we have a similar approach for videos?
> >>>>>
> >>>>> 2. Is the current UI able to handle camera breathing, if we need
> >>>>> additional inputs from users?
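For the "additional inputs" route in question 1, the per-frame focal lengths could arrive as a simple table the user exports from an external calibration tool, since video frames carry no EXIF. A sketch of such an importer; the two-column CSV format (frame, focal_mm) is purely an assumption for illustration, not an existing Blender format:

```python
import csv
import io

def read_focal_track(fileobj):
    """Parse a per-frame focal length table: one 'frame,focal_mm' row per
    line, '#' lines treated as comments.  Returns {frame: focal_mm}."""
    focals = {}
    for row in csv.reader(fileobj):
        if not row or row[0].startswith("#"):
            continue
        focals[int(row[0])] = float(row[1])
    return focals

# demo input as an in-memory file
sample = io.StringIO("# frame,focal_mm\n1,50.0\n2,50.4\n3,51.1\n")
track = read_focal_track(sample)   # {1: 50.0, 2: 50.4, 3: 51.1}
```

A UI for this would mostly be a file selector plus a per-frame override, which speaks to question 2 as well.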
> >>>>>
> >>>>> I think this project also has something to do with my revisions to
> >>>>> the motion tracking system last summer. Hopefully I will be able to
> >>>>> merge those revisions and move towards the goal of automatic tracking.
> >>>>>
> >>>>>
> >>>>> Thanks,
> >>>>> Tianwei
> >>>>>
> >>>>
> >>>
> >>> _______________________________________________
> >>> Bf-committers mailing list
> >>> Bf-committers at blender.org
> >>> https://lists.blender.org/mailman/listinfo/bf-committers
> >>>
>
> _______________________________________________
> Bf-vfx mailing list
> Bf-vfx at blender.org
> https://lists.blender.org/mailman/listinfo/bf-vfx
>

