[Bf-funboard] VSE, compositing, and 3d integration...

Troy Sobotka troy.sobotka at gmail.com
Sun Jul 7 08:58:25 CEST 2013


On Jul 6, 2013 7:21 PM, "David Jeske" <davidj at gmail.com> wrote:

> Blender's current UI suggests I should be able to make three 50-frame
> animation scenes (Cube, Sphere, Plane), and then make a fourth 150 frame
> scene "VSE Sequence" which brings them together into the VSE Clip Editor.
> In fact in the VSE I can "add scene" and the strips are brought in.

Or, alternatively, block in three 50-frame shots in your offline edit, craft
each 50-frame shot in the tools, and then online them in your finisher /
conformer.

Again, this is precisely what an offline / online pipeline solves, and it
does so in a non-locked-in manner, allowing for alternate tools and needs.

> The VSE "Add -> Scene.." suggests that I can add a bunch of
> scenes to the VSE and sequence them, but in practical terms it doesn't
> really work. This would be really nice to fix....

Scenes are largely separate entities within Blender, Scene Strip hack
notwithstanding.
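
For what it's worth, the Scene Strip mechanism being discussed is also
exposed through the Python API. A minimal sketch, assuming three scenes
named "Cube", "Sphere" and "Plane" as in the example above, laid end to
end in a "VSE Sequence" scene:

import bpy

# Assemble the three 50-frame scenes back to back in the edit scene.
# Scene names follow the example above and are assumptions.
edit_scene = bpy.data.scenes["VSE Sequence"]
edit_scene.sequence_editor_create()

frame = 1
for name in ("Cube", "Sphere", "Plane"):
    strip = edit_scene.sequence_editor.sequences.new_scene(
        name=name,
        scene=bpy.data.scenes[name],
        channel=1,
        frame_start=frame,
    )
    frame += strip.frame_final_duration

That said, this only places the strips; it does nothing to address the
practical issues raised above.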

> Once this is fixed, then it might be nice to construct an overall
> composite into the VSE scene, perhaps just changing color-tone for the
> entire movie. I don't see any reason to do this in each individual
> scene's composing setup.

Traditionally, grading happens after the final shots are finished and fully
conformed, on a shot-by-shot basis. Each shot may involve a series of
extremely complex grading nodes / tools / techniques.

> Does the VSE need to pre-render proxies for the scenes to handle this?

Why?

Editorial is separate from compositing / heavy lifting, as Mr. Gumster has
highlighted above.

> Does
> it need to downscale the compositing effects?

No, because your picture is already cut and picture locked. Editorial is
already complete.

> The ability to approach real time interactivity through pre-rendering and
> preview-degradation is exactly what NLEs do in their compositing/effect
> pipelines. There is no "magic" solution to pushing pixels. There are
> situations where they can't push pixels in real time, and it requires
> low-resolution preview and/or pre-rendering. This seems very much the same
> as Nuke proxies.

But this glosses over the fact that this is a relatively unworkable process
on even an intermediate-sized project.

Nuke is a shot-by-shot tool. The proxies are utilized to speed up the work
on a single shot that has already been blocked in.
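
To make that concrete, proxy handling in Nuke is a per-project toggle plus
a per-Read proxy path, all scoped to the single shot being worked. A rough
sketch via Nuke's Python API; the knob names are from memory and the file
paths are purely illustrative:

import nuke

# Flip the project into proxy mode while working the shot;
# the final render simply switches this back off.
nuke.root()['proxy'].setValue(True)

# A Read carries both the full-res plate and its pre-built proxy
# (paths here are hypothetical).
read = nuke.createNode('Read')
read['file'].setValue('/shots/sh010/plates/full/sh010.%04d.exr')
read['proxy'].setValue('/shots/sh010/plates/half/sh010.%04d.exr')

Note that everything here lives inside one shot's script, which is rather
different from proxying an entire edit.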

> I also think that, in general, Blender capabilities often lag behind the
> state-of-the-art by a delay which is dwarfed by Moore's Law. Applied to
> above example, I mean that by the time Blender has the capabilities of
> Nuke and Maya, Moore's law may allow real-time 4k compositing on a
> wristwatch. (okay, that's a joke, but I'm trying to make a point).

Again, this is only true if the estimations are based on current needs. The
reality is that needs adjust with the available power, and as such the
carrot consistently remains just beyond the reach of the donkey cart.

See deep compositing for a good example of CPU-draining contemporary needs.

The traditional pipeline exists for good reason, and perhaps others can
chime in with their experiences on visual effects / animation projects to
highlight the realities of complexity on even smaller projects.

> However, I don't see why a single
> pipeline can't function as both an *approaching* real-time preview
> 3d+compositing pipeline, and an *approaching* real-time NLE effect and
> compositing pipeline.

> If there are real technical reasons this approach can run into problems,
> I'd like to understand what they are. However, "this is not the way things
> are done" doesn't really mean anything to me. Blender and open-source 3d
> wasn't the way things were done 10 years ago.

Do the math and you can begin to see why.

Ten years ago, 16-bit float was bleeding edge. Today it is 32-bit float.
Tomorrow it may well be 64-bit float.

Ten years ago perhaps ten passes were used in a render. Today twenty.
Tomorrow perhaps forty.
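
A quick back-of-envelope figure shows how those two trends compound. The
numbers below are purely illustrative, not a benchmark:

# Rough data rate for "realtime" compositing of an HD frame stack.
width, height = 1920, 1080
channels = 4            # RGBA
bytes_per_channel = 4   # 32-bit float
passes = 20

bytes_per_frame = width * height * channels * bytes_per_channel * passes
gb_per_second = bytes_per_frame * 24 / (1024 ** 3)
print(round(gb_per_second, 1))  # ~14.8 GB/s of raw pixels, before any processing

Double the bit depth or the pass count and that figure doubles with it,
which is exactly the moving-target problem.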

If the targets were fixed, we would easily be able to do all of this in real
time. Of course, that also neglects the role of editorial in a project:
sorting out pacing, beats, cuts, etc.

If we visualize a ludicrous example such as any mainstream visual effects
movie, you can hopefully begin to see how badly FCPX, Avid, etc. fail as an
all-in-one solution. They are merely used for blocking in elements and
exporting a text file; the hard work is done by a plethora of other tools.

Don't read this as a dissuasion from exploring how to further streamline
Blender for efficiency. It is merely a caution to keep an eye on the
macro-level design considerations.

With respect,
TJS

