[Bf-committers] VSE Strip-Wise Rendering

Peter Schlaile peter at schlaile.de
Tue Sep 28 20:39:43 CEST 2010


Hi Leo,

> Looking at the code for the VSE it appears solid, but not very modular,
> nor suitable for effects that need access to more than the current
> frame. Since the tools I have fall into that category (the anti-shake,
> for example, needs to compute the optical flow for each pair of frames),
> it is currently near-impossible to port them over in a way that would
> give a good user experience or remain modular enough to be maintainable.

The problem is: the idea behind the VSE is that it should try to do most 
or all things in realtime.

That doesn't alter the fact that we need optical flow, so my idea was: 
add an optical flow builder, similar to the proxy builder in 2.49, and 
link the generated optical flow files to the strips (a rough sketch 
follows below).

That makes it possible to:

a) use optical flow files generated by other software (like the Icarus
    tracker)
b) use optical flow information from scene files or even OpenEXR files
    (I'd think the vector pass together with the Z-pass could be used for
    that)
c) let the optical flow information be calculated in the background when
    none is available, and reuse it later for realtime display.

>    for each frame:
>        for each strip:
>            render
>        composite
>
> gets turned into:
>
>    for each strip:
>        for each frame:
>            render
>    composite

I don't really know how you want to do that in realtime, but maybe I 
misunderstood you.

If you want to display one arbitrary frame in the middle of a sequencer 
edit, what exactly does your code actually do?

My current understanding of your idea is: I'd have to render everything 
from the beginning, and that sounds, uhm, sloooow? :)
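
To spell out my worry (with placeholder function names, this is not 
actual code from either design):

    /* current design: frame n of a strip is fetched directly */
    struct ImBuf *ibuf = render_strip_frame(seq, n);

    /* strip-wise design with temporal effects: to show frame n, every
       frame of the strip up to n may have to be evaluated first */
    for (int cfra = seq->startdisp; cfra <= n; cfra++)
        ibuf = render_strip_frame(seq, cfra);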

> This way, we could do frame rate conversion naturally. We could do
> speedup/slowdown, interpolation, anti-shake, and everything easily.
> Effects that only require access to the current frame would still work
> as a kernel inside a strip.

Since the common base here is optical flow, I'd think it is better to 
generate optical flow files and use them with the current design.
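
For example, once the flow files are there, a retiming or interpolation 
effect could still fit the current one-frame-in, one-frame-out model, 
roughly like this (again, the helper names are made up):

    /* warp the previous frame towards the current one using a precomputed
       flow field; t is the sub-frame position in [0, 1] */
    static void flow_warp(struct ImBuf *prev, struct ImBuf *out,
                          struct FlowField *flow, float t)
    {
        for (int y = 0; y < out->y; y++) {
            for (int x = 0; x < out->x; x++) {
                float u, v;
                flow_sample(flow, x, y, &u, &v);            /* hypothetical */
                sample_bilinear(prev, x + t * u, y + t * v, /* hypothetical */
                                &out->rect_float[4 * (y * out->x + x)]);
            }
        }
    }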

Anti-shake or motion tracking sound like tools that should run within a 
separate background rendering process. We could add something to the 
interface that enables an effect track to have a custom render/bake run. 
For example: render/bake motion tracking data into fcurves (which would 
feed the entire strip into the effect's bake function only once; we then 
use the fcurves later for the actual frame translation and rotation).
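
Such a bake hook could look roughly like this; all the names below are 
invented for the sake of the example:

    /* hypothetical bake hook: sees the whole input strip once and keys
       its results as fcurves on the effect strip */
    static void antishake_bake(struct Sequence *input, struct Sequence *effect,
                               int start, int end)
    {
        float tx = 0.0f, ty = 0.0f, rot = 0.0f;
        for (int cfra = start; cfra + 1 < end; cfra++) {
            /* compare frame cfra with cfra+1, update the running transform */
            estimate_pair_motion(input, cfra, &tx, &ty, &rot);  /* hypothetical */
            /* key the result; playback later only reads the fcurves, so
               the effect itself stays a cheap per-frame transform */
            insert_strip_key(effect, "transform_x", cfra, tx);  /* hypothetical */
            insert_strip_key(effect, "transform_y", cfra, ty);
            insert_strip_key(effect, "rotation", cfra, rot);
        }
    }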

Since I have to rewrite proxy rendering for 2.5 anyway, we could add 
something like that, too. (The 2.49 proxy builder didn't run in the 
background and was more or less a hack.)

Regarding the tools you have written: do you think that adding a 
per-effect-strip render/bake would solve your problems? (It could be done 
in such a way that the bake function could request arbitrary frames from 
its input track.)
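
In other words, the bake run could get a pull interface along these 
lines (purely illustrative, only IMB_freeImBuf is an existing call):

    /* hypothetical pull interface: the bake function asks its input track
       for any frame it needs, in any order */
    typedef struct ImBuf *(*SeqFrameFetch)(struct Sequence *input, int cfra);

    static void flow_bake(struct Sequence *input, SeqFrameFetch fetch,
                          int start, int end)
    {
        for (int cfra = start; cfra + 1 < end; cfra++) {
            struct ImBuf *a = fetch(input, cfra);
            struct ImBuf *b = fetch(input, cfra + 1);   /* lookahead frame */
            /* e.g. compute and store the optical flow between a and b */
            IMB_freeImBuf(a);
            IMB_freeImBuf(b);
        }
    }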

Cheers,
Peter

--
Peter Schlaile

