[Bf-vfx] Formats and Color

Troy Sobotka troy.sobotka at gmail.com
Sun Apr 8 21:21:51 CEST 2012


On Apr 8, 2012 11:40 AM, "Peter Schlaile" <peter at schlaile.de> wrote:
> > Furthermore, the color management system needs to be able to chain
> > arbitrary transforms at any stage of the process.
>
> sure, should be easy to add.

OCIO provides for chains.
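As a rough sketch of what chaining buys you (plain Python, not OCIO's actual API; every function name here is illustrative only): transforms become composable callables that can be stacked at any stage of the pipeline.

```python
# Illustrative sketch of chained color transforms, in the spirit of an
# OCIO processor chain. Not OCIO's API; names are made up for clarity.

def srgb_decode(v):
    # sRGB electro-optical transfer function (IEC 61966-2-1), per channel.
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def exposure(stops):
    # Scene-linear exposure adjustment as a transform factory.
    gain = 2.0 ** stops
    return lambda v: v * gain

def chain(*transforms):
    # Compose transforms left to right into one callable.
    def apply(v):
        for t in transforms:
            v = t(v)
        return v
    return apply

# Decode an sRGB-encoded value to linear, then push it one stop up.
pipeline = chain(srgb_decode, exposure(1.0))
```

The point is only that each stage stays an independent, reorderable unit, which is what an arbitrary-transform chain requires.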

>
> > What about the internal working space of Blender? Modern motion
> > picture cameras have gamuts that greatly exceed the SRGB primaries
> > enforced by the image loading code.
>
> but shouldn't Blender actually just work in float space, only clamping the
> value at the final rendering step and everything should be fine
> and dandy? (still scratching my head on this one)

The internal 32bpc format is not the issue.

The issue is that many seem to think that defining a sufficiently deep
linear system is enough. It is not.

The internal space must be mapped to a known model so that all input,
output, and manipulative transforms can be done within the model.

A 709 or digital projection transform would then be done via a LUT, rather than by assuming certain traits of the values. The decompose nodes, for example, currently have to assume the values are sRGB in order to work at all.

The only way around this is to define the internal model's space.

Again, OCIO accounts for this, but all Blender developers will need to be
aware that despite being linear, multiple color spaces are possible and
that assumptions should not be made as they have in the past.
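To make the point concrete, here is a hedged sketch: a linear triplet only acquires colorimetric meaning once its primaries are known, at which point a 3x3 matrix maps it into a reference model. The matrix below is the standard linear Rec.709/sRGB to CIE XYZ (D65) matrix; nothing else here is specific to Blender.

```python
# Why "linear" alone is not enough: the same linear RGB numbers mean
# different colors under different primaries. Defining the model lets
# us map into a known reference space with a 3x3 matrix.
# Standard linear Rec.709/sRGB -> CIE XYZ (D65) matrix:
REC709_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def mat_mul(m, rgb):
    # Multiply a 3x3 matrix by an RGB triplet.
    return [sum(m[r][c] * rgb[c] for c in range(3)) for r in range(3)]

# Rec.709 reference white (1, 1, 1) lands on the D65 white point...
white_xyz = mat_mul(REC709_TO_XYZ, [1.0, 1.0, 1.0])
# ...approximately (0.9505, 1.0000, 1.0890). A wider-gamut space would
# need a different matrix for the very same (1, 1, 1) triplet.
```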

> Half agreed :) I read a little bit more in OCIO documentation. It really
> looks pretty nice. Probably we can replace all the swscaler mumbo jumbo
> code with OCIO.

It would still require the coefficient transforms in order to get as close to a scene referred RGB triplet as possible as a starting point from the codec format. This is relevant for the YCbCr transforms.
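A minimal sketch of why the coefficients matter (standard full-range YCbCr to RGB relations; the default constants are the BT.709 luma coefficients, and the function name is mine, not from any codebase):

```python
# Full-range YCbCr -> RGB via the standard luma coefficients.
# kr/kb differ by standard (BT.601: 0.299/0.114, BT.709: 0.2126/0.0722),
# which is exactly why the decode step must know which one the codec used.

def ycbcr_to_rgb(y, cb, cr, kr=0.2126, kb=0.0722):
    kg = 1.0 - kr - kb
    r = y + 2.0 * (1.0 - kr) * cr
    b = y + 2.0 * (1.0 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    return r, g, b
```

Decoding with the wrong coefficient set shifts every non-neutral color, so this step cannot simply be skipped or guessed.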

> Still, I think that main blender should have to deal with either RGBA byte

Byte is frustratingly worthless for anything beyond a performance gain in the limited real time proxy views.

Rounding errors are the tip of that iceberg, to say nothing of the fidelity of LUT transforms and the like.
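A quick made-up demonstration of the rounding problem: quantising to 256 levels cannot round-trip through even a simple gamma transform, and in an 8-bit pipeline the error is baked in at every step, whereas float round-trips cleanly.

```python
# 8-bit ("byte") quantisation error under a repeated gamma round trip.
# Illustrative only; 2.2 stands in for any per-channel transform.

def to_byte(v):
    # Clamp to [0, 1] and quantise to 256 levels.
    return round(max(0.0, min(1.0, v)) * 255)

def from_byte(b):
    return b / 255.0

v = 0.5
for _ in range(10):
    # Encode to "2.2 gamma", store as a byte, decode, store again:
    v = from_byte(to_byte(v ** (1 / 2.2)))
    v = from_byte(to_byte(v ** 2.2))
# v has drifted away from 0.5; in float the round trip would be exact.
```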

> To my knowledge, you can't do proper white balance after the debayering
> step, since you lose necessary information on the way (it's not exactly
> easy to reconstruct the original bayer matrix...).

In the end, properly collapsing the raw sensor data to a high fidelity RGB triplet still permits white balancing.

Nuke and other software do this, for example. ACES also has white point transforms in its 3D LUTs.
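As a hedged illustration of white balancing after debayer (a simple von Kries-style diagonal scaling on RGB triplets; the data and function name are made up, and real implementations typically work in a chromatic adaptation space rather than directly on camera RGB):

```python
# Diagonal (von Kries-style) white balance on an RGB triplet:
# scale each channel so the recorded white maps to the target white.

def white_balance(rgb, src_white, dst_white=(1.0, 1.0, 1.0)):
    return tuple(c * d / s for c, s, d in zip(rgb, src_white, dst_white))

# A neutral patch recorded under a warm illuminant (made-up values)...
patch = (0.9, 0.8, 0.6)
balanced = white_balance(patch, src_white=(0.9, 0.8, 0.6))
# ...comes out neutral.
```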

Mr. Selan can iterate on the mathematics of this if asked.

> That's the one which isn't exactly BETA. I haven't tried it, but regarding
> his own documentation, it seems to be more in a work in progress state.

Single step transforms currently work in the branch. The F65 LUTs are predicated on an internal ACES model, however.

> I'm still wondering where he got all that noise from. (ISO 800 doesn't
> look exactly like a very large value...)
>

If there were an error in the LUT transforms, the discrepancy between the resultant pixel values could appear as noise.

The original DPX of the car, properly transformed, looked acceptable.

With respect,
TJS