[Bf-vfx] Formats and Color

Andrew Hunter andrew at aehunter.net
Mon Apr 9 18:27:57 CEST 2012


Hey Peter,

On Mon, Apr 9, 2012 at 4:59 AM, Peter Schlaile <peter at schlaile.de> wrote:
> Hi Troy,
>
>> > but shouldn't Blender actually just work in float space, only
>> > clamping the
>> > value at the final rendering step and everything should be fine
>> > and dandy? (still scratching my head on this one)
>> The internal 32bpc format is not the issue.
>> The issue is that many seem to think that defining a sufficiently deep
>> linear system is enough. It is not.
>> The internal space must be mapped to a known model so that all input,
>> output, and manipulative transforms can be done within the model.
>
> okay, I wasn't very precise in my question.
>
> So: why is it exactly a problem to keep the workspace always in Linear RGB,
> granted that we have a float-precision pipeline that doesn't clip?

To be clear, Linear RGB is only two thirds of the information you need
to define a color space. Blender defines the representation of the
code values (float RGB) and the gamma (linear), but not the primaries,
which map those values to actual wavelengths of light.
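
To make that concrete, here is a rough Python sketch of the three
pieces a full definition needs (the class and field names are purely
illustrative, not any real Blender or OCIO API):

    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class ColorSpace:
        code_values: str                   # e.g. "float32 RGB"
        transfer_function: str             # e.g. "linear"
        primaries: Optional[Tuple] = None  # xy chromaticities of R, G, B (+ white point)

    # Blender's internal space as it stands: two thirds defined.
    blender_internal = ColorSpace(
        code_values="float32 RGB",
        transfer_function="linear",
        primaries=None,  # the missing third
    )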

To quote Troy, Blender is ambiguous about that, which is not
inherently a bad thing. Applications like Nuke are also explicitly
ambiguous about primaries. The reason is that a compositor only cares
about matching the elements to the plates; the in-app display is not
considered representative of the final image. Those decisions and
requirements are deferred to color grading applications.

However, if you wish to finish an image, you need a complete color
space for consistency across displays. Take, for example, a saturation
operation. The coefficients required for the math are tied to the
color space primaries. It is far simpler to have the internal working
space of a program explicitly defined, with all operations working on
that space (and all inputs mapping to it and all outputs mapping from
it), than to have to compensate for whether your image has sRGB or
AdobeRGB or S-Gamut or <insert colorspace here> primaries in every
function that needs that information.
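
As a quick illustration (plain Python, not Blender code; the weights
are the standard luminance coefficients for Rec.709 and Rec.601), the
same float triple desaturates to different results depending on which
primaries the pipeline assumes:

    LUMA_WEIGHTS = {
        "sRGB/Rec.709": (0.2126, 0.7152, 0.0722),
        "Rec.601":      (0.2990, 0.5870, 0.1140),
    }

    def desaturate(rgb, amount, weights):
        # Blend a linear RGB triple toward its luminance.
        # amount=0 leaves the color unchanged; amount=1 is fully gray.
        luma = sum(w * c for w, c in zip(weights, rgb))
        return tuple(c + amount * (luma - c) for c in rgb)

    pixel = (0.8, 0.2, 0.1)
    print(desaturate(pixel, 1.0, LUMA_WEIGHTS["sRGB/Rec.709"]))
    print(desaturate(pixel, 1.0, LUMA_WEIGHTS["Rec.601"]))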

> Or: what is the exact reason to use a different workspace color profile?
> (besides nodes that clip values (which they shouldn't IMHO) and speed
> reasons if for example input is YUV, transformations can be done in
> YUV and output is YUV).

Accounting for differences between input devices (Red Epic vs F65 vs
Alexa) requires a common reference to compare against, but having an
unclippable color space is the primary reason.
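
A small sketch of what "unclippable" buys you (plain Python with
numpy; the XYZ -> linear sRGB matrix is the standard one, and the
sample XYZ value is just a made-up saturated green):

    import numpy as np

    XYZ_TO_SRGB = np.array([
        [ 3.2406, -1.5372, -0.4986],
        [-0.9689,  1.8758,  0.0415],
        [ 0.0557, -0.2040,  1.0570],
    ])

    saturated_green_xyz = np.array([0.2, 0.5, 0.1])

    srgb = XYZ_TO_SRGB @ saturated_green_xyz
    print(srgb)                     # red channel goes negative: outside sRGB gamut
    print(np.clip(srgb, 0.0, 1.0))  # clipping throws that information away for good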

> My suggestion is: in a first step, keep the internal workspace of the
> float pipeline in Linear RGB.

I definitely agree with you on that. What I am saying is that the
internal workspace definition is incomplete.

> In a second step, we can make the whole pipeline color management aware, but
> that is really a lot of work. And: I still can't see the benefit over
> a truly float *and* HDR aware RGB pipeline.

I am confused as to what you are saying here.

> Fun fact: the idea behind ACES looks like this: have the workspace *always*
> nailed to ACES colorspace. To finally end this workspace colorspace craziness
> and just work within the one and only workspace called ACES.
> In fact, that's the same thing Blender already did, with an RGB float
> space that shouldn't clip.

You are quite right about the intentions of ACES. However, see my
comments above about Blender's incomplete working space definition.

> So: other direction could be: let's nail blender to ACES color space and
> everything is fine :) Or just keep it the way it is, but do that properly.
>
> But since the transform ACES <-> RGB seems to be a simple matrix multiplication,
> I still can't see the benefit, sorry.

ACES does use float RGB code values to represent colors, but keep in
mind that RGB by itself is not a color space. That "simple matrix
multiplication" is derived from the primaries on both sides, which is
exactly the information Blender's current working space lacks.
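
A rough sketch of where that matrix comes from (numpy; the linear
sRGB -> XYZ matrix is the standard D65 one, the AP0 -> XYZ values are
quoted from the ACES documentation and worth double-checking, and
white point adaptation between D65 and ACES' ~D60 is ignored here for
brevity):

    import numpy as np

    LINEAR_SRGB_TO_XYZ = np.array([
        [0.4124, 0.3576, 0.1805],
        [0.2126, 0.7152, 0.0722],
        [0.0193, 0.1192, 0.9505],
    ])

    ACES_AP0_TO_XYZ = np.array([
        [0.9525524, 0.0,        0.0000937],
        [0.3439664, 0.7281661, -0.0721325],
        [0.0,       0.0,        1.0088252],
    ])

    # Compose sRGB -> ACES2065-1 through XYZ. Without the primaries of
    # both spaces there is nothing to build this matrix from.
    SRGB_TO_ACES = np.linalg.inv(ACES_AP0_TO_XYZ) @ LINEAR_SRGB_TO_XYZ

    print(SRGB_TO_ACES @ np.array([1.0, 0.0, 0.0]))  # sRGB red in ACES code values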

[...]

Sincerely,

Andrew

