[Bf-committers] Colour Management

Tom Wilshaw tom.wilshaw at btinternet.com
Mon Apr 7 18:06:35 CEST 2014


Thank you both for your comments; they have given us a lot to think about.
> Wider gamuts typically will require the clamp not only at the zero
> luminance level, but also at the upper end for values that have been pushed
> beyond their sRGB domain.
The problem with clamping is that it is an awful waste of the wide gamut
source material. If only minimal colour grading is required, there is no harm
in allowing the out of gamut values; the EXR file format, and we think
Blender’s 32 bit float processing, allow for negative RGB values, and if
negative values are retained then positive excursions beyond sRGB are also
possible without skewing the colour. Of course, these values are clipped on
the display without an ODT or some form of gamut mapping.
It just makes processing easier for more complex grading and effects if the
gamut is wider than, or at least as wide as, the source. We hope we all agree
that having a wide gamut working space, whether ACES based or not, would be
an improvement for Blender.
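To make the negative-values point concrete, here is a minimal sketch in pure Python (the XYZ triple is just an illustrative out-of-gamut colour, not measured data) showing that a colour outside sRGB picks up a negative channel after matrix conversion, and that retaining the negative keeps the conversion reversible while clipping does not:

```python
# Published linear-light matrices for sRGB (IEC 61966-2-1, D65 white),
# rounded to four decimals as they are usually quoted:
XYZ_TO_SRGB = [[ 3.2406, -1.5372, -0.4986],
               [-0.9689,  1.8758,  0.0415],
               [ 0.0557, -0.2040,  1.0570]]
SRGB_TO_XYZ = [[0.4124, 0.3576, 0.1805],
               [0.2126, 0.7152, 0.0722],
               [0.0193, 0.1192, 0.9505]]

def apply_matrix(m, v):
    """Multiply a 3x3 matrix by an RGB/XYZ triple."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

# An illustrative saturated colour outside the sRGB gamut, given in XYZ.
xyz = [0.20, 0.45, 0.60]

srgb = apply_matrix(XYZ_TO_SRGB, xyz)        # red channel comes out negative
roundtrip = apply_matrix(SRGB_TO_XYZ, srgb)  # recovers the original XYZ

# Clipping the negative component first shifts the colour on the way back.
clipped = [max(0.0, c) for c in srgb]
shifted = apply_matrix(SRGB_TO_XYZ, clipped)
```

The round trip only works because the negative component survives; this is exactly the property lost when values are clamped at zero.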
> Or XYZ.
We think that, from a technical viewpoint, XYZ would be an excellent
solution. The main problem we see is that, with the exception of Y, it is not
intuitive to artists. “A small adjustment to the Z channel” just doesn’t
convey the same sort of readily interpretable information that one gets in
RGB. We suppose the tools could present RGB to the user while processing in
XYZ, but this would add extra processing and complexity, along with a
(perhaps insignificant) performance cost.
> As stated above, the conversion also yields invalid colors at the upper end
> as well I believe.
>
> The only method that I am aware of to maintain in-gamut values is to:
>
> A) Linearize your space's data.
> B) Scale data to 0-1.0 domain.
> C) Matrix convert to smaller gamut via the aforementioned RGB to XYZ and
> XYZ to RGB.
> D) Clip out of gamut beyond 0-1.0.
> E) Scale back to the scene referred values via the value obtained in B.
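For reference, the quoted A-E recipe can be sketched as follows (a minimal sketch only; it assumes the input is already linearised scene-referred RGB, and that the wide-to-small gamut conversion has been pre-concatenated into a single 3x3 matrix):

```python
def gamut_clamp(pixels, matrix):
    """Clamp scene-referred RGB data into a smaller gamut, per steps A-E."""
    # A) `pixels` is assumed to already be linearised scene-referred RGB.
    # B) Scale so the largest component sits at 1.0.
    peak = max(max(px) for px in pixels)
    scaled = [[c / peak for c in px] for px in pixels]

    # C) Matrix convert to the smaller gamut (e.g. wide RGB -> XYZ -> sRGB,
    #    pre-concatenated into one 3x3).
    def apply(m, v):
        return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]
    converted = [apply(matrix, px) for px in scaled]

    # D) Clip anything outside the 0-1 target gamut.
    clipped = [[min(1.0, max(0.0, c)) for c in px] for px in converted]

    # E) Scale back to scene-referred range using the peak found in B.
    return [[c * peak for c in px] for px in clipped]
```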
Our current approach is to:
	1. Linearise (we got a 1D LUT from the DNG EXIF tag; it just needed scaling).
	2. Matrix convert, without any clipping.
A matrix can be derived by shooting a Macbeth chart and comparing the RGB
values this gives to the ideal RGB values for the colour space one is
converting to. Ideally, this is all done with great precision, having first
measured the exact spectral reflectance of your Macbeth chart, and derived
the ideal values from the spectral response of an ideal camera for the target
colour space (e.g. the ACES RICD). A precisely calibrated light source is
also used; in our case, tungsten lighting and tungsten white balance. We used
published data for the Macbeth chart sRGB code values. The results of
applying the linearisation and matrix on input through our OCIO config, and
then using sRGB and RRT on output, give a very similar appearance to the
camera manufacturer’s 3D LUT for conversion to REC.709. There are some small
issues due to experimental error, which we are working on.
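The fitting step can be sketched as a least-squares solve. The patch data below is synthetic stand-in data purely for illustration; with a real chart, camera_rgb would hold the 24 measured, linearised camera triples and target_rgb the ideal triples in the destination space:

```python
import numpy as np

# Stand-in data: a hypothetical "ground truth" matrix and random patches,
# used here only so the example is self-contained and checkable.
rng = np.random.default_rng(0)
true_matrix = np.array([[ 1.8, -0.5, -0.3],
                        [-0.2,  1.5, -0.3],
                        [ 0.0, -0.4,  1.4]])
camera_rgb = rng.uniform(0.0, 1.0, size=(24, 3))   # 24 Macbeth patches
target_rgb = camera_rgb @ true_matrix.T            # ideal patch values

# Solve target ~= camera @ M.T for the 3x3 matrix M; with real, noisy
# chart measurements this is an overdetermined least-squares fit.
M_t, *_ = np.linalg.lstsq(camera_rgb, target_rgb, rcond=None)
M = M_t.T
```

With measured data, the residual of the fit is a useful sanity check on experimental error.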
If we run our camera’s primaries through our LUT/matrix, we get the following
floating point values:

Red primary:    R:  8.43412  G: -1.77726  B: -1.85315
Green primary:  R: -1.33754  G:  6.84688  B: -4.40338
Blue primary:   R: -2.65493  G: -0.61700  B: 10.77754
White (camera clip point):  R: 4.44056  G: 4.45152  B: 4.51781
As you can see, none of this is clipped, but it can present processing
problems. None of these numbers would need to be negative if ACES, or XYZ,
were used. When we render with the RRT all of these values get gamut mapped
in a very pleasing way.
If it is of interest we could provide some shots to look at, as well as our
LUT/matrix and our OCIO config. We could also make the Python program used to
derive the matrix available.
> The current issue with ACES is that there was an aesthetic / design choice
> baked in for film compatibility. This makes it difficult to ingest say,
> sRGB and get 1:1 back via the RRTs. Using only the primaries of course
> would deviate significantly from the ACES spec, and probably lead to
> problems.
This is probably the heart of the matter then. Aesthetically at least, the
ACES RRT transform works very well for sources with high dynamic range and a
wide gamut; however, we can see that this would be a problem for inputting
sRGB sources. We think that in this regard, Blender’s current approach is
very sound. Why couldn’t Blender use the ACES primaries, but keep the current
options of “default” and “RRT” output? This would just be an OCIO config
change. Another approach would be to incorporate an LMT for sRGB material,
but I agree that this would probably cause more problems than it solved.
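For what it is worth, the kind of config change we have in mind might look roughly like this (an OCIO v1 YAML sketch only; every colorspace name and LUT file name here is made up for illustration, and the matrix values are omitted):

```yaml
ocio_profile_version: 1

roles:
  scene_linear: aces_linear
  reference: aces_linear

displays:
  sRGB:
    - !<View> {name: Default, colorspace: srgb_default}
    - !<View> {name: RRT, colorspace: srgb_rrt}

colorspaces:
  - !<ColorSpace>
    name: aces_linear          # working space on the ACES primaries
    bitdepth: 32f
    isdata: false

  - !<ColorSpace>
    name: srgb_default
    # "default": a plain primaries matrix plus the sRGB transfer curve
    from_reference: !<GroupTransform>
      children:
        - !<MatrixTransform> {matrix: [...]}   # ACES -> linear sRGB, values omitted
        - !<FileTransform> {src: srgb_curve.spi1d, interpolation: linear}

  - !<ColorSpace>
    name: srgb_rrt
    # "RRT": the ACES RRT plus sRGB ODT baked into a 3D LUT
    from_reference: !<FileTransform> {src: rrt_srgb.spi3d, interpolation: linear}
```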
I know that using only the primaries from ACES would be a deviation from the
spec, but Blender could incorporate the ACES spec so that those who want to
use ACES can do so fully, whilst also allowing deviations for those who want
a more traditional video style workflow. What problems would this cause?
> The issue is that there are still some critical areas that are not color
> managed, such as the color picker / wheel. Having the picker broken affects
> even sRGB of course, but is much more obvious in wider gamuts.
>
> Hopefully the picker / wheel and remaining pieces will be addressed in the
> near future.
Personally this has never given me major problems, as a lot of colour work is
done by eye or with the scopes, so the adjustments are subjective and
relative. I don’t know enough about the internal workings of these tools to
really comment. However, if Blender were to move to a wider gamut, would this
need to be done before setting up colour management for the colour wheel?
> ACES by default would cause legacy problems here. XYZ however, would not.
Again, would it be possible to use the ACES space, and then clip out of gamut
or high dynamic range stuff for sRGB or REC.709 outputs if people select
“default”?
> hopefully an OCIO node is soon on the horizon.
I’m very glad to hear this.
> In fact in VFX, you usually don't have to mess with it because the Edit
> department took care of it and you only end up with linear footages.
I think this varies quite a lot. Certainly in many low budget projects the
editor works with proxies, either with a 3D LUT or some baked in dailies
colour correction. An EDL is used to conform the camera original material for
the colourist or for effects. Although high end projects may get linear data
straight away by converting the camera raw files into EXRs, on smaller
productions it is very common for the camera original to be log ProRes or
similar. I don’t doubt that this is very variable depending on the project.
In any case, whether effects get linear footage or have to linearise it
themselves, they also generate material such as background plates themselves
with a variety of cameras, for example DSLR raw files.

How was the F65 wide gamut footage handled on Tears of Steel? Was the out of
gamut stuff clipped off?
In summary then:
	1. Do we agree that having a wide gamut working space, ideally one that encompasses the full visible range, would be good for Blender?
	2. If so, and the choice is between XYZ and ACES, which way would you choose? The actual gamut is the same, and ACES has the advantage of being an RGB encoding.
	3. Why couldn’t the ACES space be used, but with more flexible input and output options? The entire ACES spec could be encompassed within this larger system, for full compatibility.
	4. The film industry (live action and animated), as well as parts of the television industry, are moving towards using ACES. It is a well designed system that is about as future proof as such things can be. If we want Blender to see increased use within these industries then we need the greatest compatibility possible with other tools. I know Blender can do just about everything, but those who use it in conjunction with other programs might find a colour management system with wide compatibility helpful.
Thank you for taking the time to discuss this with me; whether we agree or
not, it is very interesting to hear the directions Blender might take, and
the thinking behind these decisions.
Tom and Owain


________________________________
From: Troy Sobotka <troy.sobotka at gmail.com>
To: Tom Wilshaw <tom.wilshaw at btinternet.com> 
Cc: bf-blender developers <bf-committers at blender.org> 
Sent: Friday, 4 April 2014, 20:20
Subject: Re: [Bf-committers] Colour Management




On Apr 4, 2014 8:07 AM, "Tom Wilshaw" <tom.wilshaw at btinternet.com> wrote:
> This can be done on scene linear data with a 3x3
> matrix, but doing so causes those RGB values which fall outside the sRGB gamut
> to be represented with one or more negative RGB values; this can present
> problems for some compositing operations. The ACES gamut can accommodate all
> current and likely future gamuts without negative values because it encompasses
> the entire visible spectrum. Of course, sRGB material will fit into this gamut
> fine, so no compatibility issues should arise due to using existing sRGB material. 
Wider gamuts typically will require the clamp not only at the zero luminance level, but also at the upper end for values that have been pushed beyond their sRGB domain. 
> If Blender's native working space
> were ACES, then we could create a camera to ACES matrix from the Macbeth chart,
> mix with other sources including CG renders with a wide gamut, and output to
> any display type we wanted. All of this without having to contend with
> compositing images with negative RGB values. 
Or XYZ. 
As stated above, the conversion also yields invalid colors at the upper end as well I believe. 
The only method that I am aware of to maintain in-gamut values is to: 
A) Linearize your space's data.
B) Scale data to 0-1.0 domain.
C) Matrix convert to smaller gamut via the aforementioned RGB to XYZ and XYZ to RGB.
D) Clip out of gamut beyond 0-1.0.
E) Scale back to display referred values via the value from B. 
> ACES also allows the creation of
> trim passes for output displays other that the mastering display without
> additional colour correction by employing a variety of ODTs, although Blender's
> current implementation addresses this. However, for wider gamut releases, such
> as DCPs, it would be better to have a wider gamut source than sRGB, to better
> fill the range of available colours and give a richer and more realistic
> result. This is one instance where a move to ACES would be beneficial to
> content created entirely within Blender, rather than live action footage only. 
The current issue with ACES is that there was an aesthetic / design choice baked in for film compatibility. This makes it difficult to ingest say, sRGB and get 1:1 back via the RRTs. Using only the primaries of course would deviate significantly from the ACES spec, and probably lead to problems. 
Within the Blender community, I believe this would cause more confusion than desired. 
Something like XYZ is likely a better fit, and is 100% compatible with the existing Blender paradigm. 
Trim passes are already possible via OCIO. 
> Could a move to ACES be affected
> simply by changing the OCIO config, or would the tools in the compositor
> require some recalibration? 
You can obtain the official ACES OCIO configuration and use it. 
The issue is that there are still some critical areas that are not color managed, such as the color picker / wheel. Having the picker broken affects even sRGB of course, but is much more obvious in wider gamuts. 
Hopefully the picker / wheel and remaining pieces will be addressed in the near future. 
> We sent this to Ton, who advised us
> to send it to the mailing list. He raised the point that “filmers prefer Aces,
> but animation/vfx pipelines (apparently) not. Blender is about the latter”.  
With specific attention to the Blender community, it remains problematic due to the RRTs. Many animators within the Blender community expect a perfect 1:1 input to output chain, and this is currently impossible without the aesthetic choices ACES had baked into the RRTs. 
ACES by default would cause legacy problems here. XYZ however, would not. 
> As an aside, but whilst discussing
> colour management, we have often thought that the addition of an OCIO node
> would be a very helpful feature. Some operations, such as green screen keying,
> work better on low dynamic range sources, and the ability to convert to log. 
Agreed. 
In theory, this is already somewhat possible by crafting custom OCIO configs, but hopefully an OCIO node is soon on the horizon. It has been needed for a while. 
So the TL;DR is that ACES or any other space can be partially implemented in Blender by changing the OCIO configs. The downside is that some areas are still not color managed, with particular attention to the color wheel / picker, which requires a perceptual curve change (to log for example) and a primaries change. 
With respect,
TJS 

