[Bf-committers] Colour Management

François T. francoistarlier at gmail.com
Sat Apr 5 15:03:09 CEST 2014


>
> He raised the point that “filmers prefer Aces but animation/vfx pipelines
> (apparently) not. Blender is about the latter”. Our understanding is that
> ACES is extremely beneficial for visual effects pipelines because they
> mix so many different sources of images.


That's not entirely true, but not entirely false either. In VFX you
usually don't have to deal with it yourself, because the edit department
has already taken care of it and you only receive linear footage; only
the editors and the colorist really have to handle it. We sometimes get
a few LUTs to simulate the output, but that's all. So animation/VFX does
deal with it at some point, even without knowing it, just in a specific
department. What I mean is that it is handled somewhere in the pipeline.


F.


2014-04-04 17:04 GMT+02:00 Tom Wilshaw <tom.wilshaw at btinternet.com>:

> What we want to discuss is the
> possibility of further integration of the ACES/IIF colour management system
> into Blender. We don't know how familiar you are with this system so we
> apologise if any of this seems patronising. Conversely, if any of this is
> unfamiliar we would refer you to the Wikipedia page (
> http://en.wikipedia.org/wiki/Academy_Color_Encoding_System)
> and more importantly the ACES documentation (
> https://www.dropbox.com/sh/nt9z9m6utzvkc5m/ebopy8K7Y6).
> The document “ACES_v1.0.1” provides an overview of the system.
>
> Currently, when importing footage or images from a wide gamut source
> into Blender, they must be converted to the native sRGB working space.
> This can be done on scene linear data with a 3x3 matrix, but doing so
> causes those RGB values which fall outside the sRGB gamut to be
> represented with one or more negative RGB values; this can present
> problems for some compositing operations. The ACES gamut can
> accommodate all current and likely future gamuts without negative
> values because it encompasses the entire visible spectrum. Of course,
> sRGB material fits into this gamut fine, so no compatibility issues
> should arise from using existing sRGB material.
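
To make the negative-value point concrete, here is a minimal numpy
sketch. The 3x3 matrix is an approximate ACES AP0 (ACES2065-1) to
linear sRGB conversion with rounded coefficients; the exact values
depend on the chromatic adaptation used, so treat them as illustrative
only.

    import numpy as np

    # Approximate ACES AP0 (ACES2065-1) -> linear sRGB matrix.
    # Coefficients are rounded and depend on the chromatic adaptation
    # chosen; they are illustrative, not authoritative.
    AP0_TO_SRGB = np.array([
        [ 2.5217, -1.1341, -0.3876],
        [-0.2765,  1.3727, -0.0962],
        [-0.0154, -0.1530,  1.1684],
    ])

    # A saturated green that is perfectly legal in the ACES gamut...
    aces_rgb = np.array([0.0, 0.5, 0.0])

    # ...lands outside the sRGB gamut after conversion: the red and blue
    # channels go negative, which is what trips up some compositing
    # operations.
    srgb_rgb = AP0_TO_SRGB @ aces_rgb
    print(srgb_rgb)  # roughly [-0.567, 0.686, -0.077]
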
>
> If Blender's native working space were ACES, then we could create a
> camera-to-ACES matrix from the Macbeth chart, mix with other sources
> including wide gamut CG renders, and output to any display type we
> wanted, all without having to contend with compositing images that
> contain negative RGB values.
>
> We would be interested to know how these issues were addressed on
> Project Mango with the Sony F65; from Sebastian Koenig's post it sounds
> as if the gamut was clipped or negative values were accepted, but we
> could have misinterpreted this.
>
> ACES also allows the creation of trim passes for output displays other
> than the mastering display, without additional colour correction, by
> employing a variety of ODTs, although Blender's current implementation
> already addresses this. However, for wider gamut releases such as DCPs,
> it would be better to have a wider gamut source than sRGB, to better
> fill the range of available colours and give a richer and more
> realistic result. This is one instance where a move to ACES would
> benefit content created entirely within Blender, rather than live
> action footage only.
>
> Could a move to ACES be effected simply by changing the OCIO config, or
> would the tools in the compositor require some recalibration? I suppose
> changes would have to be made to the render engines, or would simply
> supplying ACES texture images and colour pickers be enough to produce
> ACES renders? The ACES RICD flare component would have to be added if
> it isn't there already. Prior to the start of the next open movie seems
> like a great time to fully switch to ACES; not only would it make
> compatibility between studios easier and help different programs share
> images (both are amongst the main aims of the ACES initiative), it
> would also allow the creation of a wider gamut DCP.
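
For reference, the way the working space is taken from the OCIO config
can be seen with a few lines of the OCIO v1 Python bindings; the config
path and colour space names below are assumptions and will differ
between configs.

    import PyOpenColorIO as OCIO

    # Hypothetical path to an ACES OCIO config; pointing Blender at a
    # different config is what "moving to ACES" means at the OCIO level.
    config = OCIO.Config.CreateFromFile("/path/to/aces/config.ocio")

    # The working space is whatever the config's scene_linear role
    # resolves to, so the working space follows the config.
    print(config.getColorSpace("scene_linear").getName())

    # Display output goes through processors built from the same config;
    # "sRGB" stands in for whatever display space the config defines.
    processor = config.getProcessor("scene_linear", "sRGB")
    print(processor.applyRGB([0.18, 0.18, 0.18]))
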
>
> Although one of us is capable of some coding, a project like this is
> currently beyond our ability; however, we would be very happy to
> contribute to documentation for users, as well as to help in any way we
> could with testing.
>
> Another aspect of this system is the IDT. Whilst the creation of IDTs
> according to the ACES specification would require the integration of
> CTL, as well as a complex experimental set-up with the camera in
> question, an approximation which meets the required behaviours listed
> in the Academy's IDT specification can be achieved with a linearization
> LUT and a matrix. We have created a program to derive the matrix,
> although this is a work in progress and could be greatly improved.
> There are many academic papers on the creation of linearization LUTs,
> although we have yet to implement one, as we have taken this
> information from a DNG tag for our own camera. For high-end cinema
> cameras, the manufacturers provide IDTs in CTL, but these can be
> converted to OCIO LUTs and matrices.
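
One way to derive such a matrix (a rough sketch, not the program
mentioned above) is an ordinary least-squares fit of linearized camera
RGB for the 24 Macbeth patches against the reference ACES values for the
same patches; real IDT derivations usually add constraints such as
preserving the white point, which this omits.

    import numpy as np

    # Placeholder data so the sketch runs: linearized camera RGB for the
    # 24 Macbeth patches and the corresponding reference ACES RGB values.
    # In practice these come from a chart exposure and published patch
    # data, not random numbers.
    camera_rgb = np.random.rand(24, 3)
    aces_ref = np.random.rand(24, 3)

    # Least-squares fit of X such that camera_rgb @ X ~= aces_ref.
    X, residuals, rank, sv = np.linalg.lstsq(camera_rgb, aces_ref, rcond=None)

    # The 3x3 camera-to-ACES matrix, applied to column-vector pixels.
    M = X.T
    print(M @ np.array([0.18, 0.18, 0.18]))
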
>
> We sent this to Ton, who advised us to send it to the mailing list. He
> raised the point that “filmers prefer Aces, but animation/vfx pipelines
> (apparently) not. Blender is about the latter”. Our understanding is
> that ACES is extremely beneficial for visual effects pipelines because
> they mix so many different sources of images: for example, live action
> footage from film scans or cinema cameras, background plates and
> textures from DSLRs, CG renders, and matte paintings. The differences
> between these sources are minimised in an ACES workflow. For animation,
> ACES could improve the finished result by providing a wider range of
> colours, even when mapped to a low gamut output by the ODT, and by
> better handling high dynamic range renders through the RRT’s pleasing
> tone rendering. In many ways, the change to an ACES workflow would be
> largely invisible to an end user.
>
> As an aside, whilst we are discussing colour management, we have often
> thought that the addition of an OCIO node would be a very helpful
> feature. Some operations, such as green screen keying, work better on
> low dynamic range sources, and the ability to convert to log and back
> within a node tree would be extremely helpful (see Jeremy Selan's
> "Cinematic Color", p. 35).
>
> Owain and Tom Wilshaw
>
> _______________________________________________
> Bf-committers mailing list
> Bf-committers at blender.org
> http://lists.blender.org/mailman/listinfo/bf-committers
>



-- 
____________________
François Tarlier
www.francois-tarlier.com
www.linkedin.com/in/francoistarlier

