[Bf-viewport] Color Management in the Viewport and UI in general

Lukas Stockner lukas.stockner at freenet.de
Sun Oct 9 20:22:05 CEST 2016


Yes, that sounds reasonable.

To get correct blending, the 8-bit buffers have to be encoded with the sRGB transfer curve (aka gamma).
So, the color conversion pipeline would be:

UI stuff: Linear colors -> sRGB transfer curve -> 8-bit sRGB buffer
HDR stuff: Linear colors -> 16-bit half buffer

Final compositing: (8-bit sRGB buffers -> Inverse sRGB transfer curve) mixed with 16-bit half buffers -> OCIO display transform -> 10-bit final window buffer

This will produce correct results, since the colors are correctly linearized for blending in the 8-bit buffers. The only downside is some slight precision loss when the transfer curve of the display transform differs significantly from sRGB - but the color-critical stuff is stored as linear half anyway.
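
For reference, the transfer curve I mean is the standard piecewise sRGB one (IEC 61966-2-1) - a small sketch, with purely illustrative function names:

    // Sketch: piecewise sRGB transfer curve (encode) and its inverse (decode),
    // following IEC 61966-2-1. Function names are purely illustrative.
    #include <cmath>

    static float srgb_encode(float linear)
    {
        return (linear <= 0.0031308f) ? 12.92f * linear
                                      : 1.055f * std::pow(linear, 1.0f / 2.4f) - 0.055f;
    }

    static float srgb_decode(float encoded)
    {
        return (encoded <= 0.04045f) ? encoded / 12.92f
                                     : std::pow((encoded + 0.055f) / 1.055f, 2.4f);
    }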

In fact, the existence of an intermediate window buffer makes it *a lot* easier - the code can just always write either linear or sRGB-transfer-curve-encoded values, and the display-specific part is handled in one place. That way, we can even keep caching the transformed colors in e.g. Images, since the intermediate encoding is always the same.
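
Just to sketch what that "one place" could look like on the CPU side (this is not a concrete proposal - the display and view names below are placeholders from a typical OCIO config):

    // Sketch: per-window display transform applied in one place via OCIO 1.x.
    // Display/view names are placeholders, not the actual Blender settings.
    #include <OpenColorIO/OpenColorIO.h>
    namespace OCIO = OCIO_NAMESPACE;

    void apply_display_transform(float rgb[3])
    {
        OCIO::ConstConfigRcPtr config = OCIO::GetCurrentConfig();

        OCIO::DisplayTransformRcPtr dt = OCIO::DisplayTransform::Create();
        dt->setInputColorSpaceName(OCIO::ROLE_SCENE_LINEAR);
        dt->setDisplay("sRGB");    /* would come from the window's setting */
        dt->setView("Standard");   /* placeholder view name */

        OCIO::ConstProcessorRcPtr processor = config->getProcessor(dt);
        processor->applyRGB(rgb);  /* in-place transform of a single color */
    }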

Awesome!

Am 09.10.2016 um 19:22 schrieb Mike Erwin:
> Thanks Lukas for bringing all this up.
> 
> Some quick notes:
> 
> With OpenGL we can render to 8-bits-per-component sRGB and sRGB + alpha buffers internally for things that don't need deep/HDR color. I'm thinking tasks that don't involve real materials or scene lights. Also UI element/widget overlays. That gives us more mileage out of each 8-bit component. OpenGL automatically handles the conversions to & from linear when we read/write/blend.
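
Just to make the automatic handling concrete: that is the GL_FRAMEBUFFER_SRGB path. A minimal sketch, with hypothetical handle names and no error checking:

    // Sketch: 8-bit sRGB + alpha color target; GL converts between linear and
    // sRGB on read/write/blend. Assumes a GL 3.x context and loader header;
    // names are hypothetical and error checking is omitted.
    GLuint make_srgb_target(int width, int height)
    {
        GLuint tex, fbo;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);

        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, tex, 0);

        // While enabled, writes to the sRGB attachment are encoded and blending
        // is performed on the decoded (linear) values; sampling the texture
        // later decodes automatically as well.
        glEnable(GL_FRAMEBUFFER_SRGB);
        return fbo;
    }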
> 
> Tasks that do need HDR color will be rendered to float16 (aka half) RGBA buffers. These are linear values and don't need special treatment.
> 
> My thought is to composite all these into the window-sized 10.10.10 RGB color buffer. Do 10-bit components need to be stored sRGB also, or is that overkill? I still need to research whether the APIs offer 10-bit sRGB buffers.
> 
> Display color transform can be part of this final compositing, the last step. I strongly prefer that we do it all in one place so other shaders / parts of the code don't have to handle this.
> 
> We do have to be more careful about color & image inputs. Most come from a known color space, or can be assumed sRGB.
> 
> Good point about multiple monitors with different capabilities. That's something we should aim for.
> 
> Mike Erwin
> musician, naturalist, pixel pusher, hacker extraordinaire
> 
> On Sun, Oct 9, 2016 at 11:32 AM, Lukas Stockner <lukas.stockner at freenet.de> wrote:
> 
>     Hi all!
> 
>     Since the Viewport and generally all areas concerned with OpenGL are being overhauled for 2.8, this would be a great time to also think about incorporating Color Management into all UI aspects.
> 
>     First of all, a summary of what's the issue:
>     OpenGL itself doesn't care about Color Management - the software writes an RGB triple, and it is sent straight to the screen. Ideally, the same triple would always give you the same color.
>     However, that's not true, for two reasons: First of all, screens aren't exact - every model, and even every individual screen, deviates from the standard a bit.
>     Also, and even more importantly, there isn't just "the" standard - most screens (try to) conform to sRGB, and that's what Blender currently assumes. However, especially for professional graphics work, screens with a wider color space are often used, and now that new iMacs come with DCI-P3 screens, it's important to support that. To use an extreme example: even if the screen swaps green and blue, it has to be possible to configure Blender so that it looks as it always does. Of course that's not an issue that comes up in practice, but it helps to emphasize that CM really must be applied to everything that's drawn - if swapping channels isn't possible, then correct CM isn't possible.
> 
>     The current state:
>     Blender uses OpenColorIO (OCIO) for color management. The user can select a display color space in the Scene options. However, that is only used for a few specific cases - for example, Cycles viewport rendering and the Image Viewer apply the correct transform, but the Viewport and UI elements don't. Also, the display transform is used for some things where it shouldn't be used, for example, to convert color values into Hex format in the color picker.
>     Another issue is that there is only one setting for all of Blender - when the user has e.g. an expensive wide-gamut monitor for color-critical work and a normal sRGB one for other tasks like modelling, there's no way to get correct colors on both, and the UI colors won't be consistent between them.
> 
>     What should be done:
>     - First of all, it needs to be clear for every color value which color space it is in. Most colors are stored as float arrays, and for those the OCIO reference space is the obvious choice. For colors where storage size matters (for example, Images or Vertex Colors), precision becomes a concern (storing linear values in a 24-bit color isn't the best idea), so it must at least be clear from the context which space is used. For Images that is already the case, but for others it isn't.
>     - Then, the user must be able to select a different Display Color Space for every Blender Window in order to support multi-display configurations. This also means that caching the transformed values, as is currently done for Images, becomes tricky, because they might be different for every window.
>     - Finally, every color that is drawn to the screen has to go through an OCIO transform from the space in which it's stored to the correct display space. For this, there are two options: The first is a regular C-based API that maps colors, which is fast enough for individual colors like UI elements. The second option is generating a GLSL snippet and a 3D LUT texture that allow the transformation to be applied from within GLSL shaders, which should be used for things like Textures (it's already used for the Cycles Viewport, for example) - a rough sketch of that path follows below.
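
As a rough illustration of that second option (not the actual viewport code - the edge size and function name below are just example values), the OCIO 1.x GPU path looks roughly like this:

    // Sketch: generate the GLSL snippet and its 3D LUT from an OCIO processor
    // (OCIO 1.x GPU path). Edge size and function name are example values.
    #include <OpenColorIO/OpenColorIO.h>
    #include <string>
    #include <vector>
    namespace OCIO = OCIO_NAMESPACE;

    static const int LUT3D_EDGE_SIZE = 64;

    void build_gpu_display_transform(OCIO::ConstProcessorRcPtr processor,
                                     std::string &shader_text,
                                     std::vector<float> &lut3d)
    {
        OCIO::GpuShaderDesc desc;
        desc.setLanguage(OCIO::GPU_LANGUAGE_GLSL_1_3);
        desc.setFunctionName("OCIODisplay");   /* name of the generated function */
        desc.setLut3DEdgeLen(LUT3D_EDGE_SIZE);

        /* GLSL snippet to append to the drawing shader. */
        shader_text = processor->getGpuShaderText(desc);

        /* RGB LUT data, to be uploaded as a GL_TEXTURE_3D for the snippet. */
        lut3d.resize(3 * LUT3D_EDGE_SIZE * LUT3D_EDGE_SIZE * LUT3D_EDGE_SIZE);
        processor->getGpuLut3D(lut3d.data(), desc);
    }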
> 
>     The third point is the reason why I bring this up now - since all the drawing code is being modernized anyway, this is the perfect opportunity to introduce the proper conversions.
> 
>     Now, I don't really have that much experience with the UI and drawing code of Blender (yet), which is why I don't include a particular implementation suggestion in this mail. However, I'd be interested in helping out with CM, and hope that people with more insight into the overall code layout can suggest how and where to actually implement that on the UI side :)
> 
> 
> _______________________________________________
> Bf-viewport mailing list
> Bf-viewport at blender.org
> https://lists.blender.org/mailman/listinfo/bf-viewport
> 
