[Bf-vfx] Formats and Color

Ton Roosendaal ton at blender.org
Mon Apr 9 14:45:22 CEST 2012


Hi Andrew,

You jump to conclusions on issues too quickly:

>>> comments on the mango blog about how the author of the DPX code didn't know
>>> that there was more than one log curve that could be used shows the folly
>>> of embedding that transform at that level. If something isn't correct, there
>>> is no way for the user to adjust.


- He is not the author of the DPX code in Blender, but a volunteer fixing up errors in it.

- By design, Blender demands that all float buffers are linear internally. The code that reads and writes DPX should apply the log curve (or any other transfer curve) locally, at the file I/O level only (see the small sketch after these points).

- The DPX test by Sebastian was done only because the Sony software has OpenEXR and DPX output from raw. We never intended to use this in the pipeline.
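
As a minimal sketch of what "locally, at the file I/O level" means in practice (this is not Blender's actual DPX code, and the 685/95/0.6 numbers below are only the common Kodak/Cineon defaults, which is exactly why the curve has to stay a configurable file I/O detail):

  import numpy as np

  def cineon_log_to_linear(code, ref_white=685.0, ref_black=95.0,
                           density_per_code=0.002, gamma=0.6):
      # 'code' is the 10-bit integer value (0..1023) stored in the DPX file.
      offset = 10.0 ** ((ref_black - ref_white) * density_per_code / gamma)
      log = 10.0 ** ((code - ref_white) * density_per_code / gamma)
      return (log - offset) / (1.0 - offset)

  # Build the 1024-entry table once, then convert whole frames by lookup on
  # load; writing goes through the inverse. Nothing log-encoded ever reaches
  # the linear float buffers that the rest of Blender sees.
  lut = cineon_log_to_linear(np.arange(1024, dtype=np.float64))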

> Loading EXR files clamps the range to 0-1, which, if the file is
> scene-linear, throws out much of the data. Furthermore, many of the
> image nodes clamp to the same 0-1 range.

- Blender allows EXR to save and read all values between plus and minus infinity (a quick check of this is sketched below).

- Composite nodes also allow unlimited RGB ranges, with exceptions only where needed (preventing negative blur sizes, color range mapping, etc.).
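
A quick way to convince yourself that nothing needs to be clamped on the file side either; this is a sketch using the standalone OpenEXR Python bindings (not Blender code), writing values far outside 0-1 and reading them back:

  import array
  import OpenEXR, Imath

  FLOAT = Imath.PixelType(Imath.PixelType.FLOAT)
  width, height = 4, 1

  # Scene-linear samples well outside 0..1, e.g. a specular highlight at 47.0.
  data = array.array('f', [-0.25, 0.0, 1.0, 47.0]).tobytes()

  header = OpenEXR.Header(width, height)
  header['channels'] = dict((c, Imath.Channel(FLOAT)) for c in 'RGB')

  out = OpenEXR.OutputFile('range_test.exr', header)
  out.writePixels({'R': data, 'G': data, 'B': data})
  out.close()

  back = OpenEXR.InputFile('range_test.exr')
  r = array.array('f', back.channel('R', FLOAT))
  print(list(r))   # [-0.25, 0.0, 1.0, 47.0] -- the full range survives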

For the rest, I have the impression it's all being made too complicated. Let's stick to design ideas for an efficient workflow in Blender, which is aimed specifically at 3D artists (who are not color experts). I am sure we can keep it nice and simple and still satisfy strict color quality demands :)

-Ton-

------------------------------------------------------------------------
Ton Roosendaal  Blender Foundation   ton at blender.org    www.blender.org
Blender Institute   Entrepotdok 57A  1018AD Amsterdam   The Netherlands

On 8 Apr, 2012, at 18:32, Andrew Hunter wrote:

> Hey Peter,
> 
> On Sun, Apr 8, 2012 at 9:45 AM, Peter Schlaile <peter at schlaile.de> wrote:
>> Hi Andrew,
>> 
>> I have trouble understanding some of your text:
> 
> Hopefully I can make myself clearer :)
> 
>>> Perhaps the greatest challenge to supporting ACES in Blender is the need to
>>> excise most of the image loading and color related code.
>> 
>> I see three points where we have to add color management stuff:
>> 
>> a) at image/movie load time
>>   for the 8 bit pipeline it should integrate with swscaler/ffmpeg
>>   and should be changeable by the user.
> 
> Call me crazy, but I feel that Blender should only accept image
> sequences rather than movies. That's ideological, but it simplifies many
> things.
> 
>> b) at display time
>>   preview displays should be able to switch between output profiles.
>> 
>> c) at render output time
>>   again: configurable for the user.
> 
> Furthermore, the color management system needs to be able to chain
> arbitrary transforms at any stage of the process.
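
For reference, the per-stage, user-configurable transforms in a)-c) above, and the chaining Andrew asks for, map fairly directly onto OCIO's API. A rough Python sketch; the colorspace names are only illustrative and come from whatever config $OCIO points to, not from Blender:

  import PyOpenColorIO as OCIO

  config = OCIO.GetCurrentConfig()       # the config the user selected via $OCIO

  # a) load time: file/camera space -> internal scene-linear
  to_linear = config.getProcessor('srgb8', 'lnf')

  # b) display time: scene-linear -> whatever the viewer is calibrated for;
  # c) render output is just another processor in the other direction.
  to_display = config.getProcessor('lnf', 'srgb8')

  pixel = [0.18, 0.18, 0.18]
  linear = to_linear.applyRGB(pixel)     # CPU path; OCIO can also emit GLSL
  shown = to_display.applyRGB(linear)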
> 
>> 
>> It doesn't look like a really big change to me. Since I currently need
>> to add Technicolor CineStyle support for my Canon EOS 5D, does anyone
>> mind if I start coding, starting with xat's branch, which seems to use
>> OCIO?
> 
> What about the internal working space of Blender? Modern motion
> picture cameras have gamuts that greatly exceed the sRGB primaries
> enforced by the image loading code.
> 
>>> Looking at the
>>> comments on the mango blog about how the author of the DPX code didn't know
>>> that there was more than one log curve that could be used shows the folly
>>> of embedding that transform at that level. If something isn't correct, there
>>> is no way for the user to adjust.
>> 
>> I don't think bashing developers helps a lot. What you should keep in
>> mind: color space transformations can be done very fast on input data,
>> but are pretty slow on float, so the best place is at file load, using
>> look-up tables.
> 
> I wasn't trying to bash the developer, rather to illustrate the mistake
> in that approach. Once that transform is baked into the data going
> into Blender, there is little a user can do to correct for it.
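
Peter's look-up-table point above, in a few lines of Python for concreteness (the transfer curve here is a placeholder, not any real camera's):

  import numpy as np

  def transfer(code):
      # Placeholder curve for illustration only.
      return (code / 1023.0) ** 2.2

  # For 10-bit integer input there are only 1024 possible values, so the
  # transform is evaluated once per code value ...
  lut = transfer(np.arange(1024, dtype=np.float32))

  # ... and applying it to a whole frame is a single table lookup per pixel,
  # far cheaper than evaluating the curve on float data later on.
  frame_10bit = np.random.randint(0, 1024, size=(1080, 1920))
  frame_linear = lut[frame_10bit]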
> 
> What I am suggesting is that the image loading code should just serve
> the data. Nothing more :)
> 
> This view is admittedly biased; it is the approach that OpenImageIO takes.
> The artist is far more knowledgeable about the context of image
> production than the developer. One should enable the other.
> 
>> At least, for a CPU implementation. For GPU, it doesn't really matter.
> 
> OpenColorIO works on both the CPU and the GPU.
> 
>> That said: sure, the user should be able to control which profile is in
>> use.
> 
> I'm glad we agree on that.
> 
>>> With the current incomplete colour management system, Blender's dichotomy
>>> of sRGB versus sRGB primaries with linear gamma would need to be removed.
>>> Also, ACES explicitly requires support for HDR values (the full ± range of
>>> 16-bit floating point) internally and only clamps to a 0-1 range using the
>>> RRT.
>> 
>> The float pipeline already has that feature. Correct me if I'm wrong.
> 
> Loading EXR files clamps the range to 0-1, which, if the file is
> scene-linear, throws out much of the data. Furthermore, many of the
> image nodes clamp to the same 0-1 range.
> 
>>> As for reading F65 Raw, only applications that use the Sony SDK (like
>>> Davinci Resolve) will support the raw files natively.
>> 
>> <troll>
>> Hmm, well, sure. Only applications that have support for a certain
>> format can read it.
>> </troll>
> 
> Ton asked if OCIO somehow worked on raw data. I was answering. On a
> related note, there has been a proposal on the OIIO mailing list for
> a GSoC project to implement raw support (via libraw).
> 
>>> Raw is inappropriate for heavy VFX work; I will elaborate on the
>>> reasons why later in the mail. It is, however, useful for colour grading.
>> 
>> Reading the rest of your post, I still haven't understood why "Raw is
>> inappropriate for heavy VFX work".
> 
> It has to do with what a "raw" image intrinsically is.
> 
>> I do know that most studios don't use RAW data for that, but to my
>> knowledge that has to do with decoding speed. (It's simply faster to do
>> all the grading work on HD proxies and use the RAW data in the conforming
>> step at the end.)
> 
> It also has to do with the fact that raw isn't a complete image. Say you
> want to composite an alpha over it. You need to do all the raw processing
> steps beforehand to get data that makes sense in that context (RGB values).
> I would argue that data storage is significantly cheaper than equivalent
> processing time, so it makes sense to bake in those decisions, like white
> balance and denoising, and then do your compositing.
> 
> All of my experience as a client in colour grading sessions has had
> us working at full resolution. Watching Baselight render complex
> transforms, as well as tracked secondary corrections, at 4K in real time
> is a marvellous thing.
> 
>> In fact, regarding the data sizes we are talking about, I think direct
>> support for F65 RAW files is a pretty clever thing.
>> 
>> Especially since you can do certain color transformations on RAW files
>> which aren't that easy (or even possible) after the debayering step. (White
>> balance comes to mind.)
> 
> This is why colour grading applications are able to ingest raw footage,
> but they do not manipulate the "raw" data directly. They work on the
> cached output of the developed data.
> 
>>>> What software is currently using OpenColorIO with F65 support?
>>>> Did you try to read in the data we posted and process it?
>>> 
>>> Nuke and the rest of the Foundry apps. Beta support for After Effects and
>>> Blender :)
>> 
>> Beta support on Blender?
> 
> Xat has a branch of Blender with OCIO integrated. I linked to it in an
> earlier message.
> 
>>> Raw is raw :)
>> 
>> Ouch. Read my other post on that one.
> 
> You misunderstand me. That was in reference to the noise comment by Ton.
> 
>>> That all the secret sauce that camera manufacturers used to
>>> put in hardware to hide as much of that as possible is gone. You're seeing
>>> what the sensor sees, and sometimes that means lots of noise. It would be
>>> worth looking into what processing you might have to do prior to
>>> exporting the OpenEXR files.
>> 
>> What you see is sensor data (which differs between cameras) piped
>> through a compression scheme (which isn't lossless, at least for RED
>> cameras), cooked up by debayering, which adds some (or a lot of) softness
>> if it doesn't do its job correctly or has been configured the wrong way.
> 
> Could you elaborate on what you mean here? Are you alluding to a
> connection between perceptual sharpness and noise?
> 
>>> On top of that, many camera manufacturers apply some form of compression to
>>> the raw frame to further reduce the data rate. Red, for example, uses a
>>> form of wavelet compression similar to JPEG 2000. I am not certain whether
>>> Sony employs its own compression scheme.
>> 
>> They do :)
>> 
>>> As to why I think raw isn't suitable for heavy VFX: the advantages derived
>>> from the above features, namely flexibility and parameters that aren't
>>> baked in, can be a hindrance.
>> 
>> Why?
> 
> See the above comment about disk space and computing time.
> 
> [...]
> 
> Cheers,
> 
> Andrew
> _______________________________________________
> Bf-vfx mailing list
> Bf-vfx at blender.org
> http://lists.blender.org/mailman/listinfo/bf-vfx


