So I've been put in charge of the color grading area of this forum... I only accepted because VK and I share a vision about the discourse of this particular portal. So let me "lay down the law" so to speak.
I believe in free speech, the open exchange of ideas and information. I believe in a free market, and competition. And I value those who are unafraid to challenge established methods and platforms; ColorGHear would not exist without these windows of freedom. I came out of the dark with a lot of crazy ideas and techniques that many scoffed at, but were again and again proven effective. As have others with their ideas.
That being said, the downside of free speech is that everyone believes their opinions and ideas are just as valid as everyone else's and should carry equal weight. But there are such things as bad ideas and bad opinions. Add in trolls and those whose aim is not to add to the discussion but simply to fan the flames, and the result is a frustrating forum and a lot of bad information you have to waste time sifting through.
I hope to change that. I won't act against anyone just because they disagree with me, only if I feel that the posted information is counterproductive and will erode the integrity of the discussion. In the past I have myself engaged in inflammatory posts in an effort to get people to really question things. I can't say I won't ever do that again. If you allow yourselves to get locked into one-dimensional thinking, then I may become inspired to pry your minds open again. :) But for the most part, I'm past that.
Let's try to make this category a valuable and reliable resource people can count on for solid info and techniques along with the crazy ideas that push the envelope. Just be able to back up your crazy idea(s) with some evidence that it is actually effective... or you'll likely meet with some pushback.
So here are a few things I think are important to focus on to keep the art of color grading moving forward:
Reliable and consistent ACES workflow ideas.
Reliable and consistent V-Log and S-Log (and any other log) workflows.
Complete abandonment of outdated linear video capture in favor of log... and, accordingly, complete abandonment of platforms and capture systems that don't support LUT viewing.
Complete abandonment of 8 bpc and 16 bpc workspaces for 32 bpc workspaces.
Abandonment of H.264 codecs for H.265 and better.
Abandonment of Rec. 709 for Rec. 2020.
Abandonment of 8-bit 4:2:2 capture and delivery in favor of 10-bit 4:2:2 (and better) capture and 4:4:4 delivery.
I'd like to see the focus of this category pushed heavily into these areas, as they are the future of what we do with regards to color.
My reasoning for such high quality workflows is that you want to start your delivery with the highest quality file you can possibly produce and then downgrade only as necessary.
As a materialist would say: we attempt to raise your opinion to knowledge, not make topics that lower knowledge to opinion.
Great thread!
Props to @Vitaliy_Kiselev and @shian for making this useful topic gain more interest and the weight it deserves.
Color grading is very complex, but it can be approached in various ways for an easy workflow.
Complete abandonment of all color platforms that operate below a 32bit workspace
Could you elaborate on this, please? Do you mean the color depth?
Yes. 8bit and 16 bit workspaces (a la the default in After Effects) will not allow the full spectrum of color to come through.
FCP 7 is 8-bit, but you can force it to 10-bit, I believe. Vegas and Premiere are now 32-bit. FCP X is 32-bit. Resolve has been 32-bit for a while.
I don't have any data on others... so data from Avid users and others is welcome.
And speaking of Avid, does anybody know if they've implemented LUTs into their platform, or plan to?
8bit and 16 bit workspaces (a la the default in After Effects) will not allow the full spectrum of color to come through.
Can you describe that you mean here?
The default color space in AE is 8-bit (I'd attach a pic, but that option seems to be gone now). Only in 32-bit floating point can you access all the colors and gradients of gray needed to reproduce color accurately... otherwise you get banding, etc., where the color space has run out of colors.
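A quick sketch of why this happens (my own illustration, not from the thread): quantize a subtle gradient to 8-bit code values and count how many distinct levels survive compared with a floating-point workspace.

```python
# Toy demonstration of banding: a subtle ramp across a 1920px-wide
# frame collapses to a handful of distinct 8-bit code values, while
# a floating-point workspace keeps every sample distinct.
import numpy as np

ramp = np.linspace(0.40, 0.45, 1920)                 # subtle luminance gradient

levels_8bit = np.unique(np.round(ramp * 255)).size   # distinct 8-bit codes
levels_float = np.unique(ramp.astype(np.float32)).size

print(levels_8bit)    # 14 distinct codes -> visible bands
print(levels_float)   # 1920 distinct values -> smooth
```

Those 14 codes get stretched across the whole width of the frame, which is exactly the stepped look you see in skies and defocused backgrounds.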
Here's the unresolved issue with LUT-based color grading: there's no industry-wide, platform-independent intermediate color grading workspace. While ACES could serve that role, it's far from universal. In practice, each commercial grading platform implements its own internal color workspace, requiring import LUTs tailored specifically for that application. To compound the inherent pitfalls, most users are unaware that a mathematically sound workflow requires the use of three different types of LUTs:
Camera-specific import LUTs to convert the capture profile (e.g. V-Log L) to the intermediate color workspace.
Platform-independent LUTs to apply esthetic grading effects within the intermediate color workspace.
Display-referred export LUTs to convert the intermediate workspace to the delivery format (e.g. Rec. 709).
Most commercially available LUTs import camera-specific footage into a proprietary workspace that the grading app renders to a Rec. 709 monitor using its own built-in display LUT. This bypasses the platform-independent stage of grading, making it impossible to apply esthetic LUTs in a consistent, mathematically sound manner. Using anything more than a single import LUT in these circumstances produces results that are no less subjective than working directly with H.264 footage in a Rec. 709 workspace (e.g. Premiere Pro).
I don't see this situation changing much in the near future, for two reasons. First, the learning curve involved in mastering a professional color grading application locks you into that platform as the effective reference standard of your workflow. On top of that, the mystique around LOG and LUT-based post production techniques promotes fierce brand loyalty that undermines industry-wide standardization efforts. Imagine what video content delivery would be like if each H.264 codec were permitted to "interpret" the standard according to its developer's esthetic judgment. We would still be debating x264 versus MainConcept to this day.
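To make the three-stage structure concrete, here's a minimal sketch in Python. The curves below are toy placeholders (real import and export LUTs are camera- and display-specific 1D/3D tables), but the chain itself (import, then aesthetic grade, then export) is the workflow being described.

```python
# Hypothetical sketch of the three-LUT chain. Each function stands in
# for a real LUT; the transforms are placeholders, not actual camera
# or display math.

def import_lut(x):
    # Camera-specific: capture profile (e.g. V-Log L) -> intermediate workspace
    return x ** 2.0                      # placeholder decode curve

def aesthetic_lut(x):
    # Platform-independent grade applied inside the intermediate workspace
    return min(1.0, x * 1.1)             # placeholder: slight exposure lift

def export_lut(x):
    # Display-referred: intermediate workspace -> delivery (e.g. Rec. 709)
    return x ** (1 / 2.2)                # placeholder encode gamma

def grade(pixel):
    # The mathematically sound order: import, then grade, then export
    return export_lut(aesthetic_lut(import_lut(pixel)))

print(grade(0.5))
```

The point of the middle stage is that the aesthetic LUT never needs to know which camera the footage came from or which display it will go to; that is what makes it portable between platforms.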
The default color space in AE is 8-bit (I'd attach a pic, but that option seems to be gone now).
Color space has little to do with 8-bit or 32-bit. For example, Rec 709 is defined by three primary colors (in CIE space or any analog) and a few other things. You could even use 4 bits :-), and it would still be legit.
Maybe you mean something else by "color space".
Only in 32-bit floating point can you access all the colors and gradients of gray needed to reproduce color accurately...
According to science, 8 bits are pretty much enough for the eye. You can also add dithering to the output if you like.
As far as I remember, actual sensors do not produce anything above 16-bit integers; even the Alexa uses a dual-gain 14-bit ADC combined into a 16-bit result (and the lowest 1-2 bits, even on a good sensor, are mostly noise). This happens simply due to physics.
32-bit floating point actually is useful for grading, stacking multiple filters, and adding CGI elements. Due to the math.
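A small sketch of that math point (my own example, not from this thread): apply a shadow-crushing curve and then its exact inverse. In float the round trip is essentially lossless; quantizing to 8 bits between the two nodes throws shadow detail away permanently.

```python
# Stacked operations: crush shadows, then undo it. The float pipeline
# recovers the original values; an 8-bit handoff between the two
# nodes merges distinct shadow levels and they never come back.
import numpy as np

shadows = np.linspace(0.2, 0.3, 11)        # 11 distinct dark pixel values

crushed = shadows ** 3                     # node 1: crush the shadows
float_back = crushed ** (1 / 3)            # node 2 (float): exact inverse

quantized = np.round(crushed * 255) / 255  # 8-bit handoff between nodes
int_back = quantized ** (1 / 3)

print(np.abs(float_back - shadows).max())        # ~1e-16: lossless
print(np.unique(np.round(int_back * 255)).size)  # only 6 levels left of 11
```

One strong correction is enough to do this; a real grade with many stacked nodes compounds the loss at every step, which is exactly what a 32-bit float workspace avoids.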
otherwise you get banding, etc., where the color space has run out of colors.
A color space never runs out of colors :-)
While ACES could serve that role, it's far from universal
I am in no way a master of this, but my understanding from reading the ACES documents is that ACES was made as a storage solution: somewhere old scans and new films, including CGI, can be stored and used later for updated releases or in other projects.
A color space made for everyday work should not use an RGB-based approach at its core.
But maybe I've got it wrong.
It goes back to the number of color variants available in a given space... old 256 color video cards and monitors could not display pictures accurately because they only had 256 colors to choose from. The next step up was 32k (I believe...hard to remember, but it took a while before we even got to millions of colors) and then we got 16 bit and then 32, and over 32 the human eye can no longer distinguish a difference. What I'm talking about lies somewhere in this vein. I do not completely understand the science.
When I would do elaborate gradients et al in AE in 8 bit, you could see banding, same with 16, but at 32... the banding was gone. THEN if I exported from 32 to anything other than 444 color, the banding would reappear... I don't know the science, I just know what happens in those spaces... it had nothing to do with a codec or compression, and everything to do with the number of colors available for display within each medium.
And ACES is not just a storage medium... it's something that will eventually change the way we do things with color... The goal of ACES is to somehow "deconstruct" the color spaces of each footage source and allow the user to "re-assemble" the footage in a unified way so that all different cameras match as if they were captured by the same camera.
I realize my description is probably not technically accurate, but I've put it into layman's terms as I understand it.
It goes back to the number of color variants available in a given space... old 256 color video cards and monitors could not display pictures accurately because they only had 256 colors to choose from. The next step up was 32k (I believe...hard to remember, but it took a while before we even got to millions of colors) and then we got 16 bit and then 32, and over 32 the human eye can no longer distinguish a difference. What I'm talking about lies somewhere in this vein. I do not completely understand the science.
Best not to call it a color space in this case. Most monitors today actually have 8-bit discretization for each of the primary colors in the RGB triad. 3 x 8 bits gives us 24 bits; "32 bits" is either that plus an alpha channel, or just one dummy byte to make things easier for the processor.
The 32 bits you mention refer to 32-bit single-precision floating point used for each of the R, G, B channels.
When I would do elaborate gradients et al in AE in 8 bit, you could see banding, same with 16, but at 32... the banding was gone.
It is slightly confusing.
Scientists keep it simpler: they just test gradients where they are sure of exactly what they are showing to people.
If you display a perfect 8-bit gradient from a file, you have no idea what you are actually seeing, since your software can do something to it, your GPU can do something (especially if you have an ICC profile or such), and the monitor also lives its own life.
THEN if I exported from 32 to anything other than 444 color, the banding would reappear... I don't know the science, I just know what happens in those spaces...
The human mind is a very complex thing. For example, you can see very fine changes across big even areas like the sky.
it had nothing to do with a codec or compression, and everything to do with the number of colors available for display within each medium
In principle it should. You already have some color processing going on if you output 4:2:0 or 4:2:2, and when you view such material the chroma interpolation can add its own issues. Going to coarser color discretization can kill gradients, but most of the time in real life it's fine, since it's usually the intensity that changes, not the color. Nature made our eyes the way they are for a reason.
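A toy sketch of that chroma processing (my own example): 4:2:2 keeps only every other chroma sample horizontally, so fine alternating color detail cannot survive the round trip, even though luma is untouched.

```python
# One row of chroma (Cb) with fine alternating color detail.
import numpy as np

cb = np.array([0.1, 0.9, 0.1, 0.9, 0.1, 0.9])

cb_422 = cb[::2]                 # 4:2:2: keep every other chroma sample
cb_back = np.repeat(cb_422, 2)   # naive nearest-neighbor reconstruction

print(cb_back)                   # the alternating detail is gone
```

Real decoders interpolate instead of repeating samples, but the discarded chroma samples cannot be recovered either way; only detail that varies more slowly than the subsampling grid survives.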
The goal of ACES is to somehow "deconstruct" the color spaces of each footage source and allow the user to "re-assemble" the footage in a unified way so that all different cameras match as if they were captured by the same camera.
I'll make it even worse for you: the color in your head does not exist in a form unrelated to its surroundings (or even to your mood and many other things).
You can read about some things at http://rit-mcsl.org/fairchild//PDFs/AppearanceLec.pdf
What do you think about this video from Wolfcrow?
Great vid
Videos like the one @inqb8tr posted 2 posts back and this other below one that he made are the types of things I'd like to see more of in here. Don't just talk about what it is you've discovered, or theorize... prove it, and make a video demonstrating it. Or if there's a process you're exploring but you're not an expert on it yet, find someone who is and post a video of theirs.
I think that there is some kind of confusion about the meaning of bits-per-colorspace. Normally we talk about bits-per-color, with some adjustment, as necessary, for alpha (transparency) channel.
In the 80's we had simple monitors capable of "16 bit color", without any reference to bits-per-color, meaning they were able to display 2^16 = 65,536 different colors.
Then we had a more advanced type, with "24 bit color", meaning 2^24 = 16,777,216 different color combinations. This was actually 8 bits per color (namely, 8 bits = 2^8 = 256 different shades for each of R, G, and B, which gives us, for all 3 colors combined, 256^3 = 16,777,216 total combinations).
Then we had "32 bit color monitors" which had the same 8 bit per color (24 bit base color) + 8 bit alpha channel for localized intensity variations. So the 32 bit color monitors were actually 24 bit color + 8 bit alpha channel.
In the last few years we have had monitors, available in mainstream pro applications, which can display true 30 bit color. To differentiate them from the old and inferior "32 bit color" label, they are now normally called 10-bit-per-color (namely R, G, B) monitors. These can show up to 2^30 different shades of color (more than a billion shades).
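The counts above can be spelled out directly:

```python
# Bits per channel vs. total displayable colors, as described above.
channels = 3                        # R, G, B
for bits in (8, 10):
    shades = 2 ** bits              # levels per channel
    total = shades ** channels      # all RGB combinations
    print(bits, shades, total)
# 8 bits/channel  -> 256 shades,  16,777,216 colors ("24 bit color")
# 10 bits/channel -> 1024 shades, 1,073,741,824 colors ("30 bit color")
```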
An example of such a monitor is the NEC PA272W-BK 27", which also has a LUT for displaying different 30 bit color combinations in different spaces. A cheaper monitor, the LG 27UD88-W 27", is advertised as pseudo 10 bit per color, but it is actually 8-bit + A-FRC, which gives you the appearance of 10 bits per color by using true 8-bit-per-color technology and fast temporal dithering between adjacent colors to increase the apparent color space.
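A sketch of that A-FRC trick (my own toy model, with a made-up 60-frame window): an 8-bit panel alternates two adjacent codes over time so the averaged intensity lands very close to an in-between 10-bit level.

```python
# Temporal dithering (FRC): approximate a 10-bit level on an 8-bit
# panel by mixing the two nearest 8-bit codes across frames.
target = 409 / 1023                  # a 10-bit level the panel can't show

lo = int(target * 255)               # nearest 8-bit code below
hi = lo + 1                          # nearest 8-bit code above
frac = target * 255 - lo             # fraction of frames that must show `hi`

n_frames = 60                        # hypothetical 1-second window at 60 Hz
n_hi = round(frac * n_frames)
avg = (n_hi * hi + (n_frames - n_hi) * lo) / n_frames / 255

print(abs(avg - target))             # error far below one 8-bit step (1/255)
```

The eye averages the flicker, so the perceived level sits between the two real codes; that is the whole trick behind "8-bit + FRC" panels.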
Nowadays pro software products can easily support 10-bit-per-color, and some may even go as high as 12-bit-per-color, although I doubt that you can find a monitor for the latter.
Therefore, we need to be more precise - the old FCP was possibly 8-bit-per-color, not "8 bit color" - nobody would have used it had it shown only 2^8 = 256 color variations....
For a good treatise on colors and ProRes, see (specifically P. 9) https://www.apple.com/final-cut-pro/docs/Apple_ProRes_White_Paper.pdf
People use non standard terminology, which is where a lot of confusion stems from.
You have used "bits per color" when you mean bits per color (RGB) channel.
Bits-per-colorspace is meaningless as colorspaces are independent from the bit depth used to encode the image. Colorspaces pre-date digital technology.
It is also important to separate monitor color bit depth discussions from codec or format bit depth discussions due to monitors not using alpha channels and the differences between RGB display devices and YUV and RGB (plus other formats such as LAB) encoded signals.
The ideal situation is to discuss bits per channel, as it is a clearer way to describe the encoding of information. An example is as you describe above: 32 bit is RGBA or YUVA, which is visually identical to 24 bit RGB or YUV, but with an alpha channel. They are BOTH 8 bits per color channel.
The old FCP was not limited to 8 bit or 10 bit in its internal processing; it worked somewhat like Resolve, with an independent floating-point processing engine. The limitation was only the chosen sequence format and render settings.
@Kob thanks for the white paper it was very illuminating. I edited the pilot post to reflect bits per channel and other clarifications.
@shian: "And speaking of Avid, does anybody know if they've implemented LUTs into their platform, or plan to?"
Yes, Avid Media Composer 8 has a LUT option in the clip's source settings once imported to the bin.
I've hardly ever seen any need for 32 bits (almost infinite precision) outside of high-end compositing that requires nearly unlimited values (such as Z-depth, anti-aliasing passes, and alpha information in sub-surface scattering materials). The average human eye is close to 9 bits.
@caveport Noted. Thanks.
So this video kind of put me on the map... and it was actually around this time that I switched to 32bpc and 444 for everything.
The main reason was that the blur you see on the titles at the beginning was done in-camera and was super smooth. I added a tiny bit more blur to it so the girl walking into focus doesn't get there before we cut away, so her walking in doesn't distract from the titles. In AE I saw banding; switched to 16bpc, still banding; switched to 32, banding gone. Went to export to 422 HQ ProRes, banding back... exported 444, banding gone. And it was in comparing the two files that I began to really see the color degradation between the 444 and the 422, and so I switched to 32bpc and 444 for everything. Now, to retain the lack of banding in the mp4, I had to bump the bit rate up to 14k... but of course when Vimeo compressed it, the banding came back, and the flares banded badly, so maybe it's not totally worth it for the web. But I had uploaded a 5k 720p mp4 on the first go-round and it looked terrible after conversion, and this looks okay.
Years ago I did a color test in FCP7 and AE (16bpc) using a lot of the same techniques I still use today, a lot of masks and gradients, trying to see if I could duplicate in FCP what I could do in AE. Even after changing the processing to 10-bit in the timeline settings, it would still band, and the color composites with the layers wouldn't affect the underlying image correctly. And maybe therein lies the issue: if you're just adjusting colors, maybe you don't need all that power... but I like to paint. And IMO that is what distinguishes pro-level colorists from the guys just applying looks and LUTs and grabbing color wheels and curves to shift colors and luminance.