[Greg Sage] “I know I need to up the bitrate, but not clear whether I should be using 16 or 32. Also, I already have very long render times with 8 bit, so does this get much worse as bitrate increases? “
This is not bit rate. This is bit depth, and it refers to the precision with which calculations are made.
With 8bpc, the range from black to white (per channel) is 0 to 255, with whole-number steps. This is “millions of colors,” 16,777,216 of them to be precise.
With 16bpc, the range from black to white (per channel) is 0 to 32768, again with whole-number steps. This is “trillions of colors,” 35,184,372,088,832 of them.
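The arithmetic behind those counts is simple enough to sketch (this is just illustrating the per-channel level counts, not After Effects internals):

```python
# Per-channel level counts for AE's integer bit depths.
# 8bpc uses 0..255 (256 levels); AE's 16bpc mode uses a 0..32768 scale.
levels_8 = 256
levels_16 = 32768

# Three channels (R, G, B) multiply together:
print(levels_8 ** 3)   # 16,777,216 -> "millions of colors"
print(levels_16 ** 3)  # 35,184,372,088,832 -> "trillions of colors"
```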
Note that black is still black, white is still white, red is still red, etc., and your final delivery encoding and display will probably be 8bpc anyway. Working in 16bpc just gets you shades in between the extremes for calculations. In practical terms, that means that doing something like adjusting the gamma up in a levels effect at the top of an effects stack and then pushing it back down later on will give you less crunching in the midtones, or stair-stepping in the histogram.
With 32bpc, we stop doing integer math and start doing floating point math. Black is 0 and white is 1, with the numbers between, below and above represented as real numbers with decimal points as necessary.
Although you can’t display a white brighter than white (1), you can do intermediate calculations with it. This means that superbright highlights can be preserved across effects, even if they are brighter than can be displayed.
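Here's a toy illustration of that headroom effect (plain Python, not AE's actual pipeline): brighten a pixel 4x, then darken it 4x. Integer 8bpc math clamps at white and loses the highlight; float math round-trips cleanly.

```python
# Why 32bpc float preserves "superbright" values across effects.

def brighten_8bit(v, gain):
    # 8bpc math clamps at white (255); anything brighter is discarded.
    return min(255, round(v * gain))

def darken_8bit(v, gain):
    return min(255, round(v / gain))

def float_roundtrip(v, gain):
    # Float math happily carries values above 1.0 between operations.
    return (v * gain) / gain

pixel = 200  # a bright pixel in 8bpc
print(darken_8bit(brighten_8bit(pixel, 4), 4))  # 64 -- highlight crushed
print(float_roundtrip(200 / 255, 4) * 255)      # 200.0 -- detail preserved
```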
See here for more:
https://helpx.adobe.com/after-effects/using/color-basics.html#color_depth_and_high_dynamic_range_color
Moving a project from 8bpc to 16bpc doubles your RAM requirements and improves precision, but doesn’t radically alter the way calculations work; moving it from 8bpc to 32bpc quadruples RAM requirements and can drastically affect the way effects are calculated. CPU requirements do not scale the way RAM does: you’re still doing the same number of calculations per second, you’re just doing them with larger containers for the data.
[Greg Sage] “Next, as for working space, very confused on this, but I believe I should be working in sRBG and linearizing the workspace, correct?”
sRGB and Rec. 709 are very, very close. Rec. 709 is the normal profile for HD video work.
Linearizing the workspace does not affect how footage looks coming in or going out. It does affect the math used in compositing operations, which affects the look of your composites and effects.
Linear light compositing usually gives more natural-looking composites.
https://helpx.adobe.com/after-effects/using/color-management.html#linearize_working_space_and_enable_linear_blending
https://www.artbeats.com/assets/written_tutorials/pdfs/linear_light.pdf
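To see what linear-light math changes, consider a 50/50 cross-dissolve between two full-intensity channels. A sketch using a simplified gamma of 2.2 (real sRGB uses a piecewise curve, so treat the numbers as approximate):

```python
# Gamma-space blend vs. linear-light blend of two full-intensity channels.
GAMMA = 2.2

def to_linear(c):
    return c ** GAMMA

def to_gamma(c):
    return c ** (1 / GAMMA)

# Gamma-space blend: average the encoded values directly -> 0.5,
# which displays darker than the physical midpoint of the two lights.
gamma_blend = 1.0 * 0.5 + 0.0 * 0.5

# Linear blend: convert to light, average, re-encode -> ~0.73,
# closer to how overlapping light actually mixes.
linear_blend = to_gamma(to_linear(1.0) * 0.5 + to_linear(0.0) * 0.5)

print(gamma_blend)   # 0.5
print(round(linear_blend, 3))
```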
[Greg Sage] “I don’t see anywhere to set color management profile on output, but it’s my understanding that YT ignores this anyway, correct?”
This is done in the output module from the render queue. If you don’t specify an output color management profile, then the working space will be used.
As for color management and YouTube, it’s not just up to YouTube how this works. Color management is required at every stage of the pipeline to ensure consistent results. If your viewer’s browser is not color-managed, the results may not be consistent.
Walter Soyka
Designer & Mad Scientist at Keen Live [link]
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
@keenlive | RenderBreak [blog] | Profile [LinkedIn]