Creative Communities of the World Forums

The peer-to-peer support community for media production professionals.


  • AE6.5 to 10bituncompressed codec/FCP5

    Posted by Felix Mack on July 21, 2005 at 7:29 pm

    Hi –

    I’d really appreciate it if you could try this quick test:

    1. In AE, make a 720×486 comp, create a solid, and apply the ramp filter. Make the ramp go from black on the left to white on the right.
    2. Render this comp with the Uncompressed 10-bit codec (the Apple FCP one, not Blackmagic or another third-party codec).
    3. Import the clip you just made, drop it into the same comp, and render the clip again with the same settings.
    4. Import both QuickTimes into FCP, make an Uncompressed 10-bit sequence, drop the clips on the timeline, and look at the scope.

    And here are the questions:
    1. When you look at the original clip, does the waveform monitor show a perfectly straight line, or is it curved?
    2. When you look at the rerendered clip, does the waveform monitor look the same as the previous clip?

    That’s all. I am noticing that rendering to the 10-bit codec seems to make everything brighter via a gamma boost.
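    To picture what this looks like on the scope, here is a minimal Python sketch of a gamma boost applied to a black-to-white ramp. This assumes the shift behaves like a simple power-function (gamma) adjustment; the 1.8/2.2 exponent pair is an illustrative guess, not a measured value from the codec.

```python
# Sketch: what a gamma boost does to a black-to-white ramp.
# A linear ramp reads as a straight line on a waveform monitor;
# after a gamma boost (exponent < 1) the midtones rise while
# black and white stay pinned, so the line becomes a curve.

def gamma_boost(value, gamma=1.8 / 2.2):  # illustrative exponent only
    """Apply a power-function gamma adjustment to a 0.0-1.0 level."""
    return value ** gamma

# Ten samples across the ramp, left (black) to right (white).
ramp = [i / 9 for i in range(10)]
boosted = [gamma_boost(v) for v in ramp]

for before, after in zip(ramp, boosted):
    print(f"{before:.3f} -> {after:.3f}")

# Endpoints are unchanged (0.0 -> 0.0, 1.0 -> 1.0), but the
# midtone 0.5 rises to about 0.57: everything looks brighter.
```

    If the rerendered clip's waveform shows this kind of curve where the original showed a straight line, a gamma-style shift is the likely culprit.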

    Thanks.

  • 6 Replies
  • Rich Rubasch

    July 22, 2005 at 2:05 am

    Holy crap! I was just seeing this with the Aurora Pipe 10-bit codec. I rendered an animation that had some 10-bit footage in it. After the render I had to fix a hair that was in the shot, so I brought in the Aurora 10-bit rendered file and fixed the hair, and when I compared the fixed movie (second render) with the first render, it was brighter. I applied a Brightness/Contrast filter to the second render with about a -10 Brightness and a +4 Contrast setting. That got it close, but it was more than just a brightness shift.

    I’m really concerned about this, especially if it happens with Apple’s own codec.

    I am still on Panther 10.3.9, AE 6.5, QT 6.5.2, G4 dual 1 GHz.

    I have never seen this with the Animation codec or any other 8-bit uncompressed codecs like the Aurora or AVID ones.

    What do you think, Graeme?

    Rich Rubasch
    Tilt Media

  • Matthias Wimmer

    July 22, 2005 at 1:44 pm

    Hi,
    I had similar problems. I rendered AE projects on different machines. All of them had the
    Blackmagic codec installed; some of them also had the AJA codec. My observation was that on machines
    with just one of the codecs installed there was no problem handling the QuickTimes made with the other codec.
    They just happened to come out brighter or darker (I can’t remember exactly).
    I expected a warning message like

  • Chris Tomberlin

    July 22, 2005 at 5:32 pm

    I think this is a problem on the second render. The problem actually occurs when you bring a file into AE that uses a codec in the v210 format, rather than when you render that file. In other words, if you change the AE preference file so that the [“QuickTime 64-bit Input Codecs”] section has “v210” = “0” and the [“QuickTime 64-bit Output Codecs”] section has “v210” = “1”, you can render a 10-bit QuickTime correctly. What I believe is going on, though, is that setting the input section to “v210” = “0” tells AE to treat 10-bit QuickTimes as 8-bit files, so either way you can’t keep a file 10-bit through the whole process.

    -Chris Tomberlin
    OutPost Pictures

  • Rich Rubasch

    July 23, 2005 at 3:03 am

    That is exactly what I was suspecting: that it was the second pass through AE that was causing the luma shift. So, not the best solution, but where do you change the preference setting you were talking about?

    Rich Rubasch
    Tilt Media

  • Chris Tomberlin

    July 23, 2005 at 3:41 pm

    Locate the AE preferences file (Users/Library/Preferences/Adobe After Effects 6.5 Prefs) and open it with a text editor. Scroll down toward the end of the document until you find a section that looks like this:

    [“QuickTime 64-bit Input Codecs”]
    “DV10” = “1”
    “Mczm” = “1”
    “NO16” = “1”
    “SVQ1” = “0”
    “Shr7” = “1”
    “v210” = “1”

    [“QuickTime 64-bit Output Codecs”]
    “DV10” = “1”
    “Mczm” = “1”
    “NO16” = “1”
    “Shr7” = “1”
    “rle ” = “0”
    “v210” = “1”

    You will probably have more lines than this, but it will be the right section. In the Input Codecs section, replace “v210” = “1” with “v210” = “0” and you should have a workaround. Be advised, though, that unless I’m mistaken, this essentially turns off 10-bit file recognition on import and makes the file process as 8-bit.
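    If you have to make this change on several machines, the edit can be scripted. A minimal Python sketch, assuming the prefs file is the plain-text key/value format quoted above; the path shown is the one from this post and may differ on your system. Back up the prefs file before touching it.

```python
# Sketch: set "v210" to "0" in the "QuickTime 64-bit Input Codecs"
# section of an After Effects prefs file laid out like the excerpt above,
# leaving the Output Codecs section untouched.

def disable_v210_input(text):
    """Return prefs text with v210 disabled in the Input Codecs section only."""
    out = []
    in_input_section = False
    for line in text.splitlines():
        # Section headers look like ["QuickTime 64-bit Input Codecs"]
        if line.strip().startswith("["):
            in_input_section = '"QuickTime 64-bit Input Codecs"' in line
        if in_input_section and line.strip().startswith('"v210"'):
            line = line.replace('"1"', '"0"')
        out.append(line)
    return "\n".join(out)

# Usage (path as given in this post; adjust for your machine):
# path = "Users/Library/Preferences/Adobe After Effects 6.5 Prefs"
# with open(path) as f:
#     prefs = f.read()
# with open(path, "w") as f:
#     f.write(disable_v210_input(prefs))
```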

    Good luck

    Chris Tomberlin
    Editor/Compositor/Owner
    OutPost Pictures

  • Paul Mcglaughlin

    August 3, 2005 at 6:27 pm

    Hi all –

    I’m also seeing this problem. Here’s our workflow:

    1. Render to Animation, 960×540, square pixels, 23.976 fps. Pass the textless backplate to PS.
    2. Reimport the 960 render, and render to NTSC & PAL 10-bit (used to be BM 10-bit, but now Uncompressed 10-bit).
    3. Set up a timeline in FCP, using the rendered video and PS static pages. (Here’s where we see the level shift, especially when going from a motion transition to a static landing page.)
    4. Output all to Digibeta.

    I should mention that ONLY the layoff FCP computer is running Tiger, QT7 and FCP 5. All others run Panther, QT 6.5.2, AE 6.5 and FCP 4.5.

    I’m going to experiment to see if I can find an acceptable intermediate codec for rendering that doesn’t exhibit the gamma shift.
