Creative Communities of the World Forums

The peer-to-peer support community for media production professionals.

  • Walter Soyka

    December 6, 2011 at 12:18 am

    [Shane Ross] “But I do see the total need to have what you see in the NLE match your QT exports. Because QT exports aren’t broadcast quality…they are web and computer based, so completely different. I get that. I just don’t get using the image you see on the computer display (and the FCX interface) as a guide to what your image will look on TV. Because computer displays and TVs have different color spaces.”

    RGB/YUV is not reversible with 8-bit processing because you can’t encode values below black (0), above white (255), or fractional values (like 123.4567), leading to clipping and quantization.

    However, the RGB/YUV transformation is mathematically reversible when using floating point RGB calculation as FCPX does.
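A quick numeric sketch of the point above (a simplified illustration using full-range Rec. 709 coefficients; real broadcast Y'CbCr would also apply head/footroom scaling): the floating-point round trip is exact to machine precision, while quantizing the intermediate Y'CbCr to 8-bit integers loses information.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    # Rec. 709 luma coefficients, full range, for simplicity
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycc):
    # Exact algebraic inverse of the transform above
    y, cb, cr = ycc[..., 0], ycc[..., 1], ycc[..., 2]
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return np.stack([r, g, b], axis=-1)

rng = np.random.default_rng(0)
rgb = rng.random((1000, 3))  # float RGB values in [0, 1]

# Float round trip: error at the level of machine precision
float_rt = ycbcr_to_rgb(rgb_to_ycbcr(rgb))
print(np.abs(float_rt - rgb).max())  # ~1e-16

# 8-bit round trip: quantizing Y'CbCr to integer steps loses information
ycc8 = np.round(rgb_to_ycbcr(rgb) * 255) / 255
int_rt = ycbcr_to_rgb(ycc8)
print(np.abs(int_rt - rgb).max())  # a few thousandths, visible as banding
```

The same quantization step that looks harmless on one pass accumulates across repeated RGB/YUV conversions, which is why a float pipeline like FCPX’s matters.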

With a profiled monitor, ColorSync will translate from the intended theoretical color space (Rec. 709) to the actual physical display space. I have read that the iMac displays are very close to Rec. 709, but I don’t have any first-hand information there (nor on the precision with which ColorSync itself operates).

    Walter Soyka
    Principal & Designer at Keen Live
    Motion Graphics, Widescreen Events, Presentation Design, and Consulting
    RenderBreak Blog – What I’m thinking when my workstation’s thinking
    Creative Cow Forum Host: Live & Stage Events

  • Jim Giberti

    December 6, 2011 at 12:25 am

[Shane Ross] “…monitors. How do you know? Because FCX doesn’t allow for external broadcast monitoring. So how do you know what you see in FCX is what you see on your broadcast monitor? If it does match…that’s cool! Neat trick. I guess I need to know how you are comparing the shots in order for my head to wrap around it.”

    I seem to not be explaining myself to you clearly Shane.

I own the creative agency and the production studios. I have a media dept and account executives (AEs) who work with the different network and cable/satellite providers that control broadcast schedules. So I know when new work is airing on what shows, and we monitor our work on air as part of our job.
We have relationships with the engineers who handle our material to ensure it looks good to them on their end. I insist on that for every spot that goes to air.

I have carefully calibrated monitors and years of experience with how they match my average broadcast output (i.e., the phosphors on a perfectly calibrated JVC 17″ CRT always shift slightly green compared to the connected, correctly calibrated 30″ Apple screen).
I have a lot of experience with a lot of gear in a lot of studios. That’s how I know what my work looks like, and how I can tell the difference between the train wreck of FCP legacy and QuickTime versus FCP X and ColorSync.

That’s pretty important when all of our work, whether it’s NBC or local cable, is delivered as H.264s.

  • Shane Ross

    December 6, 2011 at 12:30 am

    Gotcha.

    I’ll shush now.

    Shane
    Little Frog Post
    Read my blog, Little Frog in High Def

  • Jim Giberti

    December 6, 2011 at 12:52 am

[Walter Soyka] “Of course, it wouldn’t help with judging field issues.”

    This really is the 800 lb gorilla Walter.

I’m increasingly frustrated (not quite the language I used yesterday) at delivering SD versions of all our HD content.

    I know we’re lucky in relative terms that we’re at a point where virtually all our work is captured HD, and we can maintain that throughout. Even a corporate piece we’re just finishing, that would have been bumped down to DVD, is now going out on thumb drives. I’m loving living in an all QT environment now that I can trust it.

I know I’m spoiled. Some of the stations we deal with (typical of local network affiliates), even though they broadcast in HD, still haven’t upgraded their local output, so we send that work 480 letterboxed, and field issues still rear their ugly heads.

It’s a whole new world, evolving one station/group at a time. Some trusted stations will take our straight 1080p work and handle the 480 transfer for the SD affiliates for us.

    I can’t wait till things get standardized.

    Yeah.
    Sure.

  • Jim Giberti

    December 6, 2011 at 12:53 am

That’s strange, Oliver.

    The change is ground breaking on this end.

  • Daniel Frome

    December 6, 2011 at 1:17 am

    Hi Jim, thanks for bringing attention to this. I have plenty of memories of exporting h264 quicktime files out to the network for rough cut approval, only to have them blast the “dull colors” (we were working in animation, so color shifts were 100x more noticeable than on live action).

If I might ask a further question: if someone watches your h264 outputs on a computer without FCPX, is the color still properly calibrated for them? I ask because I found that many quicktime movies actually did contain the proper color information; it was QuickTime that seemingly didn’t interpret it properly.

For example, we could watch a 1080p quicktime movie in VLC player and get a more accurate view than if we viewed it in QuickTime 7. This indicated to me that the problem wasn’t necessarily the exported file, but the “player.”

My question, I suppose, is “how did they fix it?”: if the weak link is really the QuickTime playback engine, wouldn’t client machines still inherit these issues? Hopefully I’m asking this clearly enough…
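The “dull colors” complaint above is usually a decode-gamma mismatch. A minimal numeric sketch (the 1.8 vs. 2.2 values are the classic QuickTime/legacy-Mac pairing; the exact curves in any given player may differ):

```python
# A mid-grey value as stored in the file, encoded for a 2.2-gamma display
encoded = 0.5

# Player decodes with the matching 2.2 gamma: the intended result
correct = encoded ** 2.2      # about 0.22 in linear light

# Player assumes 1.8 instead (the classic QT mismatch): midtones lift
washed_out = encoded ** 1.8   # about 0.29 in linear light, visibly lighter

print(correct, washed_out)
```

Note the file itself is identical in both cases, which is why VLC and QuickTime 7 could show the same movie differently.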

  • Robert Brown

    December 6, 2011 at 1:30 am

[Walter Soyka] “After Effects, for example, does color management brilliantly.”

Really? I never got AE’s CM. Maybe I should read more about it, but I just leave it off. Nuke is my favorite, as everything gets converted to linear on the way in, with viewing LUTs to compensate for what your computer monitor will do to it. Then it gets converted to whatever space you need on the way out.

I’d like a good explanation of how AE does it, as I really like AE, but it seems whenever I turn on Color Management, something goes wrong somewhere and doesn’t look right. Nuke has individual control over input, monitoring, and output, which to me makes a lot of sense. With AE I’m never quite sure what it is doing to what.

    Robert Brown
    Editor/VFX/Colorist – FCP, Smoke, Quantel Pablo, After Effects, 3DS MAX, Premiere Pro

    https://vimeo.com/user3987510/videos

  • Oliver Peters

    December 6, 2011 at 1:44 am

    Jim,

    I’m glad it’s working better for you, but the gamma issue is a QT playback issue more than anything else, so it really doesn’t matter how it looks in FCP X. I’ve never had a problem with how I see things in FCP 7 and the output in broadcast, within the tolerances of display calibrations. Same for FCP X and Premiere Pro.

    I recently had a spot done in FCP X which I cut in ProRes at an SD size. I simply could not get a proper looking conversion through Squeeze using the exported ProRes from FCPX. I was converting to H264 and WMV. If I took that file into FCP 7 and rendered an uncompressed version and exported it, that looked correct. By correct, I mean it looked the same as it did in FCP X. A direct H264 export using the share function or Compressor also looked fine. So ColorSync sort of works with some codecs and in a closed environment, but not universally. Some of this appears to be codec dependent.

    Ultimately this has little if anything to do with any of the NLEs. The problem is QT as a player and decoder, most of the time, because QT is trying to second-guess the image in order to make it right for your display.

    Oliver

    Oliver Peters Post Production Services, LLC
    Orlando, FL
    http://www.oliverpeters.com

  • Christian Schumacher

    December 6, 2011 at 1:49 am

Given the myriad of digital acquisition formats in today’s environment, I’d say “one size does not fit all,” as FCPX has displayed problems in this regard as well. Granted, I had those in 10.1, but there’s a third-party app that deals with this problem. It’s FCPX compatible. YMMV.

    Gamma Shift Detector
    Detects gamma shifts between source and destination media. It will tell you if the shift directly affects the pixels in the image or if it is just a mismatch in metadata.

    https://www.digitalrebellion.com/promedia/
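I can’t speak for how Digital Rebellion’s tool works internally, but the source-vs.-destination idea it describes can be sketched: fit an effective exponent between corresponding pixels of the two files. A result near 1.0 means the pixels are untouched and any visible shift is likely just a metadata/tagging mismatch; an exponent well away from 1.0 means the shift was baked into the image.

```python
import numpy as np

def estimate_gamma_shift(src, dst, eps=1e-4):
    """Estimate the exponent g such that dst is approximately src ** g.

    g near 1.0  -> pixels unchanged (suspect a metadata mismatch)
    g far from 1.0 -> the gamma shift is baked into the pixel values
    """
    s = np.clip(np.asarray(src, dtype=float), eps, 1.0)
    d = np.clip(np.asarray(dst, dtype=float), eps, 1.0)
    # Least-squares fit of log(d) = g * log(s)
    ls, ld = np.log(s).ravel(), np.log(d).ravel()
    return float(np.dot(ls, ld) / np.dot(ls, ls))

src = np.linspace(0.05, 0.95, 64)
print(estimate_gamma_shift(src, src))                 # ~1.0: metadata-only
print(estimate_gamma_shift(src, src ** (2.2 / 1.8)))  # ~1.22: baked in
```

In practice you would sample matching frames from the two files and mask out clipped blacks and whites before fitting.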

  • Walter Soyka

    December 6, 2011 at 2:19 am

[Robert Brown] “Really? I never got AE’s CM. Maybe I should read more about it, but I just leave it off. Nuke is my favorite, as everything gets converted to linear on the way in, with viewing LUTs to compensate for what your computer monitor will do to it. Then it gets converted to whatever space you need on the way out.”

    AE’s color management system works with ICC profiles, converting all incoming footage to a common space for processing, then optionally converting them to separate profiles for display and render.

Imported items are assigned a color profile through Interpret Footage (as in Nuke’s Read node). These are translated into the project’s user-defined working space, set in the Project Settings (which may optionally be linearized, as in Nuke). For display, the resulting composited image can be converted for viewing using the display’s native profile, or other profiles can be simulated on the display (like Nuke’s viewer LUT). On output, a comp may be rendered in the working space (by default) or converted to a separate output space in the output module’s Color Management tab (like Nuke’s Write node).
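To make the shape of that pipeline concrete (AE really uses ICC profile transforms; this NumPy sketch substitutes a simple assumed 2.2 gamma for the profile conversions, purely as an illustration of the import → working space → output flow):

```python
import numpy as np

# Pure-gamma stand-ins for real ICC profile transforms
def to_linear(v):
    return v ** 2.2          # footage profile -> linearized working space

def to_encoded(v):
    return v ** (1 / 2.2)    # working space -> display/output profile

footage = np.array([0.8, 0.2, 0.2])  # hypothetical tagged footage, 0-1 range

working = to_linear(footage)         # the "Interpret Footage" step
comped = working * 0.5               # compositing math happens in linear light
output = to_encoded(comped)          # the "output module" conversion on render

print(output)
```

A 50% dissolve or exposure change computed in linear light like this lands differently (and more photographically) than the same multiply done on gamma-encoded values, which is the main payoff of a linearized working space.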

    Adobe published a white paper on Color management workflow in After Effects CS4 [link] which also applies to CS5 and CS5.5.

If you have any more questions, stop by the AE forum. Color management comes up from time to time, and maybe we could help you get more comfortable with it.

    Walter Soyka
    Principal & Designer at Keen Live
    Motion Graphics, Widescreen Events, Presentation Design, and Consulting
    RenderBreak Blog – What I’m thinking when my workstation’s thinking
    Creative Cow Forum Host: Live & Stage Events

