Creative Communities of the World Forums

The peer-to-peer support community for media production professionals.

Creative Community Conversations: BMCC alternate workflow

  • Gary Adcock

    September 6, 2012 at 5:53 am

    [Oliver Peters] “Also tried 16-bit TIFFs with the QT refs. That doesn’t work. They don’t like 16-bit TIFFs. “

    Yes.
    I have to admit that I have had some serious issues trying to work in 16-bit using every one of the available tools, especially when using full-range 16-bit DPX frames from the Sony F65.

    The only tool whose conversion results I was happy with was Adobe CS6 Media Encoder, which I used to make 4K 60p ProRes 422 versions of those DPX frames so that I could do a rough edit, and then I ran into the problem that nothing on the desktop really handles 4K resolution very well.

    This project was shot for a new product launch at IBC.

    The 4096×2160 at 59.94 fps in ProRes 422 requires about 200 MB/s for playback.

    I cut the entire project in REAL TIME on my Retina MBP in FCPX, working over Thunderbolt to a Promise R6 array, and generated about 18 TB of content in 4 days. (I was using 2 laptops to create media due to a Mountain Lion issue: one ran 10.7 to handle specifics, like the Sony F65 viewer app, which does not run under 10.8. All the editing, though, was done on my Retina.)

    The final edited project weighs in at approximately 1.4 TB for a 3:30 TRT when relinked back to the 16-bit DPX sequences.
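    As a rough sanity check on those sizes (a sketch only; real DPX files add a header of a few kilobytes, and the 1.4 TB figure would presumably also include handles or a second sequence):

```python
# Back-of-envelope math for the figures above. Frame geometry is from the
# post; everything else is straightforward arithmetic.

def dpx_frame_bytes(width, height, bits_per_channel=16, channels=3):
    """Uncompressed RGB payload of one frame (DPX header ignored)."""
    return width * height * channels * bits_per_channel // 8

frame = dpx_frame_bytes(4096, 2160)   # one 16-bit 4K frame
fps = 59.94
trt_seconds = 3 * 60 + 30             # 3:30 TRT
sequence = frame * fps * trt_seconds

print(f"one frame: {frame / 1e6:.1f} MB")           # ~53.1 MB
print(f"3:30 of frames: {sequence / 1e12:.2f} TB")  # ~0.67 TB per pass
```

    One clean pass of 16-bit DPX comes to roughly 0.67 TB, which puts the 1.4 TB delivered figure in a plausible range once handles and duplicated frames are counted.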

    gary adcock
    Studio37

    Post and Production Workflow Consultant
    Production and Post Stereographer
    Chicago, IL

    https://blogs.creativecow.net/24640

    follow me on Twitter
    @garyadcock

  • Gary Adcock

    September 6, 2012 at 5:56 am

    [Jeremy Garchow] “Why stop there? BMDCC XYZ is an image sequence with RAW. Let’s do it all!”

    ah…

    Most of the higher-end formats use some form of sequential frames.

    That list includes ARRIRAW, DPX, and the BMCC’s CinemaDNG format, but don’t forget OpenEXR for the VFX guys.
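    For anyone new to frame-based formats, the mechanics are simple: each frame is its own file with a counter in the name, and the "movie" is just the files in numeric order. A minimal sketch (the file names are hypothetical; the pattern is the same for .dpx, .ari, .dng, or .exr):

```python
import re

def frame_number(name):
    """Pull the trailing frame counter out of e.g. 'shot01_0001023.dng'."""
    m = re.search(r"(\d+)\.\w+$", name)
    return int(m.group(1)) if m else -1

files = ["shot01_0000101.dng", "shot01_0000099.dng", "shot01_0000100.dng"]
sequence = sorted(files, key=frame_number)

# Conform tools also check for dropped frames in the numbering:
numbers = [frame_number(f) for f in sequence]
missing = [n for n in range(numbers[0], numbers[-1] + 1) if n not in numbers]
print(sequence[0], "->", sequence[-1], "missing:", missing)
```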

    gary adcock
    Studio37

    Post and Production Workflow Consultant
    Production and Post Stereographer
    Chicago, IL

    https://blogs.creativecow.net/24640

    follow me on Twitter
    @garyadcock

  • Gary Adcock

    September 6, 2012 at 6:02 am

    [Rafael Amador] “I’ve been making too some tests with Photoshop (importing as 8b and 16b) and then bringing the files to AE (difference), and there is no doubt that the .dmg’s are more than 8b depth.”

    Herein lies one of the issues in modern imaging.

    Existing 16-bit workflows are designed around VFX pipelines;
    however, I am using CAMERA-GENERATED 16-bit files, not something created in Photoshop or Maya, and yes, I am seeing different responses with the media.

    gary adcock
    Studio37

    Post and Production Workflow Consultant
    Production and Post Stereographer
    Chicago, IL

    https://blogs.creativecow.net/24640

    follow me on Twitter
    @garyadcock

  • Rafael Amador

    September 6, 2012 at 12:12 pm

    [gary adcock] “Existing 16bit workflows are designed around VFX workflows,
    however I am using CAMERA GENERATED 16bit files, not creating something in Photoshop or Maya and yes I am seeing different responses with the media.”

    I haven’t made the files myself, but I’ve used the same John Brawley BMCC .dng files that Oliver has been using for his tests.
    I guess that when working in Photoshop (or any other graphics application), the main caveat is choosing and keeping the proper color profile.
    rafael

    http://www.nagavideo.com

  • Walter Soyka

    September 6, 2012 at 1:19 pm

    [Jeremy Garchow] “Why stop there? BMDCC XYZ is an image sequence with RAW. Let’s do it all!”

    I’m in.

    My point there, though, was that these advanced formats that you and Gary mention require much more than the ability to treat a set of sequential stills as a movie.

    Plain-vanilla image sequence support by itself would get you things like TIFFs, but not all image formats are created equal. For any of the RAW image formats, you have to add debayering and RAW-interpretation options. For OpenEXR, you need meaningful multi-channel image support (which brings with it a lovely can of worms on the compositing side).
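    To make the debayering point concrete, here is a toy bilinear demosaic of an RGGB Bayer mosaic (a sketch only; any real debayer, FCPX's included, is far more sophisticated). It shows the interpretation step that plain image-sequence support does not give you:

```python
import numpy as np

def conv3(img, k):
    """3x3 convolution with edge replication (no SciPy dependency)."""
    p = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=np.float64)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def debayer_rggb(mosaic):
    """Bilinear demosaic: interpolate each channel from its photosites."""
    h, w = mosaic.shape
    y, x = np.mgrid[0:h, 0:w]
    r_mask = ((y % 2 == 0) & (x % 2 == 0)).astype(float)
    b_mask = ((y % 2 == 1) & (x % 2 == 1)).astype(float)
    g_mask = 1.0 - r_mask - b_mask
    kg = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0    # green kernel
    krb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0   # red/blue kernel
    m = mosaic.astype(np.float64)
    return np.dstack([conv3(m * r_mask, krb),
                      conv3(m * g_mask, kg),
                      conv3(m * b_mask, krb)])

# A flat grey scene: every photosite reads 0.5, so the demosaic should
# return 0.5 in all three channels away from the image edges.
rgb = debayer_rggb(np.full((8, 8), 0.5))
```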

    I am merely cautioning that you should be careful what you feature-request: you might just get it!

    Walter Soyka
    Principal & Designer at Keen Live
    Motion Graphics, Widescreen Events, Presentation Design, and Consulting
    RenderBreak Blog – What I’m thinking when my workstation’s thinking
    Creative Cow Forum Host: Live & Stage Events

  • Jeremy Garchow

    September 6, 2012 at 1:36 pm

    [Rafael Amador] “I guess that when working on Photoshop (or whatever other Graphic application), the main caveat is on choosing and keeping the proper Color Profile.”

    Ironically, FCPX seems to hint at having this capability.

    We just need to know what is happening, and of course we need more options, see here:

    [attached image: giveuscontrolplease.png]

    Jeremy

  • Jeremy Garchow

    September 6, 2012 at 2:00 pm

    [Walter Soyka] “I am merely cautioning that one should be careful what they feature request — you might just get it!”

    I’m over it! 🙂

    We are discussing an almost-3K camera for almost $3K that shoots 12-bit RAW DNG.

    Our cheap post tools should be able to work with the cheap production tools.

    FCPX has a direct line to the Aperture library, which has RAW capability. Changes in Aperture are immediately reflected in FCPX. RAW control panels should be “easy” to add. Those are huge quotation marks around easy, by the way.

  • John Heagy

    September 6, 2012 at 2:10 pm

    If people appreciate inventive workflows like this one, enabled by QuickTime reference movies, please ask Apple to add reference-movie creation to AVFoundation. The lack of reference-movie support in AVFoundation is why FCPX can’t export one.

    If anyone wants to send feedback, I suggest using “Provide Final Cut Pro Feedback…” under the “Final Cut Pro” menu in FCPX, as well as Apple’s feedback pages for both OS X and QuickTime:

    https://www.apple.com/feedback/quicktime.html

    https://www.apple.com/feedback/macosx.html

    Thanks
    John

  • Walter Soyka

    September 6, 2012 at 3:13 pm

    [Rafael Amador] “QT supports 16b RGB (TIFF/PNG still sequences and Microcosm QT Movies) even if QT Player can’t process or display more than 8b.”

    I’ve just tested 16-bit Microcosm movies with Rafael, and they don’t seem to work in FCPX at all (black screen). AE CS6 reads them correctly.

    Walter Soyka
    Principal & Designer at Keen Live
    Motion Graphics, Widescreen Events, Presentation Design, and Consulting
    RenderBreak Blog – What I’m thinking when my workstation’s thinking
    Creative Cow Forum Host: Live & Stage Events

  • Walter Soyka

    September 6, 2012 at 3:40 pm

    [Rob Mackintosh] “As Oliver noted in an earlier post, FCPX optimizes all stills to ProRes 422. Do you think this may be influencing the results of your tests. I wouldn’t have thought FCPX was using these files an intermediate codec for rendering deep formats, but who knows. Starting with a 16 bit RGB file, transcoding it to a 10 bit YCbCr intermediate, processing it in 32 Bit float linear RGB then exporting as 16 bit RGB would seem less than optimal.”

    FCPX does not seem to be creating optimized media from my 16b TIFFs — there is no “High Quality Media” folder in my Events folder.

    I created a 16-bit temporal ramp (RGB values increase by 1 on a 0-32767 scale on each frame), saved out 16-bit TIFFs, imported them into FCPX, cut them in a frame at a time (a poor man’s image sequence), and exported a TIFF sequence.

    FCPX preserved the full 16-bit depth, as shown by examining the exported frames. The values were not clipped to 10-bit in the middle of the pipeline.

    However, like the 10b TIFF test, the results were very close but not mathematically identical. The first error — only a single bit in all channels — occurred after 26 frames. Certainly tolerable in practical applications (though, going back to a completely hypothetical OpenEXR example, this would actually very, very slightly alter non-image data stored in image buffers), but it does indicate there’s some transformation or quantization going on somewhere in the FCPX processing pipeline.
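    For reference, the ramp-and-diff test can be sketched like this (frame generation and the comparison pass only; real 16-bit TIFF I/O would need a library such as Pillow or tifffile, and the single-bit error on frame 26 is simulated here to mirror the observed result):

```python
import numpy as np

def ramp_frame(n, size=(16, 16)):
    """Frame n of the temporal ramp: every RGB sample holds value n."""
    return np.full(size + (3,), n, dtype=np.uint16)

def max_bit_error(original, roundtripped):
    """Largest per-sample deviation between source and exported frames."""
    return int(np.max(np.abs(original.astype(np.int32)
                             - roundtripped.astype(np.int32))))

src = [ramp_frame(n) for n in range(64)]

# Simulate a pipeline that introduces a single-bit error on frame 26:
out = [f.copy() for f in src]
out[26] += 1

errors = [max_bit_error(a, b) for a, b in zip(src, out)]
first_bad = next(i for i, e in enumerate(errors) if e)
print(first_bad, errors[first_bad])  # 26 1
```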

    Walter Soyka
    Principal & Designer at Keen Live
    Motion Graphics, Widescreen Events, Presentation Design, and Consulting
    RenderBreak Blog – What I’m thinking when my workstation’s thinking
    Creative Cow Forum Host: Live & Stage Events

