Creative Communities of the World Forums

The peer-to-peer support community for media production professionals.


  • Arri Mscope Challenge

    Posted by Sacha Sojic on February 12, 2010 at 5:39 pm

    Hello to all,

    I’m head editor on a Québec feature that we are shooting in Cuba until mid-March, and I’ve got a bit of a doozy for the Expressions community today. It concerns Arri’s proprietary Mscope format. For those of you not familiar with the process, there is a detailed explanation of the format on Arri’s website, along with a white paper that explains the process. For the purposes of this post, however, here is the broad outline in a few paragraphs.

    Mscope is a shooting format developed by Arri for use with the D-21 camera that allows the capture of true anamorphic material using HD technology. In this mode, the camera utilizes the entire height of the 4:3 sensor, producing an image which consists of 1920×1440 HD pixels. Mscope then splits this image into two (4:2:2/PsF) HD-SDI streams. The A-channel contains all even lines of the original full frame (with line count starting from 0), while the B-channel contains all odd lines. Each stream has a resolution of 1920×720 active pixels with an additional 180 blank lines on the top and bottom of the frame to fill up the unused portion of the 1080 frame.
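    To make the split concrete, here is a sketch in pure Python (not Arri's actual implementation, just a model of the description above). A frame is modelled as a list of scan lines; the contents of each 1920-pixel line don't matter for the split, so each line is just a value here:

```python
def mscope_split(full_frame, pad=180, blank=None):
    """Split a 1440-line full frame into two 1080-line HD streams.

    full_frame: list of 1440 scan lines (line 0 at the top).
    Channel A gets the even lines (0, 2, 4, ...), channel B the odd
    lines; each 720-line half is padded with `pad` blank lines on the
    top and bottom to fill a 1080-line HD frame.
    """
    evens = full_frame[0::2]           # lines 0, 2, 4, ... -> 720 lines
    odds = full_frame[1::2]            # lines 1, 3, 5, ... -> 720 lines
    padding = [blank] * pad
    chan_a = padding + evens + padding   # 180 + 720 + 180 = 1080 lines
    chan_b = padding + odds + padding
    return chan_a, chan_b

# A toy 1440-line frame whose "lines" are just their own indices:
frame = list(range(1440))
a, b = mscope_split(frame)
# a[180] is original line 0, b[180] is original line 1, and so on.
```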

    These HD streams can be recorded synchronously on any HD recorder with dual-stream recording capability (such as the SRW-1) or even on two separate HD recorders. The latter is the option that we are using on our feature: both streams are recorded using 2 separate AJA KiPro recorders. The result is 2 separate ProRes HQ QuickTimes for each take (odd lines and even lines), both with matching timecode.

    Though each clip has half the vertical resolution of the original frame, the picture quality is perfectly suitable for on-set monitoring. Another added benefit is that each stream is a normal looking (non-squeezed) image and (in ProResHQ) is of a quality good enough (stunning actually) for offline editing.

    During the DI, as a first step, the two HD streams are recombined to produce a high-resolution, squeezed, Mscope image. Here is a simple figure illustrating the process: [figure not reproduced]

    This is where I solicit your help in trying to automate this process using After Effects Expressions. I am wondering if it is possible to import both clips (for a given take) into a 1920×1440 comp and selectively and alternately map their lines in the comp according to the figure above.

    Macros exist for Fusion and for Quantel systems, but none that I could find for After Effects. Mscope is a fairly new and still seldom-used process, but it is one that produces some of the most stunning images I’ve ever seen. It would be great to find a way to integrate AE into the compositing workflow for Mscope projects.

    Best,

    Sacha Sojic
    Havana, Cuba

  • 7 Replies
  • Kevin Camp

    February 12, 2010 at 7:09 pm

    this sounds really interesting… can we get a sample of the two 1920×720(1080) images that make up the 1920×1440 frame?

    i’m kind of wondering if the 3d glasses effect might be useful… what you essentially will have is a stereo pair of frames, which is what the 3d glasses effect is designed to combine into one frame. it’s just that your frames come from the same camera (rather than two separate cameras).

    my concern is that the 3d glasses effect is expecting two frames that are the same size as the final frame, so it may do some vertical scaling with interpolation, which is not what you want. you want to separate each line and insert lines from the other image… or possibly scale it vertically by means of line duplication (no interpolation) and replace every other line with lines from the other image.

    you may be able to use draft settings to prevent ae from interpolating when scaling and then combine the lines…

    but these are all things that would need to be tested.
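    the duplicate-then-replace idea above can be sketched outside of AE (pure Python, frames modelled as lists of lines; the crop down to the 720 active lines is assumed to have happened already):

```python
def double_lines(img):
    """Nearest-neighbour 2x vertical scale: duplicate every line,
    with no interpolation between neighbouring lines."""
    out = []
    for line in img:
        out.append(line)
        out.append(line)
    return out

def interleave_by_replacement(even_img, odd_img):
    """Duplicate-scale one image to full height, then overwrite every
    other line with lines from the other image."""
    merged = double_lines(even_img)   # rows 0..1439, each line doubled
    merged[1::2] = odd_img            # odd rows replaced wholesale
    return merged

evens = [f"E{i}" for i in range(720)]
odds = [f"O{i}" for i in range(720)]
merged = interleave_by_replacement(evens, odds)
assert merged[:4] == ["E0", "O0", "E1", "O1"]
```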

    Kevin Camp
    Senior Designer
    KCPQ, KMYQ & KRCW

  • Sacha Sojic

    February 13, 2010 at 11:34 pm

    Hi Kevin,

    I’ll try to get clearance to post 2 frames from our camera tests. However, the best way to test a solution is to use 2 colored frames (a blue and a red one, for instance). A successful merge is then plainly obvious, much more so than using actual footage in my opinion.
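    For what it's worth, the pass/fail criterion of this red/blue test can be stated precisely (a sketch, with each scan line reduced to a single RGB value):

```python
RED, BLUE = (255, 0, 0), (0, 0, 255)

def verify_interlace(merged):
    """Check the red/blue test: every line must be exactly pure red or
    pure blue, in strict alternation. Any intermediate hue means the
    channels were interpolated rather than cleanly interleaved."""
    for i, line in enumerate(merged):
        expected = RED if i % 2 == 0 else BLUE
        if line != expected:
            return False
    return True

# A clean interleave of a 720-line red frame and a 720-line blue frame:
clean = [RED, BLUE] * 720
assert verify_interlace(clean)

# A merge that blended lines (e.g. bilinear scaling) fails the test:
blended = clean[:]
blended[0] = (128, 0, 127)   # a purple edge line gives interpolation away
assert not verify_interlace(blended)
```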

    Thanks for the 3D glasses suggestion. I hadn’t thought of giving that effect a shot. I’ll try and see if it leads anywhere.

    Best,

    Sacha Sojic

  • Kevin Camp

    February 15, 2010 at 5:00 pm

    [Sacha Sojic] “the best way to test a solution is to use 2 colored frames (a blue and a red one, for instance). A successful merge is then plainly obvious, much more so than using actual footage in my opinion.”

    3d glasses does well with this test, if you compensate for the 720 lines in a 1080 frame of the original footage…

    there are two ways that can work:

    method 1:

    create two comps, one for even lines and one for odd that are 1920×720 (square px, since it sounds like the footage will be square px). this will crop out the black letter boxing. drop the appropriate footage into each and then bring those comps into a 1920×1440 comp (anamorphic px).

    then create a new layer (comp size) and add the 3d glasses effect. set the left view to the odd comp and the right view to the even comp, then set the 3d view to ‘interlace upper L lower R’ — note, i’m assuming that the odd lines footage relates to the upper field and even lines footage is the lower field with this mscope footage splitting technique, but that may not be the case…

    method 2:

    drop the two pieces of footage into a 1920×1440 anamorphic comp, create a new comp sized solid and add the transform effect. uncheck the uniform scale option and set the scale height to 150. this is to effectively crop the 120 pixels off the top and bottom of the frames when they get re-interlaced.

    then add the 3d glasses effect after the transform effect. use the same settings as above.

    using 2 colored solids as the test, both methods produce perfectly interlaced, 1920×1440 anamorphic final frames…

    what this test doesn’t show is if there is any interpolated scaling to the original frames that would degrade the final frame. that’s where actual footage/frames would be most helpful. obviously, you can create either method above and test it with real footage. if it doesn’t work, there are some other ways that might work…

    the real trick is to figure out how not to have ae do interpolated scaling, and not be too difficult to do. you can definitely avoid the scaling issue by creating 720 precomps (one for each line) for both the even and odd line footage, then putting those 1440 comps into a final comp all spaced out and in the right order, but that’s a lot of work, and it would be nice to not have to do that…
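    the brute-force per-line layout above boils down to a simple placement rule (a sketch; the 720 one-line precomps per channel reduce to an index calculation):

```python
def place_lines(even_img, odd_img):
    """Place each one-line strip at its own row in a 1440-line comp:
    even-source line i lands on row 2*i, odd-source line i on row
    2*i + 1. No scaling is involved, so nothing can interpolate."""
    comp = [None] * (len(even_img) + len(odd_img))
    for i, line in enumerate(even_img):
        comp[2 * i] = line
    for i, line in enumerate(odd_img):
        comp[2 * i + 1] = line
    return comp

evens = [f"E{i}" for i in range(720)]
odds = [f"O{i}" for i in range(720)]
assert place_lines(evens, odds)[:4] == ["E0", "O0", "E1", "O1"]
```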

    Kevin Camp
    Senior Designer
    KCPQ, KMYQ & KRCW

  • Sacha Sojic

    February 16, 2010 at 1:37 am

    That’s very cool, Kevin. I’ll give it a try. I’ll get back to you regarding the interpolation issue once I’ve tested it with actual footage.

    Sacha Sojic

  • Sacha Sojic

    February 17, 2010 at 5:03 pm

    Kevin,

    Just ran a pair of shots through 3D glasses. It turns out the plugin is, in fact, interpolating. My first hint came when I tried the method with the 2 colored frames. Although it seemed to be interlacing properly, the topmost and bottommost lines of the comp were of a darker hue, indicating that the colored solids had in fact been stretched to 1440 pixels (producing an anti-aliasing effect along those edges) before half of their lines were dropped and interlaced together.

    Performing the test with footage confirms this. The resulting image is blocky and is identical in appearance to a single channel being stretched vertically to 1440 pixels.

    Great try though. Do you have anything else in mind? And given the fact that this thread is taking a “Non-Expressionist” tangent, should I cross-post it to the other AE forums?

    Sacha Sojic

  • Kevin Camp

    February 17, 2010 at 8:08 pm

    you might post this in the ae forum too… you may get more responses.

    but you may be able to get the 3d glasses to work by setting the 3d layer’s layer quality to ‘draft’ rather than best — the quality setting looks like a diagonal line in the switches panel, click it to make it look like a jaggy line to set it to draft.

    if it does work, you’ll need to set the ‘best settings’ to use draft quality in the render settings. don’t use the ‘draft’ preset, as it changes other settings, like resolution and such… if it does work, you could create your own ‘mscope’ preset with the settings that work.

    Kevin Camp
    Senior Designer
    KCPQ, KMYQ & KRCW

  • Sacha Sojic

    February 22, 2010 at 5:43 pm

    Hi Kevin,

    Sorry for the late reply. Access to the net here in Cuba is spotty at best, courtesy of uncles Raul and Fidel…

    I had already thought of giving the draft setting a try, as you suggested in your original post. It was a success. The plugin does stretch each frame vertically 200% before it interlaces both channels, but in draft mode it effectively doubles each line without interpolating. That’s perfect.
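    The arithmetic behind why draft mode works out exactly (a sketch of the logic, not of the plugin's internals): once each channel is line-doubled rather than interpolated, the interlace keeps only untouched original lines.

```python
def stretch_200_no_interp(img):
    """Draft-quality 200% vertical stretch: each line is duplicated."""
    return [line for line in img for _ in range(2)]

def interlace_upper_lower(left, right):
    """Take even output rows from `left`, odd rows from `right`
    (both already stretched to the full 1440-line height)."""
    out = left[:]             # even rows keep the left image's lines
    out[1::2] = right[1::2]   # odd rows come from the right image
    return out

evens = [f"E{i}" for i in range(720)]
odds = [f"O{i}" for i in range(720)]
merged = interlace_upper_lower(stretch_200_no_interp(evens),
                               stretch_200_no_interp(odds))
# Because every stretched line is an exact duplicate, each kept row is
# an unresampled original line, and the result is a perfect interleave:
assert merged == [line for pair in zip(evens, odds) for line in pair]
```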

    I conformed a small part of the edit and combined the channels using this method and I obtained a perfect Mscope anamorphic sequence. Wow.

    We were already in awe of our great AJA KiPro and Final Cut workflow. Thanks for your help in bringing AE into the fold.

    Best,

    Sacha Sojic
