Christopher Reig
Forum Replies Created
-
[Richard Sanchez] “Rather than set your shot log to ‘Maintain Events as Logged’, I would usually drag the AAFs into bins to create my master clips, then select all clips with the Shot Log set to ‘Merge with known master clips’ to apply the CDL and Sound TC information that is usually contained in the ALE.
I would still provide the ALE, as I’ve seen metadata (particularly Sound TC) magically disappear, which requires me to manually re-enter it based on the Sound TC burn, or re-merge the ALE (which can sometimes be dangerous, because as Michael has mentioned before, the merge feature is not a true merge and can obliterate existing custom metadata on clips if you’re not careful). Speaking of burns, I generally ask for a flash timecode burn with the video TC and filename on the bottom left, and the Sound TC and Sound Roll flash-burned into the bottom right.”
Thanks for your reply, Richard!
I have always been very wary of ALE importing, for the aforementioned reasons. It really is very easy to nuke your metadata if you aren’t 100% sure of what you are doing.
When it comes to burn-ins, I usually set those up with the editor at the start of the show. Sometimes they want flash burns with the crucial information, other times they want it persistent through the whole thing. It’s the same deal with aspect ratio masking. Sometimes they prefer to receive the footage ‘open gate’, if it’s a 2.40 film from a 1.78 source image. Other times, they just say ‘send it to me masked, as it was shot’.
As for the colour, it’s interesting you mentioned the CDL import via ALE. Most of the time, the colour decisions set by the DP are baked in when creating the offline media. But it does intrigue me to be able to apply a colour transformation of the ‘dailies grade’ combined with a corresponding camera LUT. It gives the best of both worlds, especially when some slight colour changes are needed in editing.
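For anyone following along: when CDL values travel in an ALE, they usually ride along as extra columns (commonly ASC_SOP and ASC_SAT, holding the slope/offset/power triples and saturation). Since an ALE is just tab-delimited text, a rough Python sketch of pulling those values out might look like this. This is a simplified illustration with made-up sample data, not any particular vendor’s implementation:

```python
import re

def parse_ale_cdl(ale_text):
    """Parse an ALE (tab-delimited Avid Log Exchange) and return
    per-clip CDL values from the ASC_SOP / ASC_SAT columns."""
    columns, rows = [], []
    section = None
    for line in ale_text.splitlines():
        stripped = line.strip()
        if stripped in ("Heading", "Column", "Data"):
            section = stripped
            continue
        if not stripped:
            continue
        if section == "Column" and not columns:
            columns = line.split("\t")          # first line after "Column" names the fields
        elif section == "Data":
            rows.append(dict(zip(columns, line.split("\t"))))

    cdls = {}
    for row in rows:
        # ASC_SOP packs slope, offset, power as three "(r g b)" triples
        triples = re.findall(r"\(([^)]*)\)", row.get("ASC_SOP", ""))
        if len(triples) == 3:
            slope, offset, power = (tuple(map(float, t.split())) for t in triples)
            cdls[row["Name"]] = {
                "slope": slope,
                "offset": offset,
                "power": power,
                "sat": float(row.get("ASC_SAT", 1.0)),
            }
    return cdls

# Made-up sample ALE with CDL columns
sample = (
    "Heading\nFIELD_DELIM\tTABS\nFPS\t25\n\n"
    "Column\nName\tASC_SOP\tASC_SAT\n\n"
    "Data\n"
    "A001C001\t(1.02 0.98 1.00)(0.01 -0.005 0.00)(1.00 1.00 1.10)\t0.9\n"
)
print(parse_ale_cdl(sample))
```

From there you could hand the values to whatever applies the actual transform; the parsing is the easy half.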
Would you care to elaborate on what the process is when you ‘apply the CDL’ during an ALE import?
Best,
Christopher
-
Christopher Reig
June 2, 2014 at 12:58 am in reply to: Ingesting offline media – ALE, AAF, thoughts?
Hi Michael,
One of my greatest ‘pipe dream’ wishes for MC is to be able to perform the sync in another dailies tool, carry the audio TC (and, if possible, slip offset), and have MC ‘point’ to the original BWF, creating sync clips based on this metadata. A kind of automation, if you will. But that’s just fantasy-talk.
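For what it’s worth, the arithmetic behind that kind of TC-based auto-sync is simple enough. Here’s a hypothetical Python sketch; the function names and the convention of expressing sub-frame slip in 35mm perfs (4 perfs per frame) are my own illustration, not anything Media Composer exposes:

```python
def tc_to_frames(tc, fps):
    """Convert a non-drop HH:MM:SS:FF timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def sync_offset(video_tc, audio_tc, fps=25, slip_perfs=0):
    """Frames into the audio file where the picture starts, with an optional
    sub-frame slip expressed in 35mm perfs (4 perfs per frame)."""
    offset = tc_to_frames(video_tc, fps) - tc_to_frames(audio_tc, fps)
    return offset + slip_perfs / 4.0

# e.g. picture starts 2s 3f after the recorder rolled, then slipped late by 2 perfs
print(sync_offset("01:00:02:03", "01:00:00:00", fps=25, slip_perfs=2))  # → 53.5
```

Given that offset and a pointer to the original BWF, building the sync clip is really just bookkeeping, which is why it feels like it should be automatable.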
I will definitely have a read of your blog! I just skimmed it now, and there are some very informative articles.
Best,
Chris
-
Christopher Reig
June 1, 2014 at 10:01 pm in reply to: Ingesting offline media – ALE, AAF, thoughts?
Hello Michael and Pat,
Thank you both for your detailed replies. It makes much more sense to me now.
Michael, I know very well the problematic linking of audio tracks you speak of with an ALE. Generally speaking, I usually prepare vision-only DNxHD 36 MXF media for offline editing. I have always preferred to sync up the audio in Media Composer itself, bringing in the BWAV files from the sound recordist into their own bin for the day. It is very quick to create sync clips, strip out all but the mix track, and slip to perf as required.
I do also sync in my dailies tools for other deliverables, but I have always been cautious of doing this for the editorial media. It does send the audio TC and sound roll information via the ALE, but without knowing more about how this will affect an audio conform later on, I have always erred on the side of caution and done it in Media Composer.
Pat, a very interesting note on the finishing side. For episodic TV, usually they conform the camera originals in a Baselight or Resolve suite, perform the colour timing, and then round-trip the mastered DNxHD media back to Media Composer / Symphony for the final output with audio. For most of the film projects I work on, I am actually not sure what they finish on, but I know it’s not often Avid.
Thank you again for your advice and wisdom. It is greatly appreciated.
Best wishes,
Christopher
-
Hi Andrew,
I know this thread is quite old now, but thought I’d chime in with my two cents if you (or others) were still interested, having used both platforms to regularly produce dailies.
Colorfront ExD / OSD are both geared toward dailies, and just dailies, so they offer quite a few things over Resolve 10 that are indispensable when doing dailies. The first is audio sync.
The physical interface for syncing the audio is much more intuitive, especially when you encounter no-timecode situations, or have a lot of drift to correct. Furthermore, it allows you to slip in 1/4-frame increments, which is really useful. It also allows you to perform some wonderful little tricks, like attaching two audio clips to a single shot (you’d be surprised how often I hear ‘wait, cut…… no, wait, actually, hold the roll’, and end up with a single camera take with two audio clips to match). Small things, but really quite useful for dailies.
The second thing is the whole ‘workflow’ for logging metadata, creating notes, and finally rendering the files. And this is a big one. Different deliverables often carry different naming conventions. Whilst editorial may prefer the source clip name for their MXF / ProRes media, the production team will almost certainly want it organised by scene-take. Being able to specify how to name each deliverable is a huge time saver. Plus, the reporting system isn’t too bad, either. One thing I will hand to Resolve is that the media page is quite good for entering shot information, especially en masse.
(Slightly off-topic: Assimilate Scratch has THE best interface for this kind of stuff. It’s actually my tool of choice for dailies. But that’s a whole other thread.)
Lastly, I would say the rendering system is fantastic as well. Both offer great visual quality, but where Resolve falls down is simultaneous renders. You can certainly create additional outputs on the deliver page, but only at the same frame size as the primary render. That can work, but if you want 1080 for the editorial team, 720 for iPads, and 480 for web delivery, you are out of luck: you will need to run a separate render pass for each frame size. The same goes for aspect ratio blanking. Colorfront / Scratch are far more configurable in these respects.
Now, with all that being said, you can still perform the same functions in Resolve, but it does take a little more work (and a few workarounds). You can use an ALE of the timeline to create reports in Excel / Word, and create scripts or text files to automate the renaming of your output. You can also create additional timelines with and without masking for each deliverable, etc. So you can certainly use it for dailies, as long as you can work with or around these limitations.
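As an example of that kind of renaming script, here is a hypothetical Python sketch that renames clip-named renders to scene/take names using a mapping you might pull from a timeline ALE. The mapping shape and naming template are my own assumptions, and a dry-run flag keeps it from touching anything until you’ve checked the plan:

```python
from pathlib import Path

def rename_by_scene_take(render_dir, mapping, dry_run=True):
    """Rename clip-named renders (e.g. A001C001.mov) to Scene_Take names,
    using a {clip_name: (scene, take)} mapping pulled from a timeline ALE.
    With dry_run=True, only report what would happen."""
    renames = []
    for path in sorted(Path(render_dir).iterdir()):
        key = path.stem                       # "A001C001" from "A001C001.mov"
        if key in mapping:
            scene, take = mapping[key]
            target = path.with_name(f"{scene}_{take}{path.suffix}")
            renames.append((path.name, target.name))
            if not dry_run:
                path.rename(target)
    return renames
```

Run it once with `dry_run=True`, eyeball the list it returns, then flip the flag. The same pattern works for generating production reports: same ALE, different output.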
I am quite hopeful that Resolve will implement these sorts of things bit by bit. I mean, they have done an amazing job when you consider what Resolve was 4 years ago and what it is now. But as of right now, this is where I feel it stands from a dailies perspective.
Hope this information is of help, to you or to someone reading this thread eons later. I would love to hear what you ended up going with in the end, and how it has worked out for you.
All the best!
Christopher
-
Hello Sascha,
I am glad to hear that you got the project out okay. It’s funny you should mention shooting the entire piece at 50FPS. I am currently working on an action film, and we are shooting quite a lot of our VFX wire-removal shots at 44 and 48 FPS for samples, speeding back up in Media Composer.
If you are shooting raw on your F5 / F55 / F65, image capture metadata such as ISO and white balance can be manipulated in post-production. If you are shooting in the XAVC / HDCAM SStP formats, you won’t have those options. But you certainly don’t need raw to produce great images, especially given that you can capture S-Log to all formats; countless shows I have worked on have bypassed raw completely. And you know what, they looked great.
I had a quick look at some support documents for DaVinci Resolve. It can very much support either the XAVC or HDCAM SStP formats that the Sony F5 shoots on-board. I also found a great chart in the Resolve manual, detailing which effects Resolve can import via AAF from Media Composer. Included among them are position, scale and rotation, as well as linear and variable speed changes. Could be worth noting for the future, if you decide to go the conform route.
Now that I think of it, Sareesh Sudhakaran over at WolfCrow.com wrote up a great piece on the Avid to Resolve workflow, which details quite a bit of this process. I’ll paste a link below.
Best wishes,
Christopher
https://wolfcrow.com/blog/the-avid-to-resolve-workflow/
-
[Marc Wielage] “It helps to have dozens and dozens of people in the background helping. The issue of post deadlines and edit changes has never been worse than it is right now, and I don’t see this problem getting any better. Clients have a bad tendency to have unrealistic expectations, particularly when presenting 100 changes in a single reel that have to be fixed in a few hours. It’s our job to rise to the challenge and fix them as much as humanly possible.”
You are so right about that. Not long ago on a show I worked on for NBCUniversal, they were re-shooting an entire scene about 2 days before it went to air. In a different country. Was a busy two days.
As far as feature films go, every film is its own beast. Some of them are cut, coloured, and out they go. Others (especially VFX-heavy films) have VFX shots coming into the mix right up to the last minute. For the trailers, as Marc said, it’s a pool of shots approved for marketing, cut a million different ways.
-
MC 7.0.3 works on nMP 10.9.2, both with an AJA Kona 3G and Decklink 4K. About all I have to contribute on that one.
-
Hi Sascha,
From my experience (and others here will likely have more), it usually comes down to two routes.
The first is to conform the edit in your coloring system to the camera originals, and send the graded shots back to Avid.
The second is to create a ‘baked master’ inside of Media Composer, exporting to a high quality format (DNxHD 220X / ProRes 422 HQ for 4:2:2 sources, or Avid DNxHD 444 / Apple ProRes 4444 for 4:4:4 sources). Then, using an EDL as a guide, you splice it up in the color suite, grade it, and send the program back to Avid.
The advantage of the first option is that you can obviously grade from the camera originals, allowing you access to things like camera raw control. The primary disadvantage is that many effects don’t translate from NLE to color system, so you usually have to remove most effects before sending an AAF, grade it / render it, then re-apply them in the NLE to the graded footage.
The advantage of the ‘baked master’ is that your effects are all applied as-is, and that you are only sending what you need to the color system, mostly ready to go. The trade-off is that certain grades may be trickier, and all transitions are baked in without a ‘handle’ on either side, so you need to take a little extra care with things like keyframing through transitions.
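Part of what makes the EDL such a handy splice guide is that a CMX3600-style event is just a fixed-field text line. A rough Python sketch of reading the cut points out of one (the sample EDL below is made up for illustration):

```python
import re

# One CMX3600 event line: event number, reel, track, transition, then
# source in/out and record in/out timecodes.
EVENT = re.compile(
    r"^(\d+)\s+(\S+)\s+(\S+)\s+(\S+)\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})"
)

def parse_edl(text):
    """Return a list of cut events from a CMX3600-style EDL."""
    events = []
    for line in text.splitlines():
        m = EVENT.match(line.strip())
        if m:
            num, reel, track, trans, src_in, src_out, rec_in, rec_out = m.groups()
            events.append({
                "event": int(num), "reel": reel, "track": track,
                "transition": trans,
                "src": (src_in, src_out), "rec": (rec_in, rec_out),
            })
    return events

sample = """TITLE: BAKED_MASTER_R1
FCM: NON-DROP FRAME
001  BAKED    V     C        01:00:00:00 01:00:05:12 01:00:00:00 01:00:05:12
002  BAKED    V     C        01:00:10:00 01:00:14:08 01:00:05:12 01:00:09:20
"""
print(parse_edl(sample))
```

With the record in/out points in hand, re-creating the cuts on the baked master in the color suite is mostly mechanical. (This sketch ignores dissolves, comments, and drop-frame handling, which a real conform would need to respect.)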
What format did the camera acquire in? There may be an advantage to doing the re-framing and scaling in your color system, but if you’re mastering to a 720p output, there may not be. It all depends.
Please feel free to hit me back or PM me. Always happy to share some advice.
Best wishes, and good luck with your project!
Christopher