Forum Replies Created

Page 1 of 3
  • Fantastic! Thanks, Alex!

  • I forgot to check this post over the holidays, so pardon my late reply.

    This is a fairly common workflow at many major post houses that my colleagues and I have worked at here in LA on network and cable TV shows. They’ve all had proprietary scripts and plugins that handled the process under the hood, however. Maybe your experience has been different in Iowa.

    Here are some links to other VFX artists and supervisors who speak about this exact workflow in various compositing applications:

    Bradley Friedman’s blog post on achieving this workflow in Nuke:
    https://www.fie.us/2014/08/30/grain-management-101/

    By the way, Friedman wrote this blog post because Nuke has a dedicated plugin, F_Regrain, for this exact feature (which suggests the workflow isn’t wrong at all), and his script works even better than F_Regrain.

    A Lynda tutorial from Steve Wright that discusses this grain workflow among others:
    https://www.lynda.com/Nuke-tutorials/28-101-Smart-grain-management-workflows/450279/702995-4.html

    A Reddit discussion on various grain workflows including this one:
    https://www.reddit.com/r/vfx/comments/8fjnit/what_are_your_grain_workflows/

    We like to degrain all our plates so that, yes, green screen keys can be pulled more cleanly, but also to make our lives easier doing paint, cleanup, and any other compositing work that doesn’t involve 3D renders. If we’re using freeze-framed parts of a plate for roto/paint work, we don’t want to freeze-frame the grain in that area. We could degrain that single patch and then regrain it, but that would introduce artifacts at the edges of the patch that take a lot of time to finesse. So degraining the whole plate, doing the comp work, and regraining the whole plate at the end is simply more time-efficient.
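    The whole-plate ordering described above can be sketched in plain JavaScript. This is only an illustration of the step order, with hypothetical stand-in functions (the real degrain/regrain would be a plugin or script like the ones linked above):

    ```javascript
    // Hypothetical stand-ins for the real tools; each takes a "plate"
    // (here just an object tagged with the steps applied so far) and
    // returns a new plate with one more step recorded.
    const degrain = (plate) => ({ ...plate, steps: [...plate.steps, "degrain"] });
    const comp    = (plate) => ({ ...plate, steps: [...plate.steps, "comp"] });
    const regrain = (plate) => ({ ...plate, steps: [...plate.steps, "regrain"] });

    // Whole-plate workflow: degrain once up front, do all the comp work
    // grain-free, then restore matching grain in a single pass at the end.
    const result = regrain(comp(degrain({ name: "shot_010", steps: [] })));
    // result.steps → ["degrain", "comp", "regrain"]
    ```

    The point of the ordering is that grain is removed and restored exactly once for the whole frame, so no patch edges ever need finessing.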

    And time is really at the heart of this question. When you’ve got a couple hundred VFX shots to pump out in a week and a half, an automatic degrain-regrain plugin saves you the entire step of adding an Add Grain effect and tweaking the channel intensities and sizes per shot. It’s a huge time saver. We’ve found that Match Grain rarely replicates the grain to any level of accuracy, and a manually tweaked Add Grain is more reliable… yet time-consuming.

    And believe it or not, grain is something our clients love to nitpick, so we’ve got to be pretty accurate with it. We’ve had shots rejected for minor grain matching issues.

    So, let’s open up the question to others with experience in this area: anybody have any idea how to translate Bradley Friedman’s Nuke script (https://www.fie.us/2014/08/30/grain-management-101/) to After Effects?

  • Adhish Yajnik

    May 28, 2016 at 8:31 pm in reply to: Applying loopIn() to an existing expression

    Thanks Dan! That worked!

  • Adhish Yajnik

    September 8, 2013 at 10:11 pm in reply to: Randomize Mask Order

    Well, time to learn scripting then! 🙂

  • Even so, once the opaque shade of gray becomes transparent through the course of the animation, it blends with its non-opaque neighbors.

    I found a workaround: I turned all my layers into 3D layers and wrote an expression that moves a layer 5 units back on the Z axis once its Position property starts animating it around. That way the fading boxes sit physically behind the opaque boxes, which gives the effect I wanted.
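    A minimal sketch of that workaround’s logic in plain JavaScript (zOffset and the 5-unit value are my own illustrative names; the real thing lives in an After Effects expression):

    ```javascript
    // Hypothetical stand-in for the logic inside the expression: given the
    // current comp time and the time of the first Position keyframe, return
    // how far (in units) to push the layer back on the Z axis.
    function zOffset(time, firstKeyTime, offset = 5) {
      // Before the position animation starts, leave the layer where it is;
      // once it starts animating, move it behind its neighbors.
      return time >= firstKeyTime ? offset : 0;
    }

    // The equivalent After Effects expression on the layer's Position
    // property might look something like this (untested; standard
    // expression API assumed):
    //   k = position.numKeys > 0 ? position.key(1).time : Infinity;
    //   time >= k ? position + [0, 0, 5] : position;
    ```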

  • Thanks so much for the advice and info.

    I hadn’t realized how easy it was to work off the R3D files in DaVinci, which is why I thought I’d have to transcode regardless and possibly start with a primary pass in REDCINE-X.

    Chris, you mentioned that the repos will carry over… to DaVinci? And will they carry over from the 720 proxies on which they were applied to the 4K, 2K, and 1080 online media without being affected by the resolution difference?

    — Adhish

  • Adhish Yajnik

    August 3, 2011 at 1:54 am in reply to: Creating webcam effect with frame rates

    Worked like a charm! Thanks!

  • Adhish Yajnik

    October 2, 2010 at 6:28 pm in reply to: XDCAM workflow: Avid to After Effects and back

    Hi Chris,

    Thanks for giving me a straight answer.

    We are linking to AMA volumes in Avid 5 to ingest our footage from the BPAV folder generated on the SxS cards.

    I did indeed export from AE as ProRes in the Rec. 709 (16-235) color space, and upon importing into Avid, I made sure that the MXF Avid created used the appropriate XDCAM preset and the 709 color space as well, and it was perfectly fine.

    And I set up my AE project to work in the 709 color space before even importing my raw footage into it, and then set it to export in the working space.

    To the rest of you, I’m sorry you’re not running our universities these days. I’ve gotten used to dealing with the bureaucracy by learning about workflows and compression and such myself and I guess I’ll have an advantage over my classmates when we get out into the real world. For now, I just want to make my movie.

    Thanks again.

    ~ Adhish

  • Adhish Yajnik

    October 1, 2010 at 7:11 pm in reply to: XDCAM workflow: Avid to After Effects and back

    Don’t intend to green screen anything, so no worries there… but just so you know, I’ve managed to pull good keys from some pretty badly shot interlaced handycam footage using mostly tips and tricks from the Creative Cow tutorials. And little to no roto work.

    And this film is festival eligible, so if selected, we’re screening from an HD beta tape master. And then it will be seen by people outside of school.

    So yeah, I think I will knock myself out in my all-HDV workflow, since we can’t all have the luxury of shooting with expensive cameras, editing on high-end systems, and not having to deal with a little bureaucracy. Not even when we’re making blockbusters. But I’m sure you know that, right, Dave?

    ~ Adhish

  • Adhish Yajnik

    October 1, 2010 at 6:29 pm in reply to: XDCAM workflow: Avid to After Effects and back

    Although I may be of above-average intelligence when it comes to the technological side of film and video, most of my classmates prefer to focus on the less rigorous artistic side of the filmmaking process. As a result, the university mandates certain project formats and settings so that students with little to no knowledge of things as simple as codecs and bitrates don’t cause large-scale problems on their Avid systems. With the kind of ridiculous time constraints placed on us to shoot, edit, and do sound for our films, there isn’t a whole lot of time to run tons of tests with different codecs. We’re given settings and we have to stick to them, otherwise our films can’t be screened. Sorry to disappoint you about the hopeless state of education these days.

    And I just did a couple of comparison renders from AE using the Apple ProRes, Animation, XDCAM HD (50 Mb/s), and XDCAM EX (35 Mb/s) codecs. There was no appreciable difference in image quality between any of them. I guess Sony just has a pretty good idea of what they’re doing. So I think I’d be better off working with the least cumbersome file size, generated by the 35 Mb/s bitrate.

    I’ll bring all of these test files to the lab, see which is the smallest file that works in Avid, and then render the rest of my effects shots with the same settings.

    ~ Adhish

