Creative Communities of the World Forums

The peer to peer support community for media production professionals.

Forums › Apple Final Cut Pro › Best way to monitor with FCPX–what’s your take

  • Jack Guthrey

    July 2, 2012 at 4:57 pm

    Of course, Tim makes a great point.

    The best monitoring I ever had was waaayyy back in High School when I used this terrible little CRT TV. It just happened to be the exact same TV all the classrooms at the school (which I was producing for) had. I always knew exactly roughly (again, crappy CRTs) what my audience would see (unless teachers turned it off).

    Anyway, watch on what your audience watches on.

    Jack Guthrey
    Carolinas Account Representative
    Marshall Graphics Systems

  • Michael Hadley

    July 2, 2012 at 5:08 pm

    Right. Broadcast is not the issue since that is not our deliverable.

    And we’re lucky because our clients don’t visit our editing suite–we do production as well, so we just post cuts on a client web page we create.

    I guess we are indeed down the rabbit hole in terms of standards. I guess my point is: is it helpful to start with monitoring in a broadcast workflow setup (via MXO or Blackmagic, etc.), or is that really beside the point?

    Bear in mind that of course we use (software) scopes to check black levels, white levels, the vectorscope for skin tones, etc. Then we do our style grade as we’d like to.

    But bloody hell, finding the best way to reference is like quicksand.

    I will say this:

    We shoot a lot these days with the Sony F3 and Canon C300 (both great). When I monitor live on set via SDI into a 17″ Panasonic monitor, it does indeed look just like what I get on the back end, looking at the image on an ACD pumped right out of FCPX.

    Of course, I’ve not done a real side-by-side comparison so it’s a bit from memory.

  • Jeremy Garchow

    July 2, 2012 at 5:25 pm

    I use broadcast monitoring as the utmost in reference.

    It is the only way that I can assure myself that what I am seeing and what I am delivering is accurate, even if it’s a web delivery. If it’s inaccurate on the broadcast monitor, those inaccuracies will generally follow through the entire project.

    Of course, I sometimes have to deinterlace for the web, but I can at least ensure that everything looks correct before that final step, and the best place to do that, for me, is through baseband video.

    Your experiences may be different,

    Jeremy

  • Tim Wilson

    July 2, 2012 at 7:57 pm

    I’m a big fan of reference monitoring, and no matter what your output is going to be, you should be working at the highest resolution you can, in the best conditions….

    …but there’s a long way between Dreamcolor and Dolby (which my phone just tried correcting to “Dobby”). If it’s web, how calibrated do you need to be beyond a big, good computer monitor? I’m asking because I don’t know. Do you need REC XXX monitoring for computer RGB output?

    And my being a dingus about more colors available on a PC than a Mac notwithstanding, you do need to take into account your computer colorspace vs. the viewer’s, and gamma maybe even more so as you evaluate platform differences.
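Tim’s gamma point can be made concrete with the two published transfer curves. A minimal Python sketch, assuming idealized curves only (the function names are illustrative, and real platform differences involve the whole display pipeline, not just these formulas):

```python
# Rec. 709 and sRGB encode the same scene value with different curves,
# which is one reason footage can read lighter or darker across platforms.

def rec709_oetf(l):
    """Rec. 709 opto-electronic transfer: linear scene light (0-1) -> signal."""
    return 4.5 * l if l < 0.018 else 1.099 * l ** 0.45 - 0.099

def srgb_encode(l):
    """sRGB encoding: linear value (0-1) -> signal."""
    return 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

mid_gray = 0.18  # 18% scene gray
print(round(rec709_oetf(mid_gray), 3))  # roughly 0.409
print(round(srgb_encode(mid_gray), 3))  # roughly 0.461
```

The same 18% gray lands at noticeably different signal levels under the two curves, so a grade judged on one response can shift on the other.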

    And to Oliver’s question: the projector used for a boardroom is different than one for an auditorium or theater, could be in different colorspaces, etc.

    So I don’t mean to oversimplify when I say, hmm, maybe a good computer monitor can get ‘er done for video being watched on a computer, but I sympathize with wanting to avoid the rabbit hole.

    Tim Wilson
    Vice President, Editor-in-Chief
    Creative COW Magazine
    Twitter: timdoubleyou

    The typos here are most likely because I’m, a) typing this on my phone; and b) an idiot.

  • Jeremy Garchow

    July 2, 2012 at 8:16 pm

    [Tim Wilson] “If it’s web, how calibrated do you need to be beyond a big, good computer monitor? I’m asking because I don’t know. Do you need REC XXX monitoring for computer RGB output? “

    REC XXX? You will most likely watch that on a computer monitor these days. 😉

    It’s more about how you shoot it. Most likely, you shot 709 HD, so why not monitor in 709 HD?

    There are more colors available on a computer monitor signal, yes, but it’s more about the black and white levels in video as opposed to computers. They are different.

    Using baseband video eliminates many variances in a pure computer based pipeline. For one, it’s hard to calibrate a computer monitor to 709 video at the proper black and white level as many GPUs don’t support it.
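The black/white level difference Jeremy describes is easy to see in numbers: 8-bit broadcast video puts black at code 16 and white at 235 (“studio swing”), while computer RGB uses the full 0–255 range. A minimal Python sketch of the luma scaling, with illustrative function names:

```python
# Studio-swing (16-235) vs. full-range (0-255) 8-bit levels.
# Mapping video black/white onto computer black/white is a linear rescale.

def studio_to_full(v):
    """Map a studio-swing luma code (16-235) to full range (0-255)."""
    return round((v - 16) * 255 / 219)

def full_to_studio(v):
    """Map a full-range value (0-255) into studio swing (16-235)."""
    return round(v * 219 / 255 + 16)

print(studio_to_full(16))   # video black -> 0
print(studio_to_full(235))  # video white -> 255
```

If a display or GPU skips this rescale, video blacks look milky and whites look dim, which is the calibration mismatch being described.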

    If you have interlacing in your program, monitoring on just a computer monitor will not show what is really going on, as many times you are monitoring the signal at less than 100% of the video pixel size.

    It is getting easier. The AJA T-TAP and a Thunderbolt-enabled Mac will make this super fall-down easy (and accurate), and it will be available for those TBolt PCs we have been hearing about.

    As always, after it leaves the shop, there’s nothing you can do about it. People might have their computer monitors set to some weird monitor profile and it will be off, but there’s nothing new in that regard.

    Jeremy

  • Bill Davis

    July 3, 2012 at 5:07 am

    Not sure if it’s instructive, but in my early career in radio, it was SOP in every recording studio I worked in to switch OFF the big full-range monitors before you finalized any radio spot, and route your fresh recording to a pair of semi-crappy Auratone cube speakers on the recording console bridge, in order to hear what your work would sound like in the real world of car and crappy transistor radios.

    This discussion is largely about the same thing.

    Monitoring at some point as close to the listening (or viewing!) experience of your audience as possible is, IMO, always a smart thing.

    That said, I’ve come to trust my X system enough that I don’t worry nearly as much as I did back in my analog days. My monitoring may not be perfect – but the X scopes are good enough to show me that my work isn’t going so far out of accurate as to be noticeable.

    Particularly since the more modern corporate brand books my clients are sending out all spec things like logo colors in RGB values – and I’m coming to believe that if I reflect those values correctly on my end, the digital realm should provide a transmitter translation that I can functionally trust.

    Then it’s just up to me not to “correct” finals so much whilst chasing a “look” that I screw those values up.

    FWIW.

    “Before speaking out ask yourself whether your words are true, whether they are respectful and whether they are needed in our civil discussions.”-Justice O’Connor

  • T. Payton

    July 3, 2012 at 5:51 am

    [Bill Davis] “Auratone cube speakers on the recording console bridge”

    That is just what I have in my studio! In fact, when I mix audio, I get it sounding good on my JBLs and the Auratones, and then I put a copy on my iPhone and listen to it with my $9 Target earbuds. Only when I listen to it on my earbuds and then subsequently in my car does it really sound “real” to me. It is an interesting phenomenon because I “know” how those little earbuds react and what a good mix sounds like on those car speakers.

    It is interesting that the majority of the time I spend in my studio, I am listening to my own mixes and looking at my own footage, but not the mixes or shooting of others, and frankly it shouldn’t be that way. I think it is critical to choose a monitoring station, whether it be audio or video, that you know how it will respond, because you are used to hearing or looking at well-mixed or graded material on it.

    I had a friend at a recording studio who, every time I visited him, was listening to well-mixed songs. When he had to set up and fiddle with equipment, he would be listening. Then when he was actually working, his ears were well trained to know what he was aiming for.

    Something I used to do years ago, when I did a lot of broadcast spots, was record several breaks from the show that our spot was going to be running on. Then I would edit our spot into the break, make a VHS of it, and bring it to my conference room TV and home TV. Then I would sit and watch a bit of the show with the other spots in context, get a bit “lost” in the show, and then all of a sudden I would see one of my spots. Sometimes my spot would look horrible; other times I would be delighted it fit in so well.

    Also, don’t discount the benefit of having another large LCD TV in your studio so you can push back from your editing chair and experience your edit like your viewers will.

    ——
    T. Payton
    OneCreative, Albuquerque

  • Tim Wilson

    July 3, 2012 at 5:52 am

    I haven’t mentioned this, but if I ever saw anyone monitoring broadcast or feature footage on a computer monitor, I’d take a bat to it. Nonononono.

    But we’re not talking about best practices for those kinds of settings. The question seemed simple to me: my viewers will be seeing my work on a computer, and only on a computer…or a phone, I guess. Is a computer monitor okay for this?

    [Jeremy Garchow] “Most likely, you shot 709 HD, so why not monitor in 709 HD?”

    Because my audience isn’t seeing 709 and never will. You’re talking about an absolute, which is fine, but I think relative is far more important because….

    [Jeremy Garchow] “…it’s hard to calibrate a computer monitor to 709 video…”

    Which is my point for Michael. He’s delivering to a computer monitor, which doesn’t support 709. He needs to monitor at something closer to what his viewers are overwhelmingly most likely to see – web movies delivered to a computer or phone.

    [Jeremy Garchow] “If you have interlacing in your program, monitoring on just a computer monitor will not show what is really going on as many times you are monitoring the signal at less than 100% of the video pixel size.”

    Another question I don’t know the answer to: I know that viewing interlaced footage on a progressive monitor is begging for disaster, but what if you’re seeing deinterlaced video on a progressive monitor, when ALL of the people seeing your video are also viewing on a progressive monitor? That has to happen at some point in the grading process. What’s wrong with making that first step and monitoring in a way that matches the final output?
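    For what that deinterlacing step actually does, here is a minimal sketch of the crudest approach, a single-field “bob” (keep one field, line-double it back to full height). Real deinterlacers are motion-adaptive; the frame representation here is just illustrative.

```python
# Single-field "bob" deinterlace: keep the top field, duplicate each line.
# Half the vertical detail (the bottom field) is simply thrown away.

def deinterlace_bob(frame):
    """Keep the top field (even rows) and line-double it to full height."""
    progressive = []
    for row in frame[0::2]:      # rows 0, 2, 4, ... = top field
        progressive.append(row)  # original field line
        progressive.append(row)  # duplicated to restore frame height
    return progressive

# A toy 4-line "frame": odd rows belong to the discarded bottom field.
frame = [[10, 10], [20, 20], [30, 30], [40, 40]]
print(deinterlace_bob(frame))  # [[10, 10], [10, 10], [30, 30], [30, 30]]
```

This is why monitoring the interlaced original on a progressive screen can mislead: the display (or player) is silently doing some version of this, and different tools do it with very different quality.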

  • Jeremy Garchow

    July 3, 2012 at 3:22 pm

    [Tim Wilson] “But we’re not talking about best practices for those kind of settings. The question seemed simple to me: my viewers will be seeing my work on a computer, and only on a computer…or a phone I guess. Is a computer monitor okay for this?”

    I guess I read it differently. I read it as, “I have a Matrox and an LCD monitor, and FCPX. I read about ColorSync on Philip’s website. What’s working?”

    If you have the Matrox and monitor, why not use it?

    It will give you the most accurate representation of what you shot.

    Monitoring on just your NLE screen is not monitoring the final web deliverable. You are monitoring a lo-res proxy of your scaled footage. It’s not deinterlaced, it’s not scaling at a high quality, and a ton of information is getting thrown out.

    If you were somehow able to do a real-time encode out to a computer monitor using the exact compression settings you are going to use for your web deliverable (or worse, the compression of the compression when sending to YouTube et al.), then and only then would you truly be seeing the final deliverable.

    He has the gear, stay above ground, let the hounds chase the rabbits down the hole, we stay up top to smoke peace pipes and enjoy warm ales to fight off the scurvy.

    [Tim Wilson] “Another question I don’t know the answer to: I know that viewing interlaced footage on a progressive monitor is begging for disaster, but what if you’re seeing deinterlaced video on a progressive monitor, when ALL of the people seeing your video is also viewing on a progressive monitor? That has to happen at some point in the grading process. What’s wrong with making that first step and monitoring in a way that matches the final output?”

    So you are saying you’d like to deinterlace all your footage before editing?

    I guess you could do that. You could also color correct all your footage before you edit, and resize it down to 640×360 before editing. You could.

    I, personally, think it’s probably not the best use of resources.

    Yes, most likely people won’t see it in 709, but you shot in 709. It’s a recommendation (rec), not an absolute.

    Your computer NLE monitor is not showing your final deliverable either so why not keep control of what you can control?

  • Michael Hadley

    July 3, 2012 at 6:38 pm

    Well, in truth, in the past I have used an ACD (which is indeed 1920×1080) and a nice Samsung which is full 1920×1080. So, no downscaling when monitoring. I used the Matrox with FCP7. But the original thread was about the need for the Matrox when using X and a true 1920 monitor, based on the NEW WAY that X supposedly handles color space data.

    Agree that a laptop monitor is NG for critical evaluation (although, anecdotally, it is hard to distinguish from the full-res HD ACD).

    Again, everything is QC’d via scopes.

    And again, anecdotally, I have to say what X gives back out of the box on the HD ACD looks DAMN CLOSE if not the same as what we see when we are shooting and monitoring an SDI signal into a Panasonic 17″ field monitor that’s been calibrated properly. In fact, it looks better/closer than what we see when we go through the Matrox chain and calibration.

    It’s a head spinner.

