Mike Most
Forum Replies Created
-
You might want to try swapping the GUI card with the ATI and see if that enables the system to see both Titans. And make sure you don’t have “use display GPU for compute” enabled in Preferences/Video I/O and GPU.
-
Have to ask this: Are you running full Resolve (i.e., with a dongle) or Resolve Lite?
-
Mike Most
May 16, 2015 at 1:08 am in reply to: MC 8.3, Mac OS 10.9.5, Matrox MXO2 LE (v4.2) – no video output
Did you enable the “use hardware” button? (By default it lives on the right side of the button bar at the top of the timeline window.) If that button shows a red circle with a slash through it, the hardware will not be used. Click it and make sure it’s enabled (i.e., no circle/slash) before concluding that your hardware is faulty.
-
Is there anything else in the expander? And do you happen to have a Red Rocket X card in this system?
-
All of these things are choices. Red made color matrix selection part of their overall conversion routines, but that is partly because they also chose to offer color paths within that conversion setup that require no further manipulation, at least for Rec709 display. So they give you a choice of multiple color matrices (or none, which is what Camera RGB is), a choice of multiple gamma curves that include Rec709-compatible output (Redgamma 1, 2, 3, 4, etc.), and one that puts you into log space (RedlogFilm), which is compatible with a pipeline that would normally include a LUT to get you to your chosen display gamma. Sony puts the color matrix in the LUT path, much the same as Arri. But both Sony and Arri offer LUTs that do not implement a color matrix, so the user has both flexibility and responsibility in these things. Also, Arri allows you to apply a P3-targeted color matrix directly to LogC ProRes files internally, but most users prefer to apply it (or not) as a post process instead.
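To make the log-space idea concrete, here is a minimal sketch of a camera log encode. The constants happen to be Arri’s published LogC (EI 800) log-segment coefficients, used here only to show the shape of such a curve; the full curve also has a linear toe near black, which this sketch omits, and RedlogFilm uses different constants entirely.

```python
import math

# Log segment of a typical camera log encode. Constants are Arri's
# published LogC (EI 800) coefficients; the real curve also has a
# linear toe below a cut point near black, omitted here for brevity.
def log_encode(x, a=5.555556, b=0.052272, c=0.247190, d=0.385537):
    return c * math.log10(a * x + b) + d

# 18% grey lands around 0.391 -- near the middle of the code range,
# leaving lots of headroom above it for highlights. That headroom is
# why a LUT is then needed to reach a display gamma like Rec709.
print(round(log_encode(0.18), 3))
```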
As for Arri’s math, I have no direct knowledge of what they’re doing, but my guess is that it’s a fairly straightforward 3×3 matrix, developed with deep knowledge of the sensor characteristics and what their color scientists feel is the best representation of the captured color. This is supported by the fact that if you use their ACES IDT and look at the scene-referred result through a Rec709 RRT/ODT, you get a very similar color palette to that obtained from the Rec709-targeted LUT approach.
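For what it’s worth, a 3×3 matrix of that kind is a very simple operation. Here’s a minimal sketch; the coefficients are invented for illustration (they are not Arri’s), chosen only so that each row sums to 1.0, which is what keeps neutral greys neutral:

```python
import numpy as np

# Hypothetical 3x3 color matrix -- illustrative values only, NOT a real
# camera's coefficients. Each row sums to 1.0 so greys are preserved.
M = np.array([
    [ 1.50, -0.35, -0.15],
    [-0.10,  1.30, -0.20],
    [-0.05, -0.25,  1.30],
])

def apply_matrix(rgb, matrix):
    """Apply a 3x3 color matrix to a scene-linear RGB pixel."""
    return matrix @ np.asarray(rgb, dtype=float)

# A neutral pixel stays exactly [0.18, 0.18, 0.18] (rows sum to 1.0)...
print(apply_matrix([0.18, 0.18, 0.18], M))
# ...while a washed-out color is pushed further from grey, i.e. saturated:
print(apply_matrix([0.30, 0.20, 0.20], M))
```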
-
>>Arri’s implementation of log is somewhat unique in that it requires a Reference Rendering Transformation (in the form of a 3D LUT) to bring the colours up to the saturation of the original scene shot.
That is not unique; it is completely normal and common with nearly every RAW format sourced from a Bayer pattern sensor. The color gamut of nearly all digital cinema cameras is much wider than Rec709, so a simple deBayer to Rec709 colorspace is going to yield an image with very low saturation. Getting the saturation back to where it “should” be, and getting it correct across the color spectrum, is best done by a color matrix operation, which is what the Rec709 color matrix in the Arri LUT builder does. For Red, the Redcolor settings are in fact settings for the color matrix. For the Sony F55 and F65, the supplied LUTs from Sony implement a color matrix appropriate to the colorspace you capture in (SGamut2 or SGamut3). My point is that the use of a color matrix to correct for saturation when displayed on a Rec709 display is normal and common across virtually all Bayer sensor digital cinema cameras. It is not “unique” to the Alexa.
And just to use the terminology correctly: the term Reference Rendering Transform is usually used when talking about ACES, whose developers coined it to refer to a specific combination of lookup tables and matrix transforms used to yield an “idealized” image from values stored in ACES space. A color matrix is a color matrix. The Reference Rendering Transform is something else, and much more complex than that.
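The gamut mismatch is easy to demonstrate with published numbers: take a very saturated red that a wide-gamut camera space can represent and express it in Rec709 primaries, and some channels go negative, meaning the color simply does not fit. The XYZ-to-Rec709 matrix below is the standard published one; the test chromaticity is roughly the ACES AP0 red primary, used here only as an example of a far-outside-Rec709 color.

```python
import numpy as np

# Standard XYZ -> linear Rec709/sRGB matrix (published in the specs)
XYZ_TO_709 = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

# A very saturated red chromaticity (roughly the ACES AP0 red primary),
# converted from xyY (with Y = 1.0) to XYZ.
x, y = 0.7347, 0.2653
XYZ = np.array([x / y, 1.0, (1 - x - y) / y])

rgb709 = XYZ_TO_709 @ XYZ
# Green and blue come out negative: the color is outside the Rec709
# gamut, which is why a naive conversion reads as desaturated.
print(rgb709)
```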
-
Saturation, particularly blue saturation, will not be correct. That’s what the Rec709 color matrix in the Arri LUTs made with the “Rec709” setting provides, and what many users don’t seem to understand. The purpose of the LUT is not just to implement a proper gamma curve; it also applies a saturation matrix to yield proper color saturation across the color spectrum. Simply turning up the saturation control in a color grading program does not provide the same thing, although some seem to prefer that result on some material.
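The distinction can be sketched in a few lines: a grading saturation control typically just scales each pixel’s distance from its own luma, identically for every hue, while a matrix applies per-channel crosstalk that can treat blue differently from red. The matrix coefficients below are invented for illustration, not any camera’s real values:

```python
import numpy as np

LUMA = np.array([0.2126, 0.7152, 0.0722])  # Rec709 luma weights

def sat_knob(rgb, s):
    """Typical grading saturation control: scale chroma
    (distance from luma) by a single factor s, same for all hues."""
    rgb = np.asarray(rgb, dtype=float)
    y = float(LUMA @ rgb)
    return y + s * (rgb - y)

# Invented camera matrix (rows sum to 1.0 so greys stay neutral).
# Note the blue row applies much stronger crosstalk than the red row.
M = np.array([
    [ 1.20, -0.10, -0.10],
    [-0.05,  1.15, -0.10],
    [-0.10, -0.40,  1.50],
])

c = np.array([0.25, 0.20, 0.40])  # a bluish test color
print(sat_knob(c, 1.5))  # same hue direction, just "more" of it
print(M @ c)             # blue boosted much harder than red: a different result
```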
-
File names, folder numbers, and the like mean nothing to Avid. They’re present so that you know what you have, but Avid doesn’t care about any of them. All it cares about is the time code and tape name in the MXF headers. Given the editors’ concerns, you’d probably be better off importing without the original sizing and not using an Avid roundtrip workflow. Just make sure you’ve selected the proper criteria for retaining original file names as the tape name (i.e., in the Conform Options section of the general setup page, choose assist using Source File Name, assuming that’s what they used in the first place) and render individual clips with handles.
The editors should be able to relink any existing sequence by using Relink to Selected. They’ll need to import your clips (they can just drag the .mdb file that Avid creates from the numbered folder into a new bin), select all of those clips, select the sequence, and relink using tape and time code. That way your media replaces their original media in the same timeline, so everything will be applied as it was originally. For this to work, you’ll need to make sure you’re not applying any effects in your Resolve project.
-
>>There is nothing proprietary in FilmLight’s approach…
Well, no, not if you’re referring to the use of standard XML tags. But the approach is completely proprietary in the sense that the file cannot be properly interpreted, and the values properly applied, without a Baselight or a FilmLight Baselight Editions plugin.
I think what’s being overlooked in this discussion is that the use of ACES virtually ensures that the data being presented to the VFX artist is already in scene-referred, linear-light space. That is usually preferred in compositing because it behaves in a way similar to real-world physics, modeling the properties of “real” light. It also operates in a floating point space, ensuring that no clipping takes place through the various compositing operations. ACES is by design very VFX friendly.
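A quick numeric illustration of why floating-point, scene-linear data matters in compositing (the pixel values here are arbitrary):

```python
import numpy as np

a = np.array([1.4, 0.9, 0.3])  # scene-linear: values above 1.0 are legal
b = np.array([0.8, 0.7, 0.2])

# Float, scene-referred: add the light, then pull exposure down a stop.
# Highlight ratios survive because nothing was clipped along the way.
float_result = (a + b) * 0.5  # [1.1, 0.8, 0.25]

# A display-referred, 8-bit-style pipeline clips each source at 1.0
# before compositing, so the red/green ratio is destroyed by the clip.
clipped_result = (np.clip(a, 0, 1) + np.clip(b, 0, 1)).clip(0, 1) * 0.5
print(float_result, clipped_result)  # [1.1 0.8 0.25] vs [0.5 0.5 0.25]
```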
-
Mike Most
February 16, 2015 at 5:52 pm in reply to: Kill me please – GPU memory is full or reduce number of correctors
“I’m doing it for free” doesn’t change the technical requirements. They are what they are, and your system doesn’t meet them. That’s it.