Forum Replies Created

  • Yes, sorry about that last question…
    I meant XYZ, not DCI-P3…
    sorry, my brain is tired !!

    Well… thanks a lot for all those explanations, Bram !
    They were really helpful. ; )

    (And I promise I’ll stop with my DisplayPort ramblings !!)

  • Nooooo !!! You’re a great teacher !! Really !!!
    But… like I said, we can’t learn Chinese in a few days !
    I’ve been editing for almost 15 years, but I know I’m still really far from absolute knowledge.

    Bluefish444 is the only manufacturer I’ve found that makes 12-bit I/O cards.
    And BMD just released the DeckLink 4K Extreme with 10-bit, so we’ll have to wait, I think.
    Maybe AJA, with a new Kona…

    So, to summarize, I just have to build my computer and… wait for a 12-bit I/O card ! ;D
    I’ll continue with my Rec.709 until then…

    To finish, just a question about your monitors.
    I’m really interested in the CM-170S, but you said the XYZ support is beta,
    so can you confirm (even as an approximate percentage, if you prefer !)
    that it will be accurate enough for use in color grading ?
    With a 12-bit I/O card, of course !! 😉

    And out of curiosity, why does your monitor do its video processing in 12-bit but only display in 10-bit ?
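    To make my question concrete, here is a toy sketch (my own made-up adjustments and numbers, nothing to do with your actual pipeline) of why I suspect higher-precision processing matters: chaining adjustments with an intermediate rounded to 10 bits flips more final 10-bit output codes than doing the math at 12 bits and rounding once at the end.

```python
# Toy demo (hypothetical, not any monitor's real pipeline): compare
# final 10-bit output codes after two chained adjustments, with the
# intermediate result rounded to 10 bits vs. 12 bits.

def quantize(v, bits):
    """Round a [0,1] value to the nearest code at the given bit depth."""
    levels = (1 << bits) - 1
    return round(v * levels) / levels

gain = 0.77                        # first adjustment: a gain
lift = lambda v: v ** (1 / 1.2)    # second adjustment: a mild gamma lift

mismatch_10 = mismatch_12 = 0
for code in range(1024):
    x = code / 1023
    ref = quantize(lift(x * gain), 10)                  # full-precision math
    via10 = quantize(lift(quantize(x * gain, 10)), 10)  # 10-bit intermediate
    via12 = quantize(lift(quantize(x * gain, 12)), 10)  # 12-bit intermediate
    mismatch_10 += (via10 != ref)
    mismatch_12 += (via12 != ref)

# The 12-bit intermediate ends up on the wrong final code far less often.
print(mismatch_10, mismatch_12)
```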

  • “I know you are looking for ways ‘around’ using the pro I/O card”
    Yes, previously I asked for your help because I needed to buy a laptop
    (without Thunderbolt), and of course I couldn’t add an I/O card.
    But now I’m trying to build a dedicated computer for color correction.
    So, to be fully honest, the real problem for me isn’t the I/O card
    (which, as you point out, isn’t really expensive), but the display….
    even if I must admit $3295 is really cheap for a quality monitor like the CM-170W ! ; )
    But that’s not really the subject of this debate.

    “If you are color grading in Resolve you are not going to be monitoring an XYZ signal, your output will be RGB.”

    Excuse me for insisting on this, but I carefully (I hope !) read the links you gave,
    and it’s clearly stated that XYZ cannot be used as-is, because XYZ primaries don’t
    physically exist, and the values have to be translated into RGB coordinates for a
    monitor to display them.
    So… if in any grading software all output must end up as RGB,
    and nothing can really be done in XYZ / 12-bit, why does Scratch say it will output in XYZ
    while Resolve can’t, and why can an I/O card like the AJA Kona only output 10-bit over 3G-SDI ??!
    Or are we talking about “another” RGB, and I’m wrong ?
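    To check that I understand the “translated into RGB” part, here is a little sketch of what I think the conversion is (using the standard XYZ-to-linear-sRGB/Rec.709 D65 matrix; a DCI projector would use its own P3 matrix instead, so please correct me if this isn’t what happens):

```python
# Sketch: converting XYZ tristimulus values to linear RGB is just a
# 3x3 matrix multiply. This is the standard XYZ -> linear sRGB/Rec.709
# (D65) matrix; other displays use their own matrix.
XYZ_TO_RGB709 = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def xyz_to_rgb(xyz, m=XYZ_TO_RGB709):
    """Multiply an [X, Y, Z] triple by a 3x3 matrix, giving linear [R, G, B]."""
    return [sum(m[r][c] * xyz[c] for c in range(3)) for r in range(3)]

# D65 white in XYZ should land on (1, 1, 1) in linear RGB:
white = xyz_to_rgb([0.9505, 1.0000, 1.0890])
```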

    OK, so… more prosaically… what card and what software
    can grade in XYZ at 12-bit ? (And then I can start saving…)

    “are you actually going to be grading for digital cinema a majority of the time?”
    That would be nice… but as a director I don’t make a feature every week… ; )
    So, yes, digital cinema only… but unfortunately not a majority of the time.
    Of course, I wouldn’t allow myself to bother everyone with XYZ
    if Rec.709 satisfied me for everyday production. ; )

  • First, thanks a lot, Bram, for correcting all my mistakes.

    And could you clarify one important point…
    are CIE1931 and XYZ the same thing or not ??
    Or is DCI-P3 a sub-colorspace inside the XYZ colorspace,
    which is in turn a sub-colorspace of the “ultimate” CIE1931 ?

    CIE1931 –> XYZ –> DCI-P3 ???

    And if the XYZ colorspace is the summit, what percentage of it does DCI-P3 cover ?
    And would it then be possible to have a “bigger” DCI-P4 coming next ?
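    Trying to answer part of my own question with a little sketch: on the CIE 1931 xy chromaticity diagram every three-primary gamut is a triangle, so at least the triangles can be compared with the shoelace area formula (the coordinates below are the published DCI-P3 and Rec.709 primary chromaticities; comparing against the full spectral locus would need its curve, which I don’t reproduce here):

```python
# Sketch: comparing gamut triangle areas on the CIE 1931 xy diagram
# with the shoelace formula. Published primary chromaticities:
P3   = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # DCI-P3  R, G, B
R709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # Rec.709 R, G, B

def area(tri):
    """Triangle area via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = tri
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = area(P3) / area(R709)
print(round(ratio, 2))  # prints 1.36 — the P3 triangle is about a third larger
```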

    Just a few more questions :

    “To the best of my knowledge the Mac OS still does not support 10 bit display port output”

    Just out of curiosity, once Mac or Windows can output 10-bit, does that mean it will be possible to use only DisplayPort for output ?
    I know the card is important for lots of things, but in this specific case of monitoring a precise colorspace, do you think it could one day be done without all those side roads ?

    But for now, with current monitors, will the signal be better going directly from a DeckLink card into an SDI input,
    or into a DisplayPort (or HDMI) input via a DeckLink card and an HDLink converter ?

    “They have and this was the specific aim of DCI’s use of XYZ color space and signal coding.”
    It’s great to know that obsolescence will be obsolete (as Jim Jannard says every 10 minutes !)…
    but if DCI is the final goal for grading, why do Sony (and others)
    keep inventing new colorspaces like xvYCC ?
    ( https://www.sony.net/SonyInfo/technology/technology/theme/xvycc_01.html )
    I know it’s marketing stuff, but as you can see (figure 4.3), their Sony GxL laser projector can output 97% of the Munsell color system !
    And RED will also release a laser projector (REDRAY) soon…
    so their colorspace will be wider than DCI-P3, right ?

    “When we are talking about three-primary RGB display devices the largest color space triangle you can carve out of CIE1931 is always going to be smaller than the entire CIE1931 color space.”

    Thanks for the clarification ! I didn’t understand how a display could possibly go beyond that !! Ultraviolet television for robots ! 😉

    So… the most (dumb) important question (for me) :
    is DCI-P3 clearly the way to go, and will it stay for the next few decades ?

    And thanks for that great Omnitek link.
    I tried to read the full DCI spec before… but….
    I don’t work for FSI, so… it’s more or less Chinese to me ! 😉
    But I’m trying to learn fast !!

  • I want to clarify that I already know the purpose of an output card like the DeckLink :
    it’s necessary in order to output only the colorspace (Rec.709, Rec.601, DCI-P3…)
    of the footage, without the operating system interfering with its own color environment.

    But isn’t it possible to just use a LUT, or some software trick, to pass only that exact and precise color space through a regular 10-bit DisplayPort, for example ?
    And without using more “useless” cards and hardware ??
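    To show the kind of “software trick” I mean (just a toy sketch of the idea, not a claim that any product works this way) : a per-channel 1D LUT can re-encode, say, a gamma-2.4 signal to a DCI-style gamma-2.6 curve ; a full colorspace emulation would also need a 3x3 gamut matrix applied in linear light.

```python
# Toy sketch of a per-channel 1D LUT (not any product's actual method):
# re-encode 10-bit gamma-2.4 code values onto a DCI-style gamma-2.6 curve.
BITS = 10
LEVELS = (1 << BITS) - 1  # 1023

# Decode gamma 2.4 to linear, re-encode with gamma 2.6, back to codes:
lut = [round(((code / LEVELS) ** 2.4) ** (1 / 2.6) * LEVELS)
       for code in range(LEVELS + 1)]

def apply_lut(code):
    """Remap one 10-bit channel code through the precomputed LUT."""
    return lut[code]
```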

    And about color grading more generally : since the beginning of my career I’ve seen so many new formats and color spaces that the fact the DCI-P3 gamut is “only” an emulation of a Xenon bulb doesn’t make me confident…
    soon we’ll be using xvYCC, or Rec.2020, or I don’t know what else…
    and I’m not George Lucas… I don’t want to fully re-grade my features every 5 years !! 😉

    So, why not try to make one good and definitive colorspace and working environment ?
    If the only colors a human can see are those in the CIE1931 XYZ color space, why not just grade in that environment, and only convert afterwards, like we do when encoding our master feature for DVD, the web, etc. ?
    Now that monitors, with their huge xvYCC (for example), can allow more than CIE1931 if I understood well… so, why not ?

    (I’ve heard about ACES, but I’m not really sure it takes the same approach.)
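    And to convince myself that “grade in XYZ, convert per deliverable” is at least mechanically possible, here is a sketch of the standard colorimetric recipe for deriving an RGB-to-XYZ matrix from published primary and white-point chromaticities — with one such matrix per target (Rec.709, DCI-P3, …), a single XYZ master could in principle be rendered to each deliverable :

```python
# Sketch: derive an RGB -> XYZ matrix from chromaticities (the standard
# colorimetric recipe). One matrix per deliverable lets a single XYZ
# master be converted to each RGB target.

def xy_to_XYZ(x, y):
    """Chromaticity (x, y) to XYZ with Y normalized to 1."""
    return [x / y, 1.0, (1 - x - y) / y]

def inverse3(m):
    """Invert a 3x3 matrix via cofactors."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    return [[(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
            [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
            [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det]]

def rgb_to_xyz_matrix(primaries, white):
    """primaries: [(xR, yR), (xG, yG), (xB, yB)]; white: (xW, yW)."""
    cols = [xy_to_XYZ(x, y) for x, y in primaries]          # unscaled primary XYZ
    P = [[cols[j][i] for j in range(3)] for i in range(3)]  # primaries as columns
    W = xy_to_XYZ(*white)
    Pinv = inverse3(P)
    S = [sum(Pinv[i][j] * W[j] for j in range(3)) for i in range(3)]  # channel scales
    return [[P[i][j] * S[j] for j in range(3)] for i in range(3)]

# Rec.709 primaries with D65 white should reproduce the familiar matrix
# (first coefficient ~0.4124, etc.):
M709 = rgb_to_xyz_matrix([(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)],
                         (0.3127, 0.3290))
```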
