Thinking about “viewing accurate color, or at least gamma in FCP” post replies
Hello all, I’ve followed the thread re: “viewing accurate color, or at least gamma in FCP”, and it made me wonder about my own workflow and gear, specifically regarding color correction…and the following occurred to me:
Since CRT-based monitors *and TV sets* are quickly being phased out, and digital projectors are appearing in more and more theaters and seem poised to become the standard playback technology, why is it still deemed the “best” workflow to keep correcting on the technology we’ve always used (CRT-based monitors), instead of the technology our clients (and consumers) will actually use to watch our color-corrected media?
IOW, we’re all correcting media files to look good on “old” technology that’s being phased out and (shortly) won’t even be used by our own clients (or, more importantly, consumers).
The word “futile” comes to mind: if we use a specific technology (CRTs) for various reasons, all of them good (e.g., to see maximum shadow/highlight detail and maximum color accuracy), what’s the point when, in the end, clients and consumers won’t even be able to see that level of quality on their new LCD and plasma TV sets?
I know SEDs may end up becoming “THE” solution, but the original poster got me thinking about just how critical it makes sense to be, and about what the poster implied with his question: that getting 85–90% of the way there might just be “good enough”…for now, because that’s the limit of quality viewable on LCD and plasma sets.
Like many of us, I’m a perfectionist, so absolute quality matters, absolutely, to me…and being a curmudgeon, I’ll not soon change my opinion or standards.
But that may be what the poster was really asking, and I find myself asking the same question as I ponder spending another $1200 on a 200-pound CRT-based Sony KD-34XBR970 TV set…for client viewing.
Is anyone else having these thoughts too, or am I alone in trying to save a few bucks?