[John Rofrano] “Our Samsung HDTV looks NOTHING like our Sony HDTV (not even close!). That’s the ‘state of the art’ when it comes to TV. You’re not going to be able to fix that.
What you should do is buy a hardware calibrator like a Spyder4Elite, calibrate your monitors, and show your client that it looks correct on your monitor.”
Ah ha! My original problem was that when I rendered a file it would increase the contrast and I would lose detail in the dark areas — OR SO I THOUGHT. In Vegas 13 everything looked fine in the preview window and on the second preview monitor (both Dell U2414H), but once rendered and played in Windows 10 “Films & TV” or QuickTime Player it appeared darker. I then applied the suggestion from [Thomas Felton] — “The best thing is to apply the Levels filter and select Computer RGB to Studio RGB” — and I thought I had cracked it, but quickly realised I hadn’t.
When I took an original untouched file straight out of the camera and the same file clean-rendered in Vegas, and played both on our calibrated 4K Samsung TV, they looked exactly the same. The file rendered with the Levels filter, however, now looked too bright.
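For anyone curious what the Computer RGB to Studio RGB Levels preset actually does, it is just a linear remap of the full 0–255 range into the 16–235 “legal” video range (and the reverse expands it back). A minimal sketch in Python, assuming 8-bit values — function names are my own, not Vegas’s:

```python
def computer_to_studio(v: int) -> int:
    """Squeeze full-range (0-255) into studio/legal range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

def studio_to_computer(v: int) -> int:
    """Expand studio range (16-235) back to full range, clamped to 0-255."""
    return min(255, max(0, round((v - 16) * 255 / (235 - 16))))

# Black and white points after squeezing into studio range:
print(computer_to_studio(0), computer_to_studio(255))   # 16 235
# Expanding restores the original endpoints:
print(studio_to_computer(16), studio_to_computer(235))  # 0 255
```

If the player or display already expands studio levels on playback, applying this filter on top gives a double adjustment, which could explain why the Levels-filtered render looked too bright.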
So the red herring for me was this: when I played an untouched file (dragged and dropped straight from the camera) in Windows 10 “Films & TV” or QuickTime Player, it played PERFECTLY, exactly as it did in Vegas. BUT once I had rendered the file, it looked darker in those same players, leading me to think the problem was in the render.
I confirmed this wasn’t the case by bringing the rendered file onto the timeline in Vegas next to the original; they played exactly the same in the preview monitor.
I now realise that if I want to check the final render I have to do it within Vegas (or on my calibrated TV), not in the Windows players I have.
I can sleep now, but I am left scratching my head as to why those players play the original file perfectly????
Vince