[Kelly Griffin] “When I got my new Vegas setup I took it as a good time to also change my final output monitoring from my Sony PVM (CRT) monitor to a very nice LG LED LCD monitor; my thinking was that (A) I don’t think CRTs are even SOLD anymore, and (B) I might as well see what people at home see.”
That’s a good argument for previewing on a television. But you may find differences between the television and your monitor anyway, just as you’ll find differences between different televisions. As well, you’re hopefully working on a color-corrected monitor for any serious work, while the vast majority of people with televisions don’t even do a proper setup of the display (at the very least, any new TV should be calibrated using a calibration DVD or similar).
[Kelly Griffin] “Anyway, I get a beautiful picture on my LED LCD monitor, but I’m working on some SD spots using 1920×1080, 24p footage. Some of it looks just fine, other parts, especially a fair amount of motion, looks as if it’s aliased or reverse field order or something. I thought I understood, maybe incorrectly, that LCDs don’t even see fields, so I don’t even know if that’s what I’m seeing.”
LCD devices “see” fields, but they de-interlace them. They have to: an LCD is a progressive device, with no way of actually displaying interlaced video.
But you can get weird artifacts, depending on how you’re hooked to the display. Consider this: a normal computer display is showing a section of computer video memory. That video memory is scanned out at whatever sync rate you have set in your display preferences; with a digital monitor, that rate is usually negotiated with the display. Now you start a video. The video player is dumping decoded video, frame by frame, into your PC’s display buffer… but that update rate doesn’t necessarily have any relation to the outgoing display rate. And at this point, it’s your display program (Windows Media Player, Vegas, etc.) that’s doing the de-interlacing, not the monitor hardware. Even without perfect sync to the display, there are better and worse ways of doing this.
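To make that rate mismatch concrete, here’s a toy sketch (hypothetical, not any player’s actual code) that counts how many display refreshes each video frame gets held for, assuming the player simply presents the most recently decoded frame at each refresh and both rates are whole numbers:

```python
def refreshes_per_frame(video_fps, display_hz, n_frames):
    """Count how many display refreshes each video frame is held for,
    when the display just shows the newest decoded frame each refresh."""
    counts = [0] * n_frames
    refresh = 0
    while True:
        # index of the frame most recently decoded at this refresh time
        frame = refresh * video_fps // display_hz
        if frame >= n_frames:
            break
        counts[frame] += 1
        refresh += 1
    return counts

print(refreshes_per_frame(24, 60, 8))   # [3, 2, 3, 2, 3, 2, 3, 2]
print(refreshes_per_frame(30, 60, 4))   # [2, 2, 2, 2]
```

The 24 fps case shows the familiar uneven 3:2 cadence: some frames sit on screen for three refreshes, others for two, which is one reason motion can look subtly wrong even before de-interlacing enters the picture.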
Some display programs will actually drive your monitor directly from the video, at least within limits. I believe this is what’s supposed to happen when you use “Video Preview on External Monitor” function in Vegas. If this is really doing that, then you’d see 60i or 24p video actually going to your monitor in those modes, rather than 60p or whatever. When you set this up in the video preferences, you can also have Vegas do the de-interlacing for you, depending on the kind of device you’re sending to. I used this in the old days to send video to analog monitors and DV camcorder screens, with good results.
If you have interlaced video input, you are going to see some kind of de-interlacing artifacts on a progressive display. One of the classic examples is “mice teeth,” a pretty obvious interlacing effect you’ll see when there’s a great deal of movement from field to field. You can also get even stranger effects by resizing interlaced clips with an algorithm that’s not interlacing-savvy. Vegas wants to de-interlace before resizing… if you resize an interlaced clip without selecting a de-interlace method in your project properties, you can get some evil video. There’s a pretty good explanation (with images) of interlacing effects here:
https://www.100fps.com/
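To see why a non-interlacing-savvy resize makes such a mess, here’s a toy sketch (hypothetical, one number standing in for each line of video) comparing a naive vertical halving, which averages lines from both fields together, against a field-aware halving that resizes each field separately:

```python
def naive_halve_avg(rows):
    """Average adjacent row pairs -- this blends the two fields,
    which were sampled 1/60 s apart, into one smeared image."""
    return [(rows[i] + rows[i + 1]) / 2 for i in range(0, len(rows), 2)]

def field_aware_halve_avg(rows):
    """Split into fields, halve each field on its own, then re-weave.
    No cross-field blending, so moving edges stay clean."""
    a, b = rows[0::2], rows[1::2]                    # even = field A, odd = field B
    ha = [(a[i] + a[i + 1]) / 2 for i in range(0, len(a), 2)]
    hb = [(b[i] + b[i + 1]) / 2 for i in range(0, len(b), 2)]
    out = []
    for ra, rb in zip(ha, hb):
        out.extend([ra, rb])                         # re-interleave A/B lines
    return out

# A moving edge: field A (time t) sees brightness 0 on its lines,
# field B (t + 1/60 s) sees 100 because the object has moved in.
frame = [0, 100, 0, 100, 0, 100, 0, 100]
print(naive_halve_avg(frame))        # [50.0, 50.0, 50.0, 50.0] -- fields smeared
print(field_aware_halve_avg(frame))  # [0.0, 100.0, 0.0, 100.0] -- fields intact
```

The naive version averages two different moments in time into every output line; that’s the “evil video” you get when the resizer doesn’t know the clip is interlaced.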
De-interlacing algorithms trade off resolution to fix the interlacing artifacts. There are several options in Vegas, and others not in Vegas… working in digital video, I’ve written a few myself over the years.
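As a flavor of that trade-off, here’s a minimal “bob” de-interlacer sketch (not Vegas’s actual algorithm, just an illustration): each field becomes its own output frame, with the missing lines filled in by interpolation, so you keep the full motion rate at the cost of vertical resolution:

```python
def bob_deinterlace(frame, top_field_first=True):
    """Turn one interlaced frame (a list of rows) into two progressive
    frames, one per field, interpolating the lines each field lacks."""
    if top_field_first:
        fields = (frame[0::2], frame[1::2])
    else:
        fields = (frame[1::2], frame[0::2])
    out = []
    for field in fields:
        prog = []
        for i, row in enumerate(field):
            prog.append(row)
            # fill the missing line by averaging with the next field line
            nxt = field[i + 1] if i + 1 < len(field) else row
            prog.append((row + nxt) / 2)
        out.append(prog)
    return out

frame = [10, 20, 30, 40]          # rows of a tiny 4-line interlaced frame
f1, f2 = bob_deinterlace(frame)
print(f1)   # [10, 20.0, 30, 30.0] -- built from the even (top) field
print(f2)   # [20, 30.0, 40, 40.0] -- built from the odd (bottom) field
```

Notice each output frame only ever contains real data from one field; the other half of the lines is guessed, which is exactly the resolution you’re trading away.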
Of course, if you’re trying to keep your video 60i for Blu-ray or DVD, you may not actually want it de-interlaced. Leaving it interlaced is fine for simple manipulations, but once you start resizing, you want to de-interlace.
And of course, such video is going to be de-interlaced by a progressive display anyway: either as a matter of course when it’s written into PC screen memory as a window on your screen, or by the digital television chip in your LCD or DLP display (neither technology can actually display interlaced video). That’s the same chip that upscales 480p or 720p to 1080p on a full HD TV.
-Dave