[Eric Barker] “Can someone give a definitive answer as to what LCD TVs are really displaying to the viewer?
A) old-fashioned upper/lower, on/off combing (like CRTs)
B) 60th-of-a-second refreshing, but with each field remaining up for a 30th? (like I described above)
C) 30th-of-a-second refreshing, but with both fields going through a hardware deinterlace process (so what we’re really seeing is technically 30p).”
It’s actually none of the above.
LCD and plasma displays are inherently progressive display technologies: they refresh the entire screen all at once rather than in alternating fields the way a CRT does.
When given 1080i60 source material, the circuitry in modern plasma and LCD displays performs real-time motion-adaptive deinterlacing, essentially converting 1080i60 into 1080p60 on the fly. Of course it’s not as crisp as true 1080p60 would be, but the fast refresh rate increases the perceived resolution. It’s also why 720p60 is perceived as being as sharp as 1080i60. I’m not sure if it’s still the case, but ESPN actually used to broadcast in 720p60.
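To give a rough idea of what “motion-adaptive” means here, below is a minimal Python/numpy sketch of my own (not any manufacturer’s actual algorithm): static areas are “woven” from the two real fields, while moving areas are “bobbed” by interpolating the missing lines. Real deinterlacing chips use far more sophisticated motion detection and edge-directed interpolation.

```python
import numpy as np

def deinterlace_pair(top_field, bottom_field, prev_top_field, threshold=10):
    """Build one progressive frame from a top/bottom field pair.

    Where the picture is static we "weave" (keep the real bottom-field
    lines); where the top field changed since the previous top field we
    "bob" (interpolate the missing lines from the lines above and below).
    Each field is an HxW array; the output frame is 2H x W.
    """
    top = top_field.astype(np.float32)
    bottom = bottom_field.astype(np.float32)
    prev_top = prev_top_field.astype(np.float32)

    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=np.float32)
    frame[0::2] = top                       # real top-field lines

    motion = np.abs(top - prev_top)         # crude per-pixel motion measure
    below = np.vstack([top[1:], top[-1:]])  # top-field line below each line
    bob = (top + below) / 2.0               # interpolated replacement lines

    # Weave where static, bob where moving.
    frame[1::2] = np.where(motion > threshold, bob, bottom)
    return frame
```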
You can see a similar effect in practice when watching a DVD on a computer. Most DVD playback software (and definitely Apple’s DVD Player software) performs real-time motion-adaptive deinterlacing on interlaced SD footage, so what you end up seeing on your computer screen has the same temporal resolution as watching interlaced SD on a CRT. The software is converting 29.97fps interlaced into 60p on the fly. That’s why DVD footage that originated as either NTSC 29.97fps interlaced (60 fields per second) or PAL 25fps interlaced (50 fields per second) retains the “live video” look when played back on a computer screen rather than looking like 30p or 25p. An exception is the web: it’s extremely rare to see any online video with a frame rate higher than 30p, although it’s technically possible to stream 60p.
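The temporal math is simple enough to sketch (just arithmetic, nothing specific to any particular player):

```python
def deinterlaced_rate(interlaced_frame_rate):
    """Each interlaced frame carries two fields; making one progressive
    frame per field doubles the temporal rate."""
    return interlaced_frame_rate * 2

print(deinterlaced_rate(29.97))  # NTSC DVD -> 59.94 progressive frames/sec
print(deinterlaced_rate(25))     # PAL DVD  -> 50 progressive frames/sec
```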
As far as 24p in the broadcast TV world goes, 1080i60 will contain 3:2 pulldown with interlacing (exactly like NTSC SD), and 720p60 will contain 3:2 pulldown using duplicate whole progressive frames instead of frames split into interlaced fields. As far as what ends up being displayed on an LCD or plasma from a broadcast signal goes, it will either be 1080p60 (synthesized from 1080i60) or 720p60, each containing a pulldown cadence of duplicated frames. Using letters for sequential source frames, the cadence looks like this:
AA BBB CC DDD (repeat)
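If it helps, here’s a tiny sketch (purely illustrative) that generates that cadence by repeating 24p frames in the alternating 2/3 pattern:

```python
def pulldown_cadence(frames, pattern=(2, 3)):
    """Repeat each source frame per the 3:2 (really 2:3) pulldown pattern.

    For 24p -> 60p, frames alternately get 2 and 3 repeats, so every
    4 source frames become 10 output frames (24 * 10/4 = 60).
    """
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * pattern[i % len(pattern)])
    return out

print("".join(pulldown_cadence("ABCD")))  # AABBBCCDDD
```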
Where things get more complicated is if you’re playing back true 24p source material from a DVD or Blu-ray player connected to an LCD or plasma. In that case, the footage can be displayed at its native progressive frame rate without pulldown being added. The refresh rate of the screen will vary depending on the model of TV. It might be refreshed at 24Hz, 48Hz, 72Hz or higher, but the frame rate doesn’t change – you’re just seeing the same frame refreshed at a multiple of the base frequency.
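The arithmetic behind those refresh rates is just integer multiples of 24 (a quick sketch, not tied to any particular TV):

```python
def refreshes_per_frame(refresh_hz, frame_rate=24):
    """How many consecutive screen refreshes show the same 24p frame.

    This only works cleanly when the refresh rate is an integer multiple
    of the frame rate, which is the whole point of 48/72/120 Hz modes.
    """
    if refresh_hz % frame_rate:
        raise ValueError("refresh rate is not a multiple of the frame rate")
    return refresh_hz // frame_rate

for hz in (24, 48, 72, 120):
    print(f"{hz} Hz: each 24p frame is shown {refreshes_per_frame(hz)} times")
```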
An exception to this is what is, in my opinion, the completely awful “smooth motion” 120Hz processing in many modern LCD and plasma displays, which synthesizes new interpolated frames between existing frames, thereby giving everything the motion quality of live video. It’s the first thing I turn off in the menu settings of any TV I come into contact with. You probably won’t be surprised to learn that I absolutely hate the look of the 48fps HFR versions of parts 1 and 2 of The Hobbit. The difference in temporal resolution results in a very different psychological experience for me, which I liken to watching an episode of a soap opera versus experiencing a cinematic masterpiece. For me, higher frame rates have the look of something that is happening, whereas 24p looks like something that happened. I find the 24p look to be much more complementary to dramatic material, as it helps provide a sense of detachment from reality.
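For what it’s worth, the simplest conceptual stand-in for that frame synthesis is a cross-fade between neighbouring frames. Real “smooth motion” processing uses motion-compensated interpolation, which is far more elaborate, but the effect on the cadence is the same: 24 distinct frames per second become 120.

```python
import numpy as np

def interpolate_frames(frames, factor=5):
    """Illustrative only: synthesize in-between frames by blending
    neighbouring frames (given as numpy arrays), so 24 fps becomes 120 fps
    when factor=5. Real TVs use motion-compensated interpolation,
    not a simple blend."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        for step in range(factor):
            t = step / factor
            out.append((1 - t) * a + t * b)
    out.append(frames[-1])
    return out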
Hope this helps.