-
Dirk Wellekens
July 22, 2018 at 7:46 am
One final thought: it feels a bit odd that the preferred way to film with a modern camcorder (for this particular case) is based on a method (interlacing) that is obsolete and no longer used in modern playback devices (TVs, computer screens, …). If interlaced mode can be considered such an intelligent/efficient compression mode, why was it abandoned and is no longer an option for playback, while it nevertheless remains one of the options in state-of-the-art camcorders? Any comments about that?
-
John Rofrano
July 22, 2018 at 7:54 pm
[Dirk Wellekens] “it feels a bit odd that the preferred way to film with a modern camcorder (for this particular case) is based on a method (interlacing) that is obsolete and no longer used in modern playback devices (TVs, computer screens, …).”
Most people shoot progressive these days because there is no reason to shoot interlaced now that modern TV sets support progressive display (as long as you own a modern TV and are sure that everyone who wants to watch your video has one too). So I wouldn’t say that interlaced is the preferred way. It’s just very popular because it’s been around a long time and is widely supported. (If it ain’t broke, why fix it?)
As I said, 50i is smoother than 25p even though they are both 25 fps. If you want to shoot progressive and get the same smoothness, shoot 50p.
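The smoothness difference comes down to temporal samples per second, not frames per second: 50i captures 50 half-height fields each second, while 25p captures only 25 full frames. A quick sketch in plain Python (the line counts are PAL numbers used for illustration):

```python
# Temporal sampling per mode: 50i records 50 half-resolution fields/s,
# 25p records 25 full frames/s, 50p records 50 full frames/s.
modes = {
    "25p": {"samples_per_sec": 25, "lines_per_sample": 576},  # full PAL frame
    "50i": {"samples_per_sec": 50, "lines_per_sample": 288},  # half-height field
    "50p": {"samples_per_sec": 50, "lines_per_sample": 576},  # full frame
}

for name, m in modes.items():
    print(f"{name}: {m['samples_per_sec']} temporal samples/s, "
          f"{m['lines_per_sample']} lines each")
```

So 50i and 25p move the same number of lines per second, but 50i spreads them over twice as many moments in time, which is why motion looks smoother; 50p gives you that smoothness at full resolution.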
[Dirk Wellekens] “If interlaced mode can be considered such an intelligent/efficient compression mode, why was it abandoned and is no longer an option”
Interlacing is not compatible with digital computer screens where “computer” includes cell phones, tablets, and just about any display that isn’t a TV. If you are shooting for delivery on YouTube then you should be shooting progressive. If you’re shooting for broadcast TV then interlaced is more widely compatible.
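A progressive display has to deinterlace an interlaced signal before it can show it. A minimal sketch of the simplest method, “weave” (interleaving the scan lines of two fields back into one full frame), with made-up field data for illustration:

```python
def weave(top_field, bottom_field):
    """Weave deinterlace: interleave two half-height fields into one frame.

    top_field supplies the even-numbered scan lines,
    bottom_field the odd-numbered ones.
    """
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # even scan line
        frame.append(bottom_line)  # odd scan line
    return frame

# Toy 4-line frame split into two 2-line fields.
top = ["line0", "line2"]
bottom = ["line1", "line3"]
print(weave(top, bottom))  # ['line0', 'line1', 'line2', 'line3']
```

Weave only works cleanly when both fields were captured at the same instant; with true 50i the fields are 1/50 s apart, so weaving moving subjects produces “combing” artifacts, which is why TVs and players use smarter motion-adaptive deinterlacers instead.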
[Dirk Wellekens] “… while it nevertheless remains one of the options in state-of-the-art camcorders. Any comments about that?”
Two words:
Backward compatibility!
Cameras still support it for the same reason that cable TV providers still carry SD stations even though the same stations are broadcast in HD. There are lots of old TV sets still in use, and the broadcast industry is very proud of backward compatibility. Color TV was 100% compatible with black-and-white TV; no one had to buy new equipment. Compromises were made to ensure backward compatibility, and we are still paying for those trade-offs today. Why did HDV use standard DV tape? Backward compatibility. They had to compromise the bit rate to fit on SD tape, but no one wants to throw out everything they have to move to something new.
I don’t own a 4K TV, I don’t own a 4K camera, and I couldn’t care less about 4K because my family still watches SD channels on our HD TV! (They are reduced to a postage stamp in the middle of the screen because they are both letterboxed and pillarboxed, but that doesn’t seem to bother them ¯\_(ツ)_/¯.) I bought a few Blu-ray discs until I realized that the kids can’t watch them in the car on our DVD player, so I went back to buying DVDs, and it’s really hard to tell the difference between Blu-ray and DVD on anything smaller than a 32″ TV. Now everyone is watching movies on a 6″ phone! Quality no longer matters at that point.
Every new TV must support interlacing, but the reverse is not true: old TVs do not support progressive, there is plenty of interlaced equipment still out there, and camera manufacturers and broadcasters will be supporting it for many years to come. Just like the NTSC frame rate is 29.97 fps instead of 30 fps because of a compromise made back in 1953 to keep color broadcasts compatible with existing black-and-white sets. Outdated compromises that will be with us forever because of the strong desire for backward compatibility.
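For the curious: the exact NTSC rate is 30000/1001 frames per second, a 0.1% slowdown from 30 fps introduced with color so the color subcarrier wouldn’t interfere with the audio carrier on black-and-white sets. The arithmetic, using Python’s standard-library `fractions` module:

```python
from fractions import Fraction

ntsc_frame = Fraction(30000, 1001)   # exact NTSC frame rate
ntsc_field = Fraction(60000, 1001)   # interlaced field rate

print(float(ntsc_frame))             # ≈ 29.97 fps, not 30
print(float(ntsc_field))             # ≈ 59.94 fields/s
print(float(ntsc_frame / 30))        # the 0.999... (1000/1001) slowdown factor
```

That 1000/1001 factor is also why NTSC workflows need drop-frame timecode, yet another compatibility compromise still with us.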
~jr
http://www.johnrofrano.com
http://www.vasstsoftware.com