Video Compression vs RAW video data
This is a general question about digital video. I'm posting here because I use Vegas, but I'm interested in hearing from some hacks/pros on this topic. I have been dipping my toes in the water of video editing and have some hands-on experience; however, I don't feel like I have a real handle on the standards of digital video.
From what I understand (please correct me if I'm wrong), virtually all video is compressed to some degree. Using completely uncompressed video is impractical because most computers cannot handle that amount of information very well, and it seems unnecessary given our current displays/monitors.
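To put a rough number on that claim, here's a back-of-the-envelope sketch of the raw data rate of uncompressed video. It assumes 24 bits per pixel (8-bit RGB) and an NTSC-DVD-sized frame; real capture formats often use chroma subsampling (4:2:2 or 4:2:0), which would lower these figures somewhat.

```python
def uncompressed_bytes_per_second(width, height, fps, bits_per_pixel=24):
    """Raw data rate of uncompressed video, in bytes per second.

    Assumes every pixel is stored at bits_per_pixel with no
    compression and no chroma subsampling.
    """
    return width * height * fps * bits_per_pixel / 8

# NTSC-DVD-sized frame: 720x480 pixels at 29.97 frames per second
rate = uncompressed_bytes_per_second(720, 480, 29.97)
print(f"{rate / 1e6:.1f} MB/s, {rate * 3600 / 1e9:.0f} GB per hour")
```

That works out to roughly 31 MB/s, or on the order of a hundred gigabytes per hour, which is why editing fully uncompressed video was so punishing on consumer hardware.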
However, having the least compression gives you the most flexibility in exporting a final product. (Also, Vegas does not seem to be able to edit some compressed formats, such as DivX, some AVI, etc.)
So, getting to my question: it has to do with cutting through the marketing propaganda that tends to promote confusion rather than understanding.
Is there a set of variables you can hang your hat on (in other words, that all codecs can be described in terms of), such as frame dimensions, frame rate, and bitrate, to ACCURATELY & OBJECTIVELY describe any type of video/format?
Is there a general ideal amount of compression when digitizing/editing from an analog source (let's say with DVD being the final product)? I know this can vary depending on the type of motion and image complexity, but is there a general standard that will definitely work for all types of video content? I'm guessing everyone will have their own experiential rule o' thumb.
Are there any specific things to watch for, where information can be misleading? I know this is a HUGE topic, but I am hoping to get some clarification on a subject I don't think I've fully been able to wrap my head around.
Thanks in advance!