Creative Communities of the World Forums

The peer to peer support community for media production professionals.


  • Turn 480p clips into 720p: is it possible?

    Posted by André Martins on March 1, 2012 at 9:01 pm

    Hello everyone,

    I’ve got a big problem: today I went in for the 2nd day of shooting on a short film I’m making, and I didn’t notice that my Canon 550D’s recording settings were set to 480p instead of 720p. So now I’ve got a whole day’s worth of footage in 480p, whilst the 1st day was shot in 720p, and I intended the final product to be in HD.

    I know (or, at least, can guess) that turning 480p clips into 720p ones is pretty much impossible, in the sense that the 480p ones will never match the quality of the HD ones, but I would like to know if you have any suggestions for making this “conversion” smoother.

    Thank you,
    André Martins

    P.S.: I use Sony Vegas Pro 10.

  • 22 Replies
  • John Bean

    March 1, 2012 at 9:28 pm

    QUALITY really depends on the BIT-RATE.

    For example: Your 480p DVD clips are better quality than YouTube’s 1080p clips because the bit-rate is much higher. Your 480p DVD clips can upscale much better on a 52″ 1920×1080 HDTV than YouTube’s 1080p clips.

    The higher the video’s bit-rate, the higher the resolution your video can upscale to before you start noticing visual artifacts.

    So assuming your 720p clips have a higher bit-rate than your 480p clips, then it is going to be very difficult to upscale your 480p clips to the same *quality* as your 720p clips. Your 480p clips will have noticeable artifacts that do not exist in your 720p clips.

    So you are stuck with your LOWEST COMMON DENOMINATOR => your 480p clips.

    So my suggestion is as follows:

    1. Set up a PROJECT for 480p.
    2. Edit all your clips (both 480p and 720p) as you desire.
    3. Then render out to a 720p video.

    For #3, try these encoding bit-rates (a quick sketch follows at the end of this post):
    1. equal-to your 480p source bit-rates
    2. slightly greater than your 480p bit-rates
    3. significantly greater than your 480p bit-rates but less than your 720p bit-rates (the median between your 480p and 720p bit-rates is a good choice).
    4. equal-to your 720p source bit-rate

    Then do a VISUAL COMPARISON. Pick the version that gives the highest bit-rate without any VISUAL ARTIFACTS.

    Note: if you are just uploading to YouTube, as long as your bit-rate is greater than 6 Mbps, nobody should notice!
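
    If you want to be systematic about it, here is a rough Python sketch of those four test points. The example inputs are the source bit-rates André reports further down the thread, and the 15% bump for “slightly greater” is just an arbitrary pick:

        # Candidate render bit-rates for the visual comparison described above.
        # Example inputs: 22.7 Mb/s (480p) and 45.8 Mb/s (720p); substitute
        # your own source bit-rates.
        br_480p = 22.7  # Mb/s
        br_720p = 45.8  # Mb/s

        candidates = {
            "1. equal to the 480p source": br_480p,
            "2. slightly greater than 480p": br_480p * 1.15,  # arbitrary +15%
            "3. median of 480p and 720p": (br_480p + br_720p) / 2,
            "4. equal to the 720p source": br_720p,
        }

        for label, mbps in candidates.items():
            print(f"{label}: {mbps:.1f} Mb/s")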

  • André Martins

    March 1, 2012 at 9:42 pm

    Thank you for the great answer.

    I’ve checked, and my 720p clips’ bit-rate is around 45.8 Mbps, whilst the 480p’s is 22.7 Mbps :S

    Oh, and there’s another thing that will make the whole process more difficult: I will have to crop the 640×480 videos to match the 1280×720 aspect ratio.

  • John Bean

    March 1, 2012 at 10:10 pm

    The 22.7 Mbps for 480p should be high enough to work with.

    Some Blu-ray 720p movies are encoded with AVC bit-rates as low as 16 to 18 Mbps.

    So your 480p clips are 4:3?

    Hopefully you focused well and left enough distance when filming; otherwise a lot of your cropped shots will look like close-ups.
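
    For what it’s worth, the crop arithmetic works out neatly if you take a straight 16:9 centre crop; a quick sketch:

        # Cropping 4:3 640x480 to 16:9, then upscaling to 1280x720.
        src_w, src_h = 640, 480
        crop_h = src_w * 9 // 16          # 640 * 9/16 = 360 -> crop to 640x360
        lost = (src_h - crop_h) / src_h   # fraction of vertical frame discarded
        scale = 1280 / src_w              # 640x360 -> 1280x720 is an exact 2x
        print(f"crop to {src_w}x{crop_h}, lose {lost:.0%} of the height, "
              f"then upscale {scale:g}x")
        # -> crop to 640x360, lose 25% of the height, then upscale 2x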

  • André Martins

    March 1, 2012 at 10:16 pm

    That’s good news.

    About the 480p clips: yes, they’re more of a square than the 720p ones.

    Fortunately, I did, indeed, focus everything right. But as for the framing, I framed the shots without any intention of cropping them in post, so that’s going to be another problem.

  • Dave Haynie

    March 2, 2012 at 5:10 am

    In short, no, you can’t really upscale a 480p clip to 720p and have it pass for 720p in a general way. A 640×480 clip contains only about a third of the information of the same material at 1280×720.

    But all is not necessarily lost. You will see the most difference between 480p and 720p on material with lots of high-frequency information. The latest intricately rendered robot-and-spaceship film, a football game (either sort), etc. will look increasingly better as you increase the resolution. Some of that’s lost if you overcompress the video, but only so much; modern video compression is extremely good.

    But there’s plenty of material that contains little really important high-frequency information. You might notice an improvement watching the latest rom-com or indie flick (assuming it wasn’t shot in SD) in 720p or 1080p vs. 480p, but it’s probably not profound. Now, intelligently upscale that 480p, and the difference is even smaller.

    The good news is that you have very good quality 480p… the Canon AVC encoding at 480/60p is better for most purposes than DV (it will suffer on fast-motion video, as all interframe compression algorithms do, but less so at 60p). The worst giveaway of an upscaled low-resolution video is visibly upscaled compression artifacts… and you can’t really do much about those.

    The second-worst giveaway of an upscaled low-resolution video is pixelation. If you simply blow up your video, it’ll look chunky sitting next to real 720p video. But as any Blu-ray player can show you today, there are more intelligent upscaling algorithms that deliver video generally accepted as better than the original 480p. Obviously, they can’t create the two thirds of the information that’s missing from a real 720p shot, but they can enhance your 480p in a way that’s comparable to 720p. These upscalers typically combine a smooth upscaler with an edge-enhancement algorithm.
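
    To illustrate the idea only (this is not what BCC UpRez actually does under the hood), here’s a minimal Python/OpenCV sketch of a smooth bicubic upscale plus a simple unsharp-mask edge enhancement, assuming a hypothetical frame_480p.png pulled from the footage:

        import cv2  # OpenCV

        frame = cv2.imread("frame_480p.png")   # hypothetical 640x480 frame
        crop = frame[60:420, :]                # centre 16:9 crop -> 640x360

        # Smooth upscale: bicubic interpolation up to 1280x720 (an exact 2x).
        up = cv2.resize(crop, (1280, 720), interpolation=cv2.INTER_CUBIC)

        # Edge enhancement: unsharp mask (blend the image with an inverted blur).
        blur = cv2.GaussianBlur(up, (0, 0), 2.0)
        sharp = cv2.addWeighted(up, 1.5, blur, -0.5, 0)

        cv2.imwrite("frame_720p.png", sharp)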

    There’s a decent UpRez plug-in in Boris Continuum Complete… and in fact, there’s a video with our own John Rofrano showing it off here:
    https://www.borisfx.com/videos/BCCVegas/Uprez.php.

    -Dave

  • Dave Haynie

    March 2, 2012 at 5:30 am

    That’s not right. He’s got very high quality 480p, but it’s still just 480p. If you upload very good quality 1080/24p to YouTube (I’ve sent them 20 Mb/s AVC in the past), they will re-encode it with the excellent x264 AVC codec to 1080/24p at about 6 Mb/s.

    While this isn’t anything to write home about, it’s actually pretty close to the lowest bit-rate setting on my Panasonic HMC40. AVC at 6 Mb/s is going to suffer terribly on fast motion, but it’s going to look much better than anyone’s 480p when displayed on a good television… not to mention that the 1080p doesn’t need to be upscaled; only the 480p does.

    The bitrate is going to be a very big factor when it comes to the quality of upscaling, which he’s got to do here. When you upscale compressed video, you’re upscaling the compression artifacts, too. The Canon 480p should look better upscaled than regular DV-class video, as long as he’s not shooting overly fast motion video.

    You also can’t judge strictly by bit-rate. The AVC used on YouTube and in most camcorders is full IPB-frame encoded AVC. And the YouTube compressor doesn’t have to compress in realtime on a 3W power budget, so it can actually do a better job at the same bit-rate, particularly at lower bit-rates. Canon’s current camera models use only I and P frames in their AVC compression, hence the 44 Mb/s in full 1080p mode… close to twice that of the typical AVCHD/AVCCAM full-fledged camcorder, and of many other companies’ HDSLRs. Not to knock the quality: I have a 60D myself, and it’s my go-to camera for any low-light shooting, period.

    -Dave

  • John Bean

    March 2, 2012 at 7:26 pm

    Oh NO! By no means did I say or suggest that the BIT-RATE value is the *sole* indicator of quality for a video.

    By the same token, in no way is just knowing the FRAME-SIZE a good indicator of the quality of a video, if we assume LOSSY compression is involved (which is the case 99% of the time).

    Just because a video claims to be 1080p does not mean it is a good quality video that will look great on a 1080p TV or monitor such as a 52″ 1920×1080 HDTV.

    Based on *visual comparison* only, you can verify that a YouTube 1080p video will not upscale as well as a DVD 480p (or i) video at bit-rates equal to or greater than YouTube’s 6 Mb/s for 1080p.

    A DVD video can have a MAX bit-rate of 9.8 Mb/s. Most DVD videos average around 8 Mb/s.

    And if you do the MATH as well, you can verify mathematically why a DVD 480p video is more often than not better quality than a YouTube 1080p video.

    That is why sites like YouTube do not advertise their BIT-RATE values. They only advertise 1080p!

    Similarly, many cameras will also advertise that they do 1080p video capture. But what they don’t advertise is the BIT-RATE! If you buy a 1080p camera but that camera only captures at bit-rates like YouTube’s 6 Mb/s, then your 1080p footage will suck!

    In the context of this thread, what I said was a generality: in general, knowing the FRAME-SIZE and BIT-RATE will give you a *good* indication (estimate) of the quality of the video in comparison to other videos.

    And in the context of this thread, we know the FRAME SIZE, BIT-RATE, SCAN-TYPE, and CODEC. We can then also assume typical values for FRAME RATE (24, 30, 60).

    So based on the context of this thread, on the surface, one can safely conclude with *great accuracy* that André’s 480p @ 23 Mb/s video is also of much HIGHER QUALITY than YouTube’s 720p and 1080p videos.

    By QUALITY, we mean how LOSSY is the video compressed. The closer the compressed video is to representing its ORIGINAL UNCOMPRESSED self, the smaller the LOSSY factor, and hence, the HIGHER the QUALITY.

    To definitively conclude which video is of HIGHER QUALITY you need to know:
    1. Frame size
    2. Frame rate
    3. Bits per pixel
    4. The CODEC type and settings

    To illustrate, let’s compare these two videos:

    Video#1: 1920x1080p (23.976 fps, 8 bpp) @6 Mb/s
    Video#2: 720x480p (23.976 fps, 8 bpp) @8 Mb/s

    First, we will do a LOSSLESS compression analysis.

    For Video#1: 1920x1080p (23.976 fps, 8 bpp) @6 Mb/s

    Bits per frame = pixels per frame * bits per pixel
    = (1920×1080 ppf) * 8 bpp
    = 16588800 bits/frame
    ~= 16588.8 kb/frame

    Bits per second = bits per frame * frames per second
    = 16588800 bpf * 23.976 fps
    = 397733068.8 bps
    ~= 397.7 Mb/s

    =>for a 1920x1080p (23.976 fps, 8 bpp) video, in its uncompressed form, it has a BIT-RATE of 397.7 Mb/s.

    Compression factor = target bit-rate / uncompressed bit-rate
    = 6 Mbps / 397.7 Mbps
    = 6000000 bps / 397733068.8 bps
    = 0.0150854944
    => compression ratio = 1 / 0.0150854944 ~= 66.3 to 1

    =>to achieve a TARGET BIT-RATE of 6 Mb/s with LOSSLESS compression, a video stream of 1920x1080p (23.976 fps, 8 bpp) requires a compression ratio of about 66.3 to 1.

    That is, for LOSSLESS compression, roughly 66.3 bits of uncompressed video information must get encoded into 1 bit to achieve a TARGET BIT-RATE of 6 Mb/s.

    Or stated differently, for LOSSLESS compression, 1 bit of compressed video information must decode to 66.3 bits of uncompressed (original) video information, if the desired TARGET BIT-RATE is 6 Mb/s.

    On a PER FRAME basis, a 1920×1080 video frame compressed by a factor of 0.0150854944 gives:
    => bits per frame * compression factor
    = 16588800 bpf * 0.0150854944
    = 250250.2495 bits/frame
    ~= 250 kb/frame

    =>for LOSSLESS compression, a video stream of 1920x1080p (23.976 fps, 8 bpp) at 6 Mbps has 250 kb/frame of information.

    =>to obtain a TARGET BIT-RATE of 6 Mb/s for LOSSLESS compression, the encoding video codec must compress 16588.8 kb/frame of ORIGINAL UNCOMPRESSED video information into about 250 kb/frame.

    For Video#2: 720x480p (23.976 fps, 8 bpp) @8 Mb/s

    Bits per frame = pixels per frame * bits per pixel
    = (720×480) ppf * 8 bpp
    = 345600 bits/frame
    = 345.6 kb/frame

    Bits per second = bits per frame * frames per second
    = 345600 bpf * 23.976 fps
    = 8286105.6 bits/s
    ~= 8.286 Mb/s

    =>for a 720x480p (23.976 fps, 8 bpp) video, in its uncompressed form, it has a BIT-RATE of 8.286 Mb/s

    Compression factor = target bit-rate / uncompressed bit-rate
    = 8 Mbps / 8.286 Mbps
    = 8000000 bps / 8286105.6 bps
    = 0.965471644
    => compression ratio = 1 / 0.965471644 ~= 1.0358 to 1

    =>to achieve a TARGET BIT-RATE of 8 Mb/s with LOSSLESS compression, a video stream of 720x480p (23.976 fps, 8 bpp) requires a compression ratio of only about 1.0358 to 1.

    That is, for LOSSLESS compression, roughly 1.0358 bits of uncompressed video information must get compressed to 1 bit to achieve a TARGET BIT-RATE of 8 Mb/s.

    Or stated differently, for LOSSLESS compression, 1 bit of compressed video information must decode to 1.0358 bits of uncompressed (original) video information, if the desired TARGET BIT-RATE is 8 Mb/s.

    On a PER FRAME basis, a 720×480 video frame compressed by a factor of 0.965471644 gives:
    => bits per frame * compression factor
    = 345600 bpf * 0.965471644
    = 333667.0 bits/frame
    ~= 333.7 kb/frame

    =>for LOSSLESS compression, a video stream of 720x480p(23.976 fps, 8 bpp) at 8 Mbps has 334 kb/frame of information

    =>to obtain a TARGET BIT-RATE of 8 Mb/s for LOSSLESS compression, the encoding video codec must compress 345.6 kb/frame of ORIGINAL UNCOMPRESSED video information into 334 kb/frame.
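
    The same two calculations in a small Python sketch, if anyone wants to plug in other frame sizes and bit-rates:

        # Per-frame bit budget for a target bit-rate, as computed above.
        def analyze(width, height, fps, bpp, target_mbps):
            bits_per_frame = width * height * bpp          # uncompressed bits/frame
            uncompressed_bps = bits_per_frame * fps        # uncompressed bit-rate
            factor = target_mbps * 1e6 / uncompressed_bps  # fraction of bits kept
            print(f"{width}x{height} @ {target_mbps} Mb/s: "
                  f"uncompressed {uncompressed_bps / 1e6:.1f} Mb/s, "
                  f"ratio {1 / factor:.4g} to 1, "
                  f"budget {bits_per_frame * factor / 1000:.0f} kb/frame")

        analyze(1920, 1080, 23.976, 8, 6)  # Video#1 -> ~66.3 to 1, 250 kb/frame
        analyze(720, 480, 23.976, 8, 8)    # Video#2 -> ~1.036 to 1, 334 kb/frame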

    TO SUMMARIZE:

    To achieve LOSSLESS compression

    Video#1=1920x1080p (23.976 fps, 8 bpp) @6 Mb/s
    => 250 kb/frame of COMPRESSED information must decode back to 16588.8 kb/frame of ORIGINAL UNCOMPRESSED information to achieve LOSSLESS compression

    Video#2=720x480p (23.976 fps, 8 bpp) @8 Mb/s
    => 334 kb/frame of COMPRESSED information must decode back to 345.6 kb/frame of ORIGINAL UNCOMPRESSED information to achieve LOSSLESS compression

    This illustrates how difficult it is for YouTube to achieve LOSSLESS compression. Trying to compress 16588.8 kb/frame of ORIGINAL UNCOMPRESSED information into 250 kb/frame losslessly is impossible! (or nearly impossible)

    In other words, the only way to achieve a TARGET BIT-RATE of 6 Mb/s for a 1920x1080p video, is to compress it to the MAX using a LOSSY compression codec! So a ton of video information is being thrown away!

    If LOSSLESS compression were possible for a 1920x1080p-6Mb/s video, then of course the 1920x1080p-6Mb/s video would be of HIGHER QUALITY than a 720x480p-8Mb/s video. But it is not possible!

    In comparison, for a 720x480p (or i) video at 8 Mb/s, it is much easier to achieve LOSSLESS compression. Less compression is needed.

    Which is of HIGHER QUALITY for LOSSY compression?

    Let’s assume we take an UNCOMPRESSED 1920x1080p video and then scale it down to create an UNCOMPRESSED 720x480p video. So we now have the same video at two different frame sizes.

    Now if we use the same LOSSY codec to compress these two videos, a 720x480p-8Mb/s video will contain more *accurate* information than a 1920x1080p-6Mb/s video.

    This is easily seen by comparing the bits per frame requirement for LOSSLESS compression:
    720x480p-8Mb/s => 334 kb/frame
    1920x1080p-6Mb/s => 250 kb/frame

    =>So using the same LOSSY codec, the 720x480p-8Mb/s video can compress to 334 kb/frame, while the 1920x1080p-6Mb/s video can only compress to 250 kb/frame.

    =>So using the same LOSSY codec, the 720x480p-8Mb/s video is of HIGHER QUALITY than the 1920x1080p-6Mb/s video.

    =>Hence, in this scenario, the 720x480p-8Mb/s video can upscale back to 1920x1080p much better than the 1920x1080p-6Mb/s video.

    =>THIS IS WHAT I MEAN when I say you can *generally* get a good estimate of a video’s quality in comparison to other videos if you only know its FRAME-SIZE and BIT-RATE.

    It is by no means definitive, but most of the time this quick estimate will prove correct.

    BUT in reality, YouTube uses AVC and DVD uses MPEG-2 (both are LOSSY codecs).

    AVC is a better compressor than MPEG-2 in that it preserves more of the same video’s information at the same bit-rate.

    So to get a more *accurate* assessment of QUALITY when comparing 1920x1080p-6Mb/s AVC video to that of a 720x480p-8Mb/s MPEG-2 video, you will need to calculate their EFFECTIVE BITS per FRAME information.

    The EFFECTIVE BITS per FRAME value tells us how many bits per frame, after DECOMPRESSION (decoding), match the ORIGINAL UNCOMPRESSED bits per frame. In other words, how LOSSY the compression is.

    But most of the time, you will not be able to access the ORIGINAL UNCOMPRESSED video to make this analysis.

    Even still, you can safely guess and be right most of the time that a DVD video of 720×480 @8Mb/s is of HIGHER QUALITY than a YouTube video of 1920x1080p @6Mb/s.

    Hopefully, my analysis is sound here. If not, please do correct me! :p

  • Dave Haynie

    March 2, 2012 at 8:55 pm

    [John Bean] “Based on *visual comparison* only, you can verify that a YouTube 1080p video will not upscale as well as a DVD 480p (or i) video at bit-rates equal to or greater than YouTube’s 6 Mb/s for 1080p.”

    Not to be pedantic here, but check your math… a 6 Mb/s YouTube 1080p video doesn’t get upscaled for display on a 1080p monitor; that’s its native resolution. And on most material, it’s going to look better on that 1080p display than anything you get from any 480p video, regardless of quality. This is why most pros shoot in HD today, even if the delivery is going to be SD… a decent HD camcorder will produce better video than the best SD camera that’s ever existed, and at least match it when converted to SD.

    You also need to stop doing that math… compression bit-rates are only meaningful when comparing apples to apples. DV at 25 Mb/s isn’t going to look as good as DVCPRO50 at 50 Mb/s, which in turn isn’t going to look as good as DVCPRO HD at 100 Mb/s. You’re actually varying color decimation as well as bit-rate there, but close enough. That’s apples vs. apples.

    Going to MPEG-2 or AVC, it’s an entirely different story. This is why DVD at 5-9 Mb/s looks as good as, occasionally better than, DV at 25 Mb/s… particularly when professionally mastered (i.e., not left up to camcorder algorithms that have to run in realtime at 3W or less). The bit-rates are simply not comparable at all.

    As for lossless compression… no one’s doing that. It isn’t even necessary; we humans don’t care about much of the information that could be captured. For uncompressed HD at 1080p24, you’re going to need about 1.2 Gb/s… and that’s just for 24-bit color. Even digital cinema cameras aren’t usually recording uncompressed; there’s no reason consumers would care about this. That’s 25x-50x more information than is recorded by most professional camcorders or HDSLRs… and no one’s complaining.
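
    A quick check of that number:

        # Uncompressed 1080p24 with 24-bit colour.
        bps = 1920 * 1080 * 24 * 24     # pixels/frame * bits/pixel * frames/s
        print(f"{bps / 1e9:.2f} Gb/s")  # -> 1.19 Gb/s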

    -Dave

  • John Bean

    March 2, 2012 at 10:05 pm

    I should have used the word *decode* instead of *upscale*.

    “Based on *visual comparison* only, you can verify that a YouTube 1080p video will not *decode* to 1920×1080 as well as a DVD 480p video can *upscale* to 1920×1080, at bit-rates equal to or greater than YouTube’s 6 Mb/s for 1080p.”

    On a 1920×1080 HDTV or monitor, compare a good quality DVD movie to the best quality YouTube 1080p video you can download. It isn’t even close: the DVD movie is better!

    And this is because a 720x480p-8Mb/s video has *potentially* more information per frame to accurately decode to 720×480 and then upscale to a higher resolution like 1920×1080.

    A 1920x1080p-6Mb/s video does not contain 1920×1080 pixels’ worth of information! It contains far less information to decode with, much less than what a 720x480p-8Mb/s video has. The analysis was done in my previous post.

    And of course the video CODEC used matters when comparing video quality! I clearly stated this and made sure to make it clear during my analysis.

    The video codec, and the settings used for it, must be equal or similar in quality in order to better estimate which video is of better quality when all you know besides the codec is the FRAME SIZE and BIT-RATE.

    And in the context of this thread, we are for the most part talking about videos that have similar quality codecs.

    And my analysis of LOSSLESS compression was needed to demonstrate why a 720x480p-8Mb/s video can potentially contain more accurate information to decode with than a 1920x1080p-6Mb/s video, when a LOSSY codec is eventually used.

    All in all, the point I was making to André (the OP) was that a video with a bigger FRAME SIZE is not necessarily going to be better quality than a video with a smaller FRAME SIZE, especially when there is a huge disparity in the BIT-RATE values.

    Assuming the video codecs are equal or similar: in most scenarios, you are better off with a smaller FRAME SIZE video at a HIGH BIT-RATE than with a bigger FRAME SIZE video at a LOW BIT-RATE. In other words, a HIGH BIT-RATE low-resolution video that is UPSCALED to a higher resolution will, more often than not in the standard cases, be better than a video at that same higher resolution with a LOW BIT-RATE.
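
    One compact way to see this rule of thumb is the bit budget per pixel per frame; a rough sketch with the round numbers from this thread (assuming 30 fps and ignoring codec differences):

        # Rough bit budget per pixel per frame (codec differences aside).
        def bits_per_pixel(width, height, fps, mbps):
            return mbps * 1e6 / (width * height * fps)

        print(f"480p  @ 23 Mb/s: {bits_per_pixel(640, 480, 30, 23):.2f} bits/pixel")
        print(f"1080p @  6 Mb/s: {bits_per_pixel(1920, 1080, 30, 6):.3f} bits/pixel")
        # -> roughly 2.50 vs 0.096: the small frame has ~26x more bits per pixel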

    Cheers!

  • John Bean

    March 2, 2012 at 11:32 pm

    It should also be noted that:
    a video’s BIT-RATE includes information about its FRAME-SIZE (resolution).

    A video’s FRAME-SIZE is an integral part of its encoded BIT-RATE value.

    In most scenarios, videos with a higher bit-rate will also have a larger FRAME-SIZE.

    For example, as in Andre’s case:
    480p->23 Mb/s
    720p->46 Mb/s

    So it should be no surprise that a YouTube 1080p-6Mb/s video is of lower quality than a DVD 480p-8Mb/s video.

    Thus, a video’s BIT-RATE value provides us with more information about the quality of a video than the video’s FRAME-SIZE.

