Creative Communities of the World Forums

The peer to peer support community for media production professionals.


  • Video Viewing Challenge/Puzzle

    Posted by Dave Haynie on March 6, 2012 at 8:57 am

    In sparring with Mr. Bean here, he seems to have thrown down a gauntlet, which I thought it might be fun to pick up. We had various claims about video compression versus resolution versus visual quality. He doesn’t seem to understand video compression at all; I’ve written video compression CODECs (ok, sure, not since the early 90s… though I actually do have another algorithm in mind).

    But, as they say, seeing is believing. Or at least in this case. And again, this might be a little fun, because I don’t know what the results will be… or if anyone’s even ambitious enough to play.

    To play, simple enough… go here: https://www.frogpondmedia/enctest. Here you will find seven seemingly similar AVC/MP4 files. However, all is not as it seems. Each is encoded in AVC at 16Mb/s, basically the Sony AVC Blu-ray profile. But each one was originally a different resolution/bitrate combination.

    It’s all the same video; a shortened version of the Sony GPU Test render. At least one of these was a DVD render; at least one was a typical YouTube render. The task: view them, rate them 1..7, where 1 is top quality, 7 is unwatchable drek. Extra credit if you can identify which is which. Obviously, anything that wasn’t full HD was upscaled… more details at the site.

    The uploads are still going (3:54AM EST)… probably good by daytime.

    -Dave

  • 14 Replies
  • Steve Rhoden

    March 6, 2012 at 10:35 am

    lol… Thrown down a gauntlet… That’s a good one, particularly
    funny because Mr. Bean here seems to not understand a lot of
    things.
    I’m busy editing, so I’ll leave you guys to battle this one out.

    Steve Rhoden
    (Cow Leader)
    Film Editor & Compositor.
    Filmex Creative Media.
    1-876-832-4956

  • John Bean

    March 6, 2012 at 5:51 pm

    It seems somebody doesn’t know how to build a website? :p

    Link is dead: https://www.frogpondmedia/enctest

    Simple math really.

    An uncompressed 1080p video at 30 fps runs at about 1.5 Gb/s. So how do you go from 1.5 Gb/s to 6 Mb/s without losing a lot of information, thus causing a ton of compression artifacts?!

    So which is better?

    Upscaling a 480p video at a high bitrate to 1080p, or, decoding a 1080p from a very low bit-rate like 6 Mb/s?

    Upscaling algorithms vs compression algorithms?

    I know I would rather watch the DVD version of Transformers than watch a 1080p-6 Mb/s stream of it.
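    That raw-rate arithmetic can be checked in a couple of lines of Python (assuming 24-bit color, the usual figure for uncompressed RGB video):

```python
# Raw (uncompressed) bitrate of 1080p30 video at 24 bits per pixel,
# and the compression ratio needed to squeeze it into a 6 Mb/s stream.
width, height, bits_per_pixel, fps = 1920, 1080, 24, 30

raw_bps = width * height * bits_per_pixel * fps  # bits per second, uncompressed
target_bps = 6_000_000                           # a 6 Mb/s delivery stream

print(f"raw: {raw_bps / 1e9:.2f} Gb/s")                    # 1.49 Gb/s
print(f"compression ratio: {raw_bps / target_bps:.0f}:1")  # 249:1
```

    So the encoder must throw away roughly 249 of every 250 bits; the whole argument is about how visible that loss is.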

    Cheers!

  • Dave Haynie

    March 6, 2012 at 8:31 pm
  • John Bean

    March 6, 2012 at 11:20 pm

    Dave,

    I appreciate your efforts. That is the only way to learn.

    However, all it took was one download to see that the test case file you used is skewed heavily towards the best case scenario for any codec – be it AVC or MPEG-2.

    Your test case=> solid colors, smooth background, smooth objects (a car), very few objects, slow motion

    So of course, the results will look good even at low bitrates.

    But most videos are not car commercials (or a commercial for Vegas Pro using cars). They are of people interacting with each other in various environments (outdoors and indoors). So the complexity of the frames will be greater than in your test case file.

    Here are some simpler tests.

    1. Get a DVD of your favorite movie. Something that is visually awesome. Say Transformers.

    Then compare it to a 1080p stream of it that you may find online. To make the test fair, make sure the stream has a bitrate of around 6 Mb/s. The max bitrate offered by YouTube.

    Even simpler: just compare high-quality TRAILERS found on DVDs to their YouTube 1080p counterparts. Again make sure the bitrates for the DVD trailers are high (7-9 Mb/s). YouTube maxes at 6 Mb/s.

    2. Or, ask yourself, would you be willing to shoot your next video project with one of those $200 consumer camcorders that are 1080p with low-bitrates of 4-6Mb/s. I mean, it’s 1920x1080p, right?!

    You wouldn’t, would you? Why?

    Cheers!

  • Dave Haynie

    March 7, 2012 at 12:04 am

    [John Bean] “I appreciate your efforts. That is the only way to learn.”

    I have been learning for decades. You ought to try it sometime.

    [John Bean] “So of course, the results will look good even at low bitrates.”

    There’s plenty of motion in there. Others have already submitted answers, and they’re pretty much getting it “right”. If you’re planning to bail, it’s clearly because you can’t actually judge video yourself, you just like to play with numbers.

    So I’m waiting. Put up or shut up. There are very clear differences between these. Here’s another hint… the lowest bitrate represented here is 1/8th of the highest bitrate. If you can’t tell those apart, you’re nothing but a poser.

    -Dave

  • John Bean

    March 7, 2012 at 1:11 am

    Dave,

    There is no need to get angry here.

    Now how is it bailing when your test cases are skewed to best case scenarios?

    And from the other thread, I was the first one to propose a *challenge* to you to go do a visual comparison of DVD movies and their 1080p 6 Mb/s online streaming counterparts. Did you do that test? Nope. Did I accuse you of *bailing*? Nope.

    Plus, why am I going to waste my internet bandwidth downloading all of your test cases when it’s clear you skewed them towards a best case scenario?

    Do a real test like I suggested. Here they are again:

    1. Get a DVD of your favorite movie. Something that is visually awesome. Say Transformers.

    Then compare it to a 1080p stream of it that you may find online. To make the test fair, make sure the stream has a bitrate of around 6 Mb/s. The max bitrate offered by YouTube. And DVD movies are around 8 Mb/s.

    Even simpler: just compare high-quality TRAILERS found on DVDs to their YouTube 1080p counterparts. Again make sure the bitrates for the DVD trailers are high (7-9 Mb/s). YouTube maxes at 6 Mb/s.

    2. Or, ask yourself, would you be willing to shoot your next video project with one of those $200 consumer camcorders that are 1080p with low-bitrates of 4-6Mb/s. I mean, it’s 1920x1080p, right?!

    You wouldn’t, would you? Why?

  • Dave Haynie

    March 7, 2012 at 5:15 am

    [John Bean]
    Now how is it bailing when your test cases are skewed to best case scenarios?

    This is HARDLY a best-case video. Nor is it a worst-case video. It’s one that I could produce when the whim took me, that met all my criteria for a professionally done comparison test. This means a legally available video, with a very clean original in better-than-Blu-ray format. It also helped that much of the group also has this video downloaded… many of us were running the Sony benchmark last fall. For a proper test, the original must be available for comparisons, and I had no ability to upload such a file in the time available.

    In fact, I thought the “best” of the bunch was kind of a gimme. Last I checked, every response I got so far picked that one correctly.

    [John Bean] “Plus, why I am going to waste my internet bandwidth downloading all of your test cases when its clear you skewed them towards a best case scenario?”

    Your downloads are so metered that a few hundred MB is a problem? That can only be true if you’re on satellite… or dialup.

    Why? Perhaps to be seen as a stand-up guy, rather than a troll? Or… as you previously suggested… maybe to actually learn something for a change, rather than just spew incorrect information.

    [John Bean] “Do a real test like I suggested. Here they are again:”

    This, again, is a completely real test. It’s done properly. If you prefer a different comparison test, feel free. But I strongly recommend you don’t steal other folks’ video and post or link it here. I know many of us have strong feelings against piracy in any form… as a published author in five media formats, I’m very much at the top of that list.

    The second reason — the test is only a test of the encoding and upscaling methods if it uses the same source. A DVD is differently mastered, and will skew the result — and not always in favor of the DVD, either. DVD video rates require lots more motion blurring to avoid macroblock noise than the equivalent Blu-ray video. Plus, since I included a Blu-ray-class video in the comparison, I can’t use Blu-ray quality as my source video. I must have something of higher quality. Otherwise, I’m imposing the Blu-ray compression artifacts on all of those lower resolution versions.

    Third reason — your test doesn’t actually test what I’m after. Yes, if you select a very busy action scene, even from some piece of drek like a Michael Bay film, you may get different results. The big problem with that is that, again, you need to start with clean video, which you aren’t going to get from either Blu-ray or DVD, even if you didn’t mind breaking the law in the process.

    Plus, that doesn’t actually advance this argument. I have already told you that I believe that an upscaled DVD can, in some circumstances, look very good… perhaps better than lower bitrate HD. I have no reason to prove that, and given that you have previously stated that you believe this always to be the case (you seem to be changing your tune here, but you did, twice, claim it was mathematically impossible, even though you couldn’t do the math), you should jump at the chance to view my samples and prove that you’re correct.

    But you also misunderstand the test. You did inspire it, but I wouldn’t have wasted any time on that, even just sitting here trying to get Windows 8 to install on a VM (working now). There is one sample that, yes, is certainly the right answer. The rest of them… I really don’t know. There is not necessarily a second right answer. Maybe a few wrong answers, but this is video, and the goal is not accuracy or anything else, only what looks good to the viewer. My opinion, your opinion, everyone who’s taken the test — all valid. I’m interested in knowing which way things skew, because I have been digitally encoding video for over 20 years, and I find this an interesting thing to explore. This is NOT a trap, unless you make it one for yourself.

    [John Bean] “2. Or, ask yourself, would you be willing to shoot your next video project with one of those $200 consumer camcorders that are 1080p with low-bitrates of 4-6Mb/s. I mean, it’s 1920x1080p, right?!”

    I don’t actually know of any consumer 1080p camcorder with a limit of 4-6Mb/s… even the “Flips” that did 1080p recorded at 10Mb/s or so, and those are toys for children, not camcorders. I can think of several scenarios where I might run my pro AVC camcorder or either of my high-end consumer satellite cameras as low as 6Mb/s. Even the original HD Hero Pro ($199) does 1080p at 15Mb/s… and LOTS of people use these for professional purposes (a friend of mine has some video shot on several of these debuting on Discovery Channel’s “Shark Week” this summer… of sharks, of course).

    I wouldn’t choose a toy camcorder for a variety of reasons. While it might technically support 1080p recording, if this is the “Flip” style toy camera/camcorder you’re alluding to (Flip itself is defunct… Cisco dropped the brand last year), these have lens apertures too small for the sensor, so they get all kinds of diffraction blurring, even before you start to record. They don’t have enough computation power to do good AVC encoding, so in all likelihood their 6Mb/s is nothing close to the quality of my camcorder’s 6Mb/s.

    And that’s in turn not as good as the 6Mb/s I can encode in a PC using one of the better encoders around, 2-pass mode, all the time in the world for encoding. That won’t compare to my better camcorders at 24Mb/s or 28Mb/s, or my HDSLR at 44Mb/s. But it will look better than the 20-24Mb/s AVC produced by even high end consumer camcorders from, say, 2006-2007 vintage or earlier. AVC is a complex algorithm, and the encoder on a camcorder has to be able to encode in realtime at a power cost of probably about 2-4W or less. That’s why it wasn’t until 2009 or so that AVCHD camcorders exceeded the quality of HDV, even given the technically much better coding efficiency of AVC. To get a 2x gain in coding efficiency, you may need a 10x or 100x gain in computing power.

    My cell phone also supports 1080p recording (stock at 10Mb/s, hacked up to 45Mb/s last I checked), and I probably wouldn’t shoot a film on it. That hasn’t stopped some people, however… Google for Hooman Khalili and Pat Gilles’ film “Olive”. You’ll find that a recent box office hit, “Act of Valor”, was shot on Canon 5Ds, which don’t even support the full range of the AVC spec (perfectly good AVC for sure, but they don’t have the processor performance to encode B frames, which is one reason they need to record at 44Mb/s… their 44Mb/s is not close to twice as good as the 22Mb/s-24Mb/s you’ll find on many “AVCHD” labelled camcorders. Good, yes, but not as good as it might sound from the bitrate numbers).

    -Dave

  • Nigel O’neill

    March 7, 2012 at 1:36 pm

    Dave/Steve

    Anyone who posts on this forum using an obviously pirated image for a profile picture surely cannot be taken seriously. Much of the advice provided by John is either so far off topic or so irrelevant that I find it becoming quite tiring to read through some of the posts that take off on tangents and lead you on a wild goose chase.

    Whilst he is trying to be helpful, some of the advice provided actually is too generalised and as such has little to do with the issue at hand.

    My system specs: Intel i7 970, 12GB RAM, ASUS P6T, Vegas Pro 10e (x32/x64), Windows 7 x64 Ultimate, Vegas Production Assistant 1.0, VASST Ultimate S Pro 4.1, Neat Video Pro 2.6

  • John Bean

    March 8, 2012 at 9:01 pm

    Dave,

    I have not changed my position at all. If there is any confusion, it is because there are two kinds of compressed 480p videos we are talking about here: 480p AVC and 480p MPEG-2. And depending on which one we are talking about, the differences in quality from low-bitrate 1080p will vary.

    Similarly, in our discussion, there are two 1080p-AVC videos: YouTube 1080p-AVC which varies between 3-6 Mb/s and a straight 1080p-AVC video at exactly 6 Mb/s.

    I’ve also already pointed out to you in the other thread your claims of 1080p superiority (including YouTube 1080p) even at very low bitrates like 6 Mb/s over 480p videos at higher bitrates, regardless of whether it is 480p AVC or 480p MPEG-2.

    Again, I appreciate the fact that you went to all that trouble creating your experiment for us. I will take your word that you did not purposely skew your test case towards the best case scenario. But remember, I never asked you to create your experiment.

    All I asked was to use basic common sense: like MATHEMATICS. The BIT-RATE setting is a huge factor to video quality.

    It seems like you are denying this basic fact. Your argument seems to be based on some belief that the BIT-RATE has no effects on quality.

    So let’s use some basic math again.

    FS = frame size = WIDTH * HEIGHT = W * H
    CD = color-depth
    CF = compression factor
    FPS = frames per second
    BPP = average bits per pixel = CD / CF
    BR = bit-rate
    Q = video quality = signal-to-noise-ratio = PSNR

    I will use ‘<~’ to mean the PROPORTIONAL sign from mathematics.

    It should be obvious that: Q <~ 1 / CF

    INCREASE the CF value, then you decrease Q
    DECREASE the CF value, then you increase Q
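    As an aside, the PSNR being used here as Q is conventionally computed from the mean squared error between a reference frame and the degraded frame. A minimal NumPy sketch (function and variable names are mine, purely illustrative):

```python
import numpy as np

def psnr(reference: np.ndarray, degraded: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two same-shaped frames."""
    mse = np.mean((reference.astype(np.float64) - degraded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames: no noise at all
    return 10.0 * np.log10(peak ** 2 / mse)

# Toy example: a flat 720x480 gray frame vs. the same frame with mild +/-5 noise.
rng = np.random.default_rng(0)
ref = np.full((480, 720), 128, dtype=np.uint8)
noisy = np.clip(ref.astype(np.int16) + rng.integers(-5, 6, ref.shape), 0, 255).astype(np.uint8)
print(f"{psnr(ref, noisy):.1f} dB")  # roughly 38 dB for this noise level
```

    In real codec comparisons this is computed per frame against the uncompressed source and averaged, which is why a clean original matters for any test.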

    So what we then have is this:

    BR = FS * BPP * FPS
    BR = W * H * ( CD / CF ) * FPS

    BR <~ 1 / CF

    Hence, Q is directly proportional to BR:
    Q <~ BR

    =>INCREASE the BR value, then it INCREASES the Q value
    =>DECREASE the BR value, then it DECREASES the Q value

    Let’s compare CF for 480p AVC/MPEG-2 to 1080p-6 Mb/s AVC, using FPS of 24 and CD of 24.

    CF = ( (W*H) * CD * FPS ) / BR

    480p-6Mbs AVC:
    CF = ( (720*480) * 24 bpp * 24 fps ) / 6 Mb/s
    = 33.1776

    480p-8Mbs MPEG-2:
    CF = ( (720*480) * 24 bpp * 24 fps ) / 8 Mb/s
    = 24.8832

    1080p-6Mbs AVC:
    CF = ( (1920*1080) * 24 bpp * 24 fps ) / 6 Mb/s
    = 199.0656
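    Those three CF figures can be reproduced mechanically; a quick Python sketch of the formula above (names mine):

```python
def compression_factor(width, height, color_depth, fps, bitrate_bps):
    """CF = raw bits per second / delivered bits per second."""
    return (width * height * color_depth * fps) / bitrate_bps

print(compression_factor(720, 480, 24, 24, 6e6))    # 33.1776   (480p AVC at 6 Mb/s)
print(compression_factor(720, 480, 24, 24, 8e6))    # 24.8832   (480p MPEG-2 at 8 Mb/s)
print(compression_factor(1920, 1080, 24, 24, 6e6))  # 199.0656  (1080p AVC at 6 Mb/s)
```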

    Now for average test cases, most experiments put AVC at about a 5-to-3 *efficiency improvement* over MPEG-2. This means AVC needs only about 3/5 of the bits MPEG-2 needs to achieve the same PSNR quality result.

    So let’s multiply the 480p-8Mbs MPEG-2 CF value by 5/3:

    CF = 24.8832 * (5/3)
    = 41.472

    So in terms of Q (PSNR) so far, the ranking we have is:

    480p-AVC > 480p-MPEG-2 > 1080p-AVC

    At the given bitrates, a 480p-AVC and 480p-MPEG-2 will have much less compression artifacts than a 1080p-AVC at 6 Mb/s.

    So the debate now is this:
    if we UPSCALE these 480p videos to 1080p, how much will UPSCALING affect Q?

    Define UF=upscale factor. Then:

    UF = (TARGET FRAME SIZE) / (INITIAL FRAME SIZE)

    It should be self-evident that:
    Q <~ 1 / UF

    INCREASE UF, then Q DECREASES
    DECREASE UF, then Q INCREASES

    Then, we get:
    Q <~ 1 / (CF * UF)

    480p UPSCALED to 1080p:
    UF = (1920*1080) / (720*480)
    UF = 6.0

    Then,

    480p-6Mbs AVC:
    CF * UF = 33.1776 * 6.0 = 199.07

    480p-8Mbs MPEG-2:
    CF * UF = 41.472 * 6.0 = 248.83

    So in terms of Q, we get:

    480p-6Mbs AVC upscaled to 1080p:
    Q <~ 1 / 199

    480p-8Mbs MPEG-2 upscaled to 1080p:
    Q <~ 1 / 249

    1080p-6Mbs (unscaled):
    Q <~ 1 / 199

    Of course, bear in mind that we are talking about proportional relationships and not exact values for Q.

    Depending on the UPSCALING algorithm, these Q factors for the upscaled 480p videos may be higher or lower. I openly acknowledge that there may be a difference here, since I do not have any data for comparing how the best UPSCALING ALGORITHMS will affect Q. If you do, please share.

    If the Q factors I have calculated here hold true, then to the human visual system, the differences between Q factors of 1/199, 1/249, and 1/199 are not going to be that great. So to a human, you will not see much of a difference in terms of visual quality.

    A 1080p video degraded to a low bitrate of 6 Mb/s has already dug itself a deep hole, so a 480p video at a high bit-rate can in most cases be better in quality – if not better, then equal. Especially in the cases where we are using the same video compression codec.

    Downgrading a 1080p video to a low bitrate like 6 Mb/s is such a huge compression factor that, as shown by these Q factor calculations, it is almost like just down-scaling the video to 480p instead!
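    As a sanity check, the whole chain can be recomputed from the definitions in a few lines of Python (names mine; note that (1920*1080)/(720*480) works out to exactly 6.0):

```python
def cf(width, height, color_depth, fps, bitrate_bps):
    """Compression factor: raw bits per second over delivered bits per second."""
    return (width * height * color_depth * fps) / bitrate_bps

# Upscale factor from 720x480 to 1920x1080.
uf = (1920 * 1080) / (720 * 480)  # exactly 6.0

# Effective "quality denominators" CF * UF (larger => lower predicted Q).
avc_480   = cf(720, 480, 24, 24, 6e6) * uf            # upscaled 480p AVC at 6 Mb/s
mpeg2_480 = cf(720, 480, 24, 24, 8e6) * (5 / 3) * uf  # upscaled 480p MPEG-2 at 8 Mb/s,
                                                      # with the 5/3 AVC-equivalence penalty
avc_1080  = cf(1920, 1080, 24, 24, 6e6)               # native 1080p AVC at 6 Mb/s

print(f"UF = {uf}")                                   # 6.0
print(f"480p AVC upscaled:    {avc_480:.2f}")         # 199.07
print(f"480p MPEG-2 upscaled: {mpeg2_480:.2f}")       # 248.83
print(f"1080p AVC native:     {avc_1080:.2f}")        # 199.07
```

    Under this (admittedly rough) proportionality model, the upscaled 480p AVC and the native low-bitrate 1080p land on the same figure, so the math alone cannot separate those two cases.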

    A TROLL is somebody who does not back up their claims with COMMON SENSE, like mathematics.

    Somebody who backs up claims that contradict yours is not a TROLL. Somebody who doesn’t back up their claims could be considered a TROLL.

    Cheers!

  • Dave Haynie

    March 9, 2012 at 5:57 pm

    You are still missing the most fundamental issue, which I explained before. While yes, mathematics is involved (and we’ll get to the part of it you’re ignoring in a few… I’m not staying long), this is not a mathematics problem. It’s a psychology problem.

    In short, you cannot predict with any level of certainty how an individual will perceive the encoded video, when you’re mixing different kinds of video. All video compression algorithms are designed to fool the brain, and more advanced algorithms work better: MPEG-2 better than DV, MPEG-4/ASP better than MPEG-2, AVC better than ASP, and perhaps this new H.265 (claiming a 33-66% coding efficiency improvement over AVC) better still.

    Bottom line: you have to look. With your eyes. The experiment I put up here presents just such an opportunity. Maybe I’ll find some other higher quality source and run another set of clips; I’m getting some feedback from a variety of sources, and a few of the better rated clips aren’t what I would have predicted. Your continued refusal to rate these sure suggests you don’t trust your own eyes. That is the only thing that actually matters here.

    YouTube uses an undocumented algorithm to decide what bitrate to encode, based on the incoming video. I have never had a problem getting it to deliver 6Mb/s AVC for my well-shot 20Mb/s AVC 1080p uploads. I haven’t analyzed all possible outcomes of their process; I do know that 720p is often (maybe not always) encoded at 2Mb/s. And that’s actually fairly well accepted by consumers as “better than DVD”… I don’t always agree.

    But look at Netflix. Their “HD” streaming runs at 2.4-3.6Mb/s last I checked, using Windows Media 9 (aka VC-1), which offers somewhat less encoding efficiency than AVC, certainly more than MPEG-2. I’m not defending Netflix against Blu-ray… I always want the best possible quality for my viewing. But when it’s DVD vs. Netflix, the average consumer will pick Netflix HD based on their judgement of visual quality.

    In the early days of HD video, Microsoft offered up a format using similar techniques, dubbed WMV/HD DVD. Before Blu-ray became possible, I had figured out how to make these (pretty much by hand… no authoring tools other than the WMV encoder), and they called for WMV in 720p at about 3-4Mb/s. I happened to have a red laser DVD player that supported this format, in HD, as well. It was unambiguously better than DVD… this was before upscaling got to the point of actually working well; I haven’t done a comparison.

    This was actually used on a few commercial discs… I have HD Net’s “Enron: The Smartest Guys in the Room” on WMV/HD-DVD. Worked much better when the film was only 1-1.5 hours (and I had a special interest in the film, having once had business dealings with Enron, through a weird set of connections).

    When you’re considering modern algorithms, too, as I mentioned, one of the main impacts on quality has nothing to do with the I-Frame (keyframe) mathematics you seem so preoccupied with. Yes, AVC offers more efficiency in I-Frame encoding than MPEG-2 or DV; that’s the whole point of AVC-Intra. But we’re not dealing with AVC-Intra. Camcorder use of AVC is always going to use fixed GOP sizes, so there’s at least a prayer of editing it. Delivery formats are not so kind, and of course, they can spend a very long time on motion search.

    You can’t really factor motion search in easily, because that’s not really part of AVC or MPEG-2, not arithmetically. There’s a black box in there for “motion search”, but every encoder has its own approach. The things used on Blu-ray and YouTube can’t be done in a camcorder… and in fact, never will be. And because of this, you can argue about mathematics until you’re blue in the face (and you seem to have been… and they are getting more correct, so at least you’re learning SOMETHING), but that won’t give you the answer. Only your eyes will.

    I thought my web test was fair, particularly since, as I said, there was only one “right” answer. But I’ve had a few people now pick one of those I’d suggest might be second best as best, and there’s even one of the videos with upscaling getting pretty high in the ranks. No plans to post a “gimme”, either, but maybe another clip set with the same encodings. Or not, if I’m too busy (it’s actually less than an hour, all told, to make)…

    -Dave

