- August 6, 2014 at 8:29 pm
I’ve been wondering for a while if there’s a set bitrate for a given resolution and FPS beyond which increasing the value won’t bring an improvement big enough for the average person to notice. I’ve been rendering 720p videos recorded with a capture card at 10 Mbps. You can clearly see the difference in quality comparing the rendered video and the video uploaded to YouTube (which uses a 6 Mbps bitrate for 720p). Is there such a thing as a “lossless” bitrate for a given resolution?
- August 6, 2014 at 9:02 pm
Jan Ozer wrote generally about this as Ben Waggoner’s “power of 0.75” rule.
The actual number may depend on content; action sports would require a higher number than talking heads. As you go up in frame size you don’t need to bring up the bit rate at a 1:1 ratio — he uses 0.75. I link to page 3 of the article, but you might want to read the whole thing.
Understanding Bits per Pixel and Quality and the Power of 0.75 rule can help you get most of the way there.
Often it means doing some test encodes to arrive at the point where it’s hard to notice a change given the specific content you’re using.
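The rule described above can be sketched in a few lines of Python. This is only an illustration of the scaling idea, not a substitute for test encodes; the 10 Mbps 720p baseline is taken from the original question, and the 0.75 exponent is Waggoner’s suggested value.

```python
# Sketch of the "power of 0.75" rule: when frame size changes, scale
# bitrate by the ratio of pixel counts raised to 0.75 rather than
# linearly (1:1). Baseline of 10 Mbps at 720p is illustrative.

def scaled_bitrate(base_mbps, base_pixels, new_pixels, power=0.75):
    """Estimate a bitrate for a new frame size from a known-good one."""
    return base_mbps * (new_pixels / base_pixels) ** power

pixels_720p = 1280 * 720    # 921,600 pixels
pixels_1080p = 1920 * 1080  # 2,073,600 pixels (2.25x the area)

# Linear scaling would suggest 22.5 Mbps; the 0.75 rule suggests less:
print(round(scaled_bitrate(10.0, pixels_720p, pixels_1080p), 2))  # → 18.37
```

Note that the sub-linear exponent reflects how larger frames compress more efficiently per pixel, which is why doubling the pixel count doesn’t require doubling the bitrate.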
- August 31, 2014 at 2:29 pm
That link was helpful, thank you. However, I was looking at something a bit different. That link is about the lowest bitrate you should choose without sacrificing quality too much; I was looking for the opposite: the lowest bitrate beyond which you don’t see any improvement in quality. I know I could use a 100 Mbps bitrate, but I also know that if I use 50 Mbps I won’t be able to tell the difference. I personally render at 10 Mbps at the moment, but I was wondering if there’s a way to calculate a specific value for what I’m looking for.
- August 31, 2014 at 9:37 pm
There are a lot of qualifiers that make it difficult to answer your question: codec, display size, “average person”, amount of motion, length of clips… As a general guideline for H.264, I find that 0.35–0.40 bits per pixel is reasonably close to Blu-ray quality (0.65–0.80 bpp) when encoded with 2-pass VBR and a key frame distance of 10–15 frames, and viewed on a computer screen or smaller. Depending on the circumstance one could argue that the rate should either be doubled to Blu-ray levels or cut in half.
- September 1, 2014 at 12:30 am
Thanks for your reply. I have a couple of questions about your post:
1. What do you mean by “average person”?
2. What does the length of a clip have to do with bitrate/quality?
I’m working with 720p content. 0.40 bits/pixel means:
1280 * 720 = 921,600 pixels
921,600 * 0.40 = 368,640 bits
368,640 * 29.97 = 11,048,141 bits per second
So around 11 mbps, which is pretty close to what I use (10mbps). Your post was really helpful, thank you!
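The arithmetic above can be checked with a short Python snippet (using the same 720p frame size, 0.40 bits/pixel, and 29.97 fps from the post):

```python
# Worked 720p example: 0.40 bits per pixel at 29.97 fps.
pixels = 1280 * 720             # 921,600 pixels per frame
bits_per_frame = pixels * 0.40  # 368,640 bits per frame
bps = bits_per_frame * 29.97    # ~11,048,141 bits per second

print(round(bps / 1_000_000, 2))  # → 11.05 (Mbps, decimal megabits)
```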
- September 1, 2014 at 2:41 am
1. In the original post you used the qualifier, “.. good enough for the average person to notice.” That’s difficult to determine.
2. A sequence of short clips (at the extreme, a series of discontinuous photos) requires a higher bitrate because a larger percentage of the file is I-frames (which are full images) versus P- and B-frames (which are constructed from other frames and therefore much smaller). At the other extreme, a sequence several minutes long of a single static photo could be encoded at a lower bitrate.
Yes, that’s the calculation:
Mbps = Width x Height x FPS x bpp / 1024 / 1024
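That formula, written as a small helper function (note that dividing by 1024 twice yields mebibits per second, which comes out slightly below the decimal-megabit figure computed earlier in the thread):

```python
# The thread's formula as stated: Width x Height x FPS x bpp / 1024 / 1024.
# Dividing by 1024*1024 gives binary (mebi) megabits, not decimal megabits,
# so the result is a bit lower than bits-per-second / 1,000,000.
def mbps(width, height, fps, bpp):
    return width * height * fps * bpp / 1024 / 1024

print(round(mbps(1280, 720, 29.97, 0.40), 2))  # → 10.54
```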