- January 20, 2016 at 9:22 pm
Of course the overall quality goes down and the bitrate is lowered, but is the actual pixel count lowered? If so, how? When you upload a video it is usually transcoded into various bitrates, but is there actual upscaling/downscaling done during the transcoding process as well? That is to say, are multiple renditions of the same video created at different resolutions? Say you have one master uncompressed HD video file to stream in an ABR format like HLS: when you upload/encode the video, should you (or the backend transcoding system) be making various renditions at different resolutions AND bitrates, or just transcoding to different bitrates? What is the player actually doing to the video when you change the resolution if it does not have pre-encoded video at multiple resolutions to choose from?
- January 20, 2016 at 9:58 pm
What the different services do may actually change over time. There have been some articles in a few trade publications on the changes Netflix is experimenting with. I don’t doubt Vimeo and YouTube (Google) make changes over time. YouTube has even offered alternative codecs at times.
I don’t think bits per pixel is maintained across different frame sizes and data rates. Some tests have shown that relationship isn’t linear anyway, as some codecs become more efficient as the data rate increases. I also wouldn’t doubt profile changes between Baseline, Main, and High. The use of those has, I believe, changed over time as the need for Baseline has declined in all but the oldest devices.
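To make the bits-per-pixel point concrete, here is a quick back-of-the-envelope calculation. The bitrates, resolutions, and frame rate below are made-up ladder entries for illustration, not any service’s actual encoding settings:

```python
def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average bits spent per pixel per frame at a given data rate."""
    return bitrate_bps / (width * height * fps)

# Hypothetical ABR ladder rungs (illustrative numbers only)
print(round(bits_per_pixel(5_000_000, 1920, 1080, 30), 3))  # 1080p @ 5 Mbps -> 0.08
print(round(bits_per_pixel(1_500_000, 1280, 720, 30), 3))   # 720p @ 1.5 Mbps -> 0.054
```

Notice the two rungs land at different bits-per-pixel values; real ladders generally don’t hold that number constant, since encoder efficiency varies with data rate.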
A long while back I remember YouTube suggesting you upload the best possible master file, as they may use it to re-encode your files in the future. They used to have separate “content creator” specs, but those are uniform now.
In short, whatever one says the services are doing today, may not be true tomorrow.
- January 21, 2016 at 2:52 pm
Thanks for the response. Does that mean the video hosting or streaming service is storing the same video, “video x”, in, for example, 1080p at bitrates A, B, and C, 720p at bitrates D, E, and F, and 480p at bitrates G, H, and I for adaptive bitrate streams (HLS, HDS, DASH)? This is what I would think, because you can change the “resolution” from the player. However, all of the streaming architecture I can find online makes it seem like one HD video is delivered at various bitrates without having different resolutions to choose from. But that doesn’t explain the ability to change the “resolution”. Perhaps I am getting caught up in the verbiage, or maybe the downscaling is done in real time on the player side? I am just trying to get a solid idea of exactly what media needs to be either uploaded to, or transcoded by, the hosting service for adaptive bitrate streaming. If multiple resolutions are not created (like multiple bitrates are), then I would be correct in assuming the player must be able to downscale/upscale the pixel count in real time.
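For what it’s worth, HLS answers this with the master playlist: each variant line advertises both a bandwidth and a resolution, and the player switches between pre-encoded renditions rather than rescaling the pixel count itself. A minimal sketch of such a playlist (the URIs, bitrates, and resolutions here are made up):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1000000,RESOLUTION=854x480
480p/index.m3u8
```

Any scaling that happens client-side is just the display scaling of the decoded variant to fit the player window, not the player producing new renditions on the fly.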