Creative Communities of the World Forums

The peer to peer support community for media production professionals.


  • Why is YUV still used ?

    Posted by __peter__ on May 29, 2010 at 9:11 pm

    Hello 🙂

    I know this is not Premiere specific, but I wonder why that strange YUV is still used today. It makes everything so complicated. Moving things between applications that do or don't work with YUV is a real pain. It could be so easy with everything using RGB 0-255. There isn't much analogue technology still around, so why doesn't the industry make the switch?

    Maybe there are reasons that I don't see, so I'm asking for your opinions.

    Thanks!

    Peter

    Brian Louis replied 15 years, 11 months ago 4 Members · 4 Replies
  • 4 Replies
  • Vince Becquiot

    May 29, 2010 at 10:44 pm

    Are you asking the video industry to adopt one standard?

    By the way, all NTSC digital broadcasts these days use Y’CbCr.

    As I understand it, it would be a lot harder to efficiently compress RGB.

    Vince Becquiot

    Kaptis Studios
    San Francisco – Bay Area

  • Tim Kolb

    June 3, 2010 at 7:18 am

    Actually, “YUV” never existed as such…it was a scaling method used in an intermediate step when encoding composite NTSC.

    Color difference encoding is actually notated “Y’PbPr” for analog and “Y’CbCr” for digital these days.

    The “Y'” is the black-and-white signal, and the prime indicates that there is a gamma (power) curve rather than a linear relationship in the gray scale.
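To illustrate the gamma curve Tim describes, here is a tiny Python sketch of a pure power-law encode. This is a simplification: real standards such as Rec. 709 add a short linear segment near black, and the 2.2 exponent here is just a common approximation.

```python
def encode(linear, gamma=2.2):
    """Map linear scene light (0.0-1.0) to a gamma-encoded value (0.0-1.0)."""
    return linear ** (1 / gamma)

# 18% "mid-gray" scene light encodes to roughly the middle of the
# code range, which is the point of the curve: more code values are
# spent where the eye is most sensitive.
print(round(encode(0.18), 3))  # ~0.459
```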

    Originally in the USA, this system allowed black and white television owners to stay in the game as their sets just didn’t read the other part of the signal…where the color is.

    Most people assume that “Pb/Cb” and “Pr/Cr” correlate to red and blue…not the case.

    Think of these two values as longitude and latitude. We aren’t finding our way around the globe…but around the vectorscope. Some vectorscopes will actually label the horizontal and vertical axes; the vectorscope in PPro uses the old Betacam component nomenclature of “R-Y” and “B-Y”.

    The “Y'” signal contains the luma value for each pixel…so the only values that are left are hue and saturation. The vectorscope shows hue by the positions around the “clock face” and saturation by distance from the center. Any color the system can produce is available at some point of “longitude” and “latitude” on the vectorscope…which is indicated by the values on the two axes…”B-Y/Pb/Cb” or “R-Y/Pr/Cr” and where they would then metaphorically intersect. It’s not an additive system like RGB.
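As a quick illustration of the B-Y / R-Y scaling described above, here is a Python sketch of a full-range BT.601-style conversion. The luma weights are the standard ones; the exact scaling varies between full-range and studio-range variants, so treat this as illustrative rather than definitive.

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range R'G'B' (0-255) -> Y'CbCr (0-255), BT.601 luma weights."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma: weighted sum of R'G'B'
    cb = 128 + (b - y) / 1.772             # scaled B' - Y' difference
    cr = 128 + (r - y) / 1.402             # scaled R' - Y' difference
    return y, cb, cr

# A neutral gray has no color difference, so it lands at the center
# of the vectorscope (Cb = Cr = 128):
print([round(v) for v in rgb_to_ycbcr(128, 128, 128)])  # [128, 128, 128]
```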

    The color difference signals take VERY little space to encode, whereas with RGB each color channel is equal and needs a complete raster. Not to mention that basic RGB actually has a few shades from typical television palettes that it can’t reproduce…just as the color difference methods used within the established video standard have some difficulty reproducing some shades of RGB.
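The space saving Tim mentions comes from chroma subsampling: since the eye is less sensitive to color detail than to brightness detail, Cb and Cr can be stored at reduced resolution while Y' stays at full resolution. A back-of-envelope Python calculation for one uncompressed 8-bit 1920x1080 frame:

```python
# Storage for one 1920x1080 8-bit frame: full RGB vs. chroma-subsampled Y'CbCr.
W, H = 1920, 1080

rgb = W * H * 3                            # R, G, B: one byte per channel per pixel
yuv422 = W * H + 2 * (W // 2) * H          # 4:2:2 - Cb/Cr at half width
yuv420 = W * H + 2 * (W // 2) * (H // 2)   # 4:2:0 - Cb/Cr halved both ways

print(rgb, yuv422, yuv420)  # 6220800 4147200 3110400
print(yuv420 / rgb)         # 0.5 -> half the data before any codec even runs
```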

    So…RGB won’t be taking over anytime soon, and I think anyone with less than the latest, most massive computer coupled to cavernous amounts of hard drive storage should be thankful for that.

    TimK,
    Director, Consultant
    Kolb Productions,

  • __peter__

    June 3, 2010 at 1:49 pm

    Hello 🙂

    thanks! That was a very good explanation.

    Would you say that all of that is true just for broadcast, or also for all the web formats?

    Do H.264, Flash, MPEG-2 and so on use YUV?
    Or whatever one should call that color system then.

    I thought that since computers are RGB devices, RGB would be the first choice. But maybe not 🙂

    Peter

  • Brian Louis

    June 4, 2010 at 1:22 am

    [Peter Rixner] “Does h264, flash, MPEG2 and so on use YUV ? Or whatever one should call that colorsystem then.”
    When you refer to codecs, you are referring to how the information is carried: the picture is encoded into the data stream, and how the end product is going to be used when decoded determines the wrapper it's carried in.
    You should go to Wikipedia and read up on the various video systems to get a better idea of what is going on.
