Over the last year, the terms UHD and 4K have become so conflated that TV makers, broadcasters, and tech blogs are now using them interchangeably. Let me just say this now: 4K and UHD (Ultra HD) are not the same thing. Yes, as far as the consumer is concerned, there isn’t much of a practical difference — but, in much the same way that consumers got screwed over by “HD Ready” displays, it’s a good idea to clarify the difference between 4K and UHD before things get out of hand.
4K vs. UHD
The simplest way of defining the difference between 4K and UHD is this: 4K is a professional production and cinema standard, while UHD is a consumer display and broadcast standard. To discover how they became so conflated, let’s look at the history of the two terms.
The term “4K” originally derives from the Digital Cinema Initiatives (DCI), a consortium of motion picture studios that standardized a spec for the production and digital projection of 4K content. In this case, 4K is 4096×2160, exactly double the previous standard for digital editing and projection (2K, or 2048×1080) in each dimension. As you can see, 4K clearly refers to the fact that the horizontal resolution (4096) is just over four thousand. The 4K standard is not just a resolution, either: It also defines how 4K content is encoded. A DCI 4K stream is compressed using JPEG 2000, can have a bitrate of up to 250 megabits per second (Mbps), and supports 12-bit 4:4:4 color. (See: How digital technology is reinventing cinema.)
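To put the DCI numbers in perspective, here is a quick back-of-the-envelope check (a sketch in Python; the variable names are just illustrative) showing that doubling 2K in each dimension works out to four times the pixels:

```python
# DCI resolution tiers: 4K (4096x2160) doubles 2K (2048x1080)
# in each dimension, which quadruples the total pixel count.
dci_2k = (2048, 1080)
dci_4k = (4096, 2160)

assert dci_4k[0] == 2 * dci_2k[0]  # width doubles
assert dci_4k[1] == 2 * dci_2k[1]  # height doubles

pixels_2k = dci_2k[0] * dci_2k[1]  # 2,211,840 pixels
pixels_4k = dci_4k[0] * dci_4k[1]  # 8,847,360 pixels
print(pixels_4k / pixels_2k)       # → 4.0
```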
Ultra High Definition, or UHD for short, is the next step up from Full HD — the official name (that no one uses) for the display resolution of 1920×1080. UHD doubles that resolution in each dimension to 3840×2160. It does not take a genius to see that 3840 is actually quite far away from four thousand. Almost every TV or monitor that you see advertised as “4K” is actually UHD. There are some panels out there that are 4096×2160 (aspect ratio 1.9:1), but the vast majority are 3840×2160 (1.78:1). If you displayed DCI 4K content on one of these “4K” displays, the wider picture would have to be scaled down to fit, leaving letterboxing (black bars) along the top and bottom of the screen. There isn’t yet a specification for how UHD content is encoded (which is one of the reasons there’s almost no UHD content in existence), but it’s unlikely to be the same quality as DCI 4K.
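The aspect-ratio mismatch is easy to check with a little arithmetic. A minimal sketch (the `letterbox` helper is hypothetical, and it assumes simple aspect-ratio-preserving scale-to-fit with no cropping):

```python
# Hypothetical helper: scale a source video to fit inside a display,
# preserving aspect ratio, then report the resulting black bars.
def letterbox(src_w, src_h, disp_w, disp_h):
    scale = min(disp_w / src_w, disp_h / src_h)
    out_w = round(src_w * scale)
    out_h = round(src_h * scale)
    bars_h = disp_w - out_w  # total black-bar width (left + right)
    bars_v = disp_h - out_h  # total black-bar height (top + bottom)
    return out_w, out_h, bars_h, bars_v

# DCI 4K content (4096x2160, ~1.9:1) on a UHD panel (3840x2160, 1.78:1):
print(letterbox(4096, 2160, 3840, 2160))  # → (3840, 2025, 0, 135)
# The reverse — UHD content on a true 4096-wide panel — gives side bars:
print(letterbox(3840, 2160, 4096, 2160))  # → (3840, 2160, 256, 0)
```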
Why not 2160p?
Now, it’s not like TV makers aren’t aware of the differences between 4K and UHD — but for marketing reasons they seem to be sticking with 4K. So as not to conflict with the DCI’s actual 4K standard, some TV makers seem to be using the phrase “4K UHD,” though some are just using “4K.” To be fair, it is worth pointing out that UHD is actually split in two — there’s 3840×2160, but the next step up, 7680×4320, is also called UHD. It is reasonably accurate to refer to these two UHD variants as 4K UHD and 8K UHD — though, to be more correct, the 8K UHD spec should really be called QUHD (Quad Ultra HD), since it packs four times the pixels of 4K UHD. (Read: 8K UHDTV: How do you send a 48Gbps TV signal over terrestrial airwaves?)
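The “Quad” arithmetic checks out, for what it’s worth — doubling both dimensions of 4K UHD quadruples the pixel count. A trivial sanity check:

```python
# Pixel counts for the two UHD tiers defined above.
uhd_4k = 3840 * 2160  # 8,294,400 pixels
uhd_8k = 7680 * 4320  # 33,177,600 pixels

print(uhd_8k // uhd_4k)  # → 4: 8K UHD is four 4K UHD frames' worth of pixels
```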
Ideally, though, to regain some kind of sanity and standardization, TV and monitor makers should abandon the 4K moniker entirely and instead use the designation 2160p. Display and broadcast formats have always been named for their number of horizontal lines (i.e. the vertical resolution), with the letters “i” and “p” referring to interlacing (every other horizontal line is skipped) and progressive scan (no lines are skipped): 576i (PAL), 480i (NTSC), 576p (DVD), 720p, 1080i, 1080p — and so on.
It is crazy that we’re suddenly referring to TVs and monitors by their horizontal resolution instead. It is crazy that Netflix, after announcing the availability of 1080p streams in some markets, then went on to say that it would soon trial 4K. How does that make sense, except in the heads of marketing goons?
The sad truth is, 4K and Ultra HD sound sexier and roll off the tongue easier than 2160p. Now that CES 2014 is here and there are 4K TVs everywhere you look, it would take a concerted effort from at least one big TV manufacturer to right the ship and abandon use of 4K in favor of UHD and 2160p. In all honesty, though, it is probably too late: 4K is already embedded in the minds of early adopters and tech writers, and dislodging it in favor of 2160p will be difficult.
The good news, as I said at the start of the story, is that there is very little practical difference — and really, let’s face it: 4K is just a name. It’s not like the vast majority of people care about the official DCI 4K spec. 2160p might be more sensible, but the main thing is whether “4K” communicates that a display or broadcast is 3840×2160 — and in my mind, it does.