Video Resolution: HD vs. UHD, 4K vs. 8K
Learn about the differences in video resolution for your TV or projector.
The “resolution” of your display matters. Resolution is the ability to resolve fine detail: the more pixels a display or projector uses to render a picture, the finer the detail you will be able to see (as long as you are close enough to the display for your eyes to resolve it). A more technically accurate term for a digital screen’s resolution is its “picture size”, measured in horizontal and vertical pixels, but “resolution” remains the common term.
Standard Definition (SD) Video
In the early days of television, the NTSC broadcast standard in the United States (and other countries) used 525 horizontal scan lines, of which only 480 were visible on the Cathode Ray Tube (CRT) television. The PAL standard in Europe used 625 horizontal lines, of which only 576 were visible. These standards used interlaced video: all of the odd lines were captured in one field, all of the even lines in the next field, and the two fields were interlaced to make a complete frame (29.97 frames per second, or 59.94 fields per second, for NTSC). This allowed a CRT television to refresh fast enough to avoid visible flicker.

Interlaced video suffers from a combing effect on the edges of horizontally moving objects, because the two fields shown together on screen were captured roughly 1/60th of a second apart. Modern digital displays do not need interlacing: they present every pixel of a frame at essentially the same time, and each pixel holds its light level without decay (unlike the phosphor coating inside a CRT, whose brightness begins to fade the moment the scanning electron beam passes).
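To make the field-and-frame relationship concrete, here is a minimal NumPy sketch (my own illustration, not part of any broadcast standard) that splits frames into fields and then weaves together two fields captured 1/60th of a second apart; the offset between the alternating lines is exactly the combing artifact described above.

```python
import numpy as np

def split_fields(frame):
    """Split a progressive frame into its two interlaced fields."""
    return frame[0::2], frame[1::2]   # even lines (top field), odd lines (bottom field)

def weave(top_field, bottom_field):
    """Interleave two fields back into a full-height frame (weave deinterlacing)."""
    frame = np.empty((top_field.shape[0] + bottom_field.shape[0],) + top_field.shape[1:],
                     dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

# Two captures of a bright square moving to the right between fields (1/60 s apart):
frame_t0 = np.zeros((480, 640), dtype=np.uint8); frame_t0[200:280, 100:180] = 255
frame_t1 = np.zeros((480, 640), dtype=np.uint8); frame_t1[200:280, 110:190] = 255

top, _ = split_fields(frame_t0)      # field captured at time t0
_, bottom = split_fields(frame_t1)   # field captured 1/60 s later
combed = weave(top, bottom)          # the square's edges now show the combing effect
```

Weave deinterlacing preserves full vertical resolution for static scenes but shows combing wherever there is motion between the two field captures.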
As digital video standards emerged, Standard Definition (SD) video in the NTSC countries was defined as 480i (interlaced) or 480p (progressive scan): 640 pixels wide x 480 pixels high, with a 4:3 aspect ratio.
High Definition (HD) Video
In the United States, the Advanced Television Systems Committee (ATSC) developed the broadcast standard for High Definition television. Several signal formats were incorporated in this standard, listed below (a quick pixel-count comparison follows the list).
720p = 1280 pixels wide x 720 pixels high, with a 16:9 aspect ratio. There are multiple possible frame rates, but 720p video was most commonly broadcast at 59.94 frames per second (fps).
1080i = 1920 pixels wide x 1080 pixels high, with a 16:9 aspect ratio (interlaced). This was typically broadcast at 29.97 frames per second (59.94 fields per second).
1080p = 1920 pixels wide x 1080 pixels high, with a 16:9 aspect ratio (progressive). This was typically broadcast at 29.97 frames per second.
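To put these formats in perspective, here is a small Python sketch (an illustration of the arithmetic, not part of any standard) that tabulates how many pixels each format draws per frame:

```python
# Pixel counts per frame for the formats discussed above (illustrative only).
formats = {
    "480p (SD)": (640, 480),
    "720p (HD)": (1280, 720),
    "1080i/1080p (Full HD)": (1920, 1080),
}

for name, (w, h) in formats.items():
    print(f"{name:>24}: {w} x {h} = {w * h:,} pixels per frame")

# 480p (SD)             :   307,200 pixels
# 720p (HD)             :   921,600 pixels  (3x SD)
# 1080i/1080p (Full HD) : 2,073,600 pixels  (2.25x 720p, 6.75x SD)
```

Even though 1080 lines is only 2.25 times 480 lines, Full HD carries 6.75 times as many pixels per frame as 640 x 480 SD, because both dimensions grow (and the aspect ratio widens from 4:3 to 16:9).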
As HD video evolved from an over-the-air (or cable/satellite) broadcast standard to a digital television standard, interlaced video became far less common, and earlier videos shot with interlaced video cameras would be deinterlaced to create progressive scan video.
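One very simple way interlaced footage can be converted to progressive frames is “bob” deinterlacing, where each field is expanded into a full frame of its own. The sketch below is an illustration, not a description of any particular product; it uses plain line doubling, whereas real deinterlacers interpolate or motion-compensate the missing lines.

```python
import numpy as np

def bob_deinterlace(field):
    """Turn one field into a full-height progressive frame by line doubling (simple 'bob')."""
    return np.repeat(field, 2, axis=0)

field = np.random.randint(0, 256, size=(240, 640), dtype=np.uint8)  # one 480i field
frame = bob_deinterlace(field)                                      # 480 x 640 progressive frame

# Each field becomes its own frame, so a 29.97 fps interlaced source (59.94 fields/s)
# yields 59.94 progressive frames per second, with no combing artifacts.
```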
Ultra High Definition (UHD, or 4K) Video
UHD video specifications were first proposed by NHK Science & Technology Research Laboratories and later standardized by the International Telecommunication Union (ITU). The Consumer Electronics Association announced on October 17, 2012, that the term “Ultra High Definition”, or “Ultra HD”, would be used for displays that have an aspect ratio of 16:9 or wider, capable of presenting video at a resolution of 3840 pixels wide x 2160 pixels high.
8K Video
8K video has a pixel resolution of 7680 wide x 4320 high, exactly twice the width and height (and therefore four times the total pixels) of 4K UHD video. Although professionally produced 8K content (movies and TV shows) is very rare at this time, consumer video devices such as cell phones, cameras and TVs supporting 8K video are widely available.
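As a quick arithmetic check (a simple illustration, not taken from any standards document):

```python
full_hd = 1920 * 1080   #  2,073,600 pixels
uhd_4k  = 3840 * 2160   #  8,294,400 pixels
uhd_8k  = 7680 * 4320   # 33,177,600 pixels

print(uhd_4k / full_hd)  # 4.0 -- 4K UHD has 4x the pixels of Full HD
print(uhd_8k / uhd_4k)   # 4.0 -- 8K has 4x the pixels of 4K UHD (16x Full HD)
```

Doubling both dimensions always quadruples the pixel count, which is why each generation roughly quadruples the raw pixel data that must be captured, encoded, and displayed.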
When purchasing a TV, all other things being equal, 8K is a beneficial feature. The key question to ask is “will I notice a difference between a 4K TV and an 8K TV?” Many experts believe that for most people, with a typical screen size at a normal viewing distance, the increased resolution of 8K versus 4K cannot be noticed. These experts base that opinion on the visual acuity of someone with 20/20 vision, which is often assumed to be the best possible vision a person could have. As a 2009 study by August Colenbrander at the Smith-Kettlewell Eye Institute explains…
“…it is wrong to refer to “20/20” (1.0) vision as “normal”, let alone as “perfect” vision. Indeed, the connection between normal vision and standard vision is no closer than the connection between the standard American foot and the average length of “normal” American feet. The significance of the 20/20 (1.0) standard can best be thought of as the “lower limit of normal” or as a screening cut-off. When used as a screening test, we are satisfied when subjects reach this level and feel no need for further investigation, even though the average visual acuity of healthy eyes is 20/16 (1.25) or 20/12 (1.6).”
This article from RED Digital Cinema also explains that the best video experiences come from full immersion, filling your central vision (40 to 60 degrees) with the picture.
It is a myth that 8K TVs will never be appreciated by anyone with normal vision at a normal viewing distance. However, to really notice a difference, you must sit relatively close to a very large screen. For example, an 85-inch 8K TV will have a noticeably better picture at a 7-foot viewing distance for anyone with average visual acuity (20/16 vision). It is also a myth that 8K video will never succeed. The technology is already here, and it is rapidly being optimized at every stage of the video ecosystem (cameras, production systems and software, encoding and distribution technologies, playback and display technology). It is only a matter of time before movies and TV shows are widely available in 8K resolution. But the incremental benefit of 8K over 4K is much smaller than the benefit of 4K versus HD.
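One rough way to sanity-check numbers like the 85-inch / 7-foot example above is to compute the angle each pixel subtends at the viewing distance and compare it to the viewer’s acuity limit (about 1 arcminute of detail for 20/20 vision, about 0.8 arcminute for 20/16). The sketch below is a back-of-the-envelope illustration under that simplified threshold model; real-world perception also depends on contrast, motion, and content.

```python
import math

def pixel_angle_arcmin(diagonal_in, horizontal_pixels, distance_in, aspect=(16, 9)):
    """Angle (in arcminutes) subtended by one pixel at the given viewing distance."""
    w, h = aspect
    screen_width_in = diagonal_in * w / math.hypot(w, h)   # screen width from the diagonal
    pixel_pitch_in = screen_width_in / horizontal_pixels   # center-to-center pixel spacing
    return math.degrees(math.atan(pixel_pitch_in / distance_in)) * 60

# 85-inch 16:9 screen viewed from 7 feet (84 inches)
for label, pixels in [("4K UHD", 3840), ("8K", 7680)]:
    print(f"{label}: {pixel_angle_arcmin(85, pixels, 7 * 12):.2f} arcmin per pixel")
# 4K UHD: 0.79 arcmin per pixel
# 8K: 0.39 arcmin per pixel

# Acuity limits under the 1-arcminute model: 20/20 = 1.0 arcmin, 20/16 = 0.8 arcmin,
# 20/12 = 0.625 arcmin. If the pixel angle is at or above the viewer's limit, pixel-level
# detail is resolvable, so a finer resolution can still add visible detail at that distance.
```

At 7 feet, the 4K pixel pitch sits right around the 20/16 limit, which is consistent with that distance being roughly where a viewer with better-than-20/20 vision starts to benefit from the extra 8K detail on a screen that large.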