2K resolution is a horizontal display resolution of approximately 2,000 pixels on a display device or content. In television and consumer media, 1920 × 1080 is the most common 2K resolution, although 2048 × 1080 (DCI 2K) is more common in digital cinema workflows.
Examples of 2K resolutions:

| Format | Resolution | Aspect ratio | Total pixels |
| --- | --- | --- | --- |
| DCI 2K (full frame) | 2048 × 1080 | ≈1.90∶1 | 2,211,840 |
| DCI 2K (flat cropped) | 1998 × 1080 | ≈1.85∶1 | 2,157,840 |
| DCI 2K (CinemaScope cropped) | 2048 × 858 | ≈2.39∶1 | 1,757,184 |
| QXGA | 2048 × 1536 | 1.33∶1 | 3,145,728 |
| WUXGA | 1920 × 1200 | 1.60∶1 | 2,304,000 |
| Full HD | 1920 × 1080 | ≈1.78∶1 | 2,073,600 |
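The figures above follow directly from width × height: the total pixel count is the product of the two dimensions, and the aspect ratio is their quotient. A minimal sketch in Python (the dictionary and variable names are illustrative, not from any standard):

```python
from math import gcd

# Width × height pairs for the resolutions listed above
resolutions = {
    "DCI 2K (full frame)": (2048, 1080),
    "DCI 2K (flat cropped)": (1998, 1080),
    "DCI 2K (CinemaScope cropped)": (2048, 858),
    "QXGA": (2048, 1536),
    "WUXGA": (1920, 1200),
    "Full HD": (1920, 1080),
}

for name, (w, h) in resolutions.items():
    total = w * h        # total pixel count
    ratio = w / h        # aspect ratio as a decimal, e.g. 1.90
    d = gcd(w, h)        # reduce to the simplest integer ratio
    print(f"{name}: {total:,} pixels, {ratio:.2f}:1 ({w // d}:{h // d})")
```

Running this reproduces each row of the table, e.g. `Full HD: 2,073,600 pixels, 1.78:1 (16:9)`.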
What is 2K resolution?
“2K resolution” is a generic, categorical term that can include any number of specific resolutions. Some examples include:
- 2048 × 1556, the resolution used for 2K scans of 35 mm film
- 2048 × 1080, the resolution of the DCI 2K distribution standard
- 1998 × 1080, the resolution used for 1.85∶1 ratio 2K content in the DCI standard
The DCI 2K standard is the dominant 2K resolution used in the cinema industry. However, the term “2K” itself was not coined by DCI, and does not refer specifically to the DCI 2K standard. Usage of the term “2K” predates the publication of the DCI standard, and is generally understood to be a generic term referring to resolutions ≈2,000 pixels in width.(p110)
Many other resolutions commonly used in computer displays, such as 2048 × 1536 (QXGA), 1920 × 1200 (WUXGA), and 1920 × 1080 (Full HD) also fall into the category of 2K resolutions and could be described as such.(p110) However, the terminology “2K”, which originated in the cinema industry, is not commonly used in reference to these resolutions. Most discussion in the context of cinematography uses the term “HD” rather than 2K when referring specifically to 1920 × 1080 resolution material to distinguish it from other 2K resolutions commonly used in the industry.
What about the numbers: 720p, 1080p, 1440p, 2K, 4K and 8K?
When high-definition TVs became the norm, manufacturers developed a shorthand to describe their display resolution. The most common numbers you see are 720p, 1080p, 1440p, and 4K. As we have seen, the “p” and the “i” tell you whether the display uses progressive scan or interlaced scan. These shorthand numbers are sometimes used to describe computer monitors as well, even though a monitor is generally capable of a higher-definition display than a TV. The number always refers to the number of horizontal lines on the display.
Here’s how the shorthand translates:
- 720p = 1280 × 720 – usually known as HD or “HD Ready” resolution.
- 1080p = 1920 × 1080 – usually known as FHD or “Full HD” resolution.
- 1440p = 2560 × 1440 – commonly known as QHD or Quad HD resolution, typically seen on gaming monitors and high-end smartphones. 1440p has four times the pixels of 720p HD or “HD Ready.” To make things even more confusing, many premium smartphones feature a so-called 2960 × 1440 Quad HD+ resolution, which still falls under 1440p.
- 4K or 2160p = 3840 × 2160 – commonly known as 4K, UHD, or Ultra HD resolution. It is a huge display resolution, found on premium TVs and computer monitors. 2160p is called 4K because its width is close to 4,000 pixels. In other words, it offers four times the pixels of 1080p FHD or “Full HD.”
- 8K or 4320p = 7680 × 4320 – known as 8K, it offers 16 times the pixels of the regular 1080p FHD or “Full HD” resolution. For now, you see 8K only on expensive TVs from manufacturers such as Samsung and LG. However, you can test whether your computer can render such a large amount of data by playing back an 8K video sample.
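The multipliers quoted above (“four times,” “16 times”) refer to total pixel counts, not widths, and can be checked with simple arithmetic. A quick sketch (the helper name is illustrative):

```python
def pixels(w, h):
    """Total pixel count of a w × h display."""
    return w * h

hd = pixels(1280, 720)        # 720p "HD Ready"
fhd = pixels(1920, 1080)      # 1080p Full HD
qhd = pixels(2560, 1440)      # 1440p Quad HD
uhd = pixels(3840, 2160)      # 4K / 2160p Ultra HD
eight_k = pixels(7680, 4320)  # 8K / 4320p

print(qhd / hd)       # → 4.0  (1440p has 4x the pixels of 720p)
print(uhd / fhd)      # → 4.0  (4K has 4x the pixels of 1080p)
print(eight_k / fhd)  # → 16.0 (8K has 16x the pixels of 1080p)
```

Doubling both dimensions quadruples the pixel count, which is why each step up the ladder is a factor of four.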
What is the Aspect Ratio?
The term aspect ratio was initially used in motion pictures, indicating how wide the picture was in relation to its height. Movies were initially in 4:3 aspect ratio, and this carried over into television and early computer displays. Motion picture aspect ratio changed much more quickly to a wider screen, which meant that, when movies were shown on TV, they had to be cropped or the image had to be manipulated in other ways to fit the TV screen.
The same picture in 16:9 vs 4:3 aspect ratio
As display technology improved, TV and monitor manufacturers began to move toward widescreen displays as well. Originally “widescreen” referred to anything wider than the typical 4:3 display, but it quickly came to mean a 16:10 ratio and later 16:9. Nowadays, nearly all computer monitors and TVs are only available in widescreen, and TV broadcasts and web pages have adapted to match.
Until 2010, 16:10 was the most popular aspect ratio for widescreen computer displays. However, with the rise in popularity of high-definition televisions, which used resolutions such as 720p and 1080p and made these terms synonymous with high definition, 16:9 became the standard high-definition aspect ratio.
Depending on the aspect ratio of your display, you can use only resolutions that are specific to its width and height. Some of the most common resolutions that can be used for each aspect ratio are the following:
- 4:3 aspect ratio resolutions: 640×480, 800×600, 960×720, 1024×768, 1280×960, 1400×1050, 1440×1080, 1600×1200, 1856×1392, 1920×1440, and 2048×1536.
- 16:10 aspect ratio resolutions: 1280×800, 1440×900, 1680×1050, 1920×1200, and 2560×1600.
- 16:9 aspect ratio resolutions: 1024×576, 1152×648, 1280×720 (HD), 1366×768, 1600×900, 1920×1080 (FHD), 2560×1440 (QHD), 3840×2160 (4K), and 7680×4320 (8K).
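The grouping above can be reproduced by reducing each width × height pair to its simplest integer ratio. A hypothetical helper (the function name is my own) also shows why 1366×768 only approximates 16:9:

```python
from math import gcd

def aspect_ratio(w, h):
    """Reduce a width × height pair to its simplest integer ratio."""
    d = gcd(w, h)
    return (w // d, h // d)

print(aspect_ratio(1024, 768))   # → (4, 3)
print(aspect_ratio(1920, 1200))  # → (8, 5), i.e. 16:10
print(aspect_ratio(1920, 1080))  # → (16, 9)
print(aspect_ratio(1366, 768))   # → (683, 384), only approximately 16:9
```

683∶384 ≈ 1.779, close enough to 16∶9 (≈1.778) that 1366×768 panels are marketed as widescreen HD, even though the ratio is not exact.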