2K resolution
2K resolution is a horizontal display resolution of approximately 2,000 pixels on a display device or content. In television and consumer media, 1920 × 1080 is the most common 2K resolution, although 2048 × 1080 (DCI 2K) is more common in digital cinema workflows.
Examples of 2K resolutions

| Format | Resolution | Aspect ratio | Pixels |
| --- | --- | --- | --- |
| DCI 2K (full frame) | 2048 × 1080 | ≈1.90∶1 | 2,211,840 |
| DCI 2K (flat cropped) | 1998 × 1080 | ≈1.85∶1 | 2,157,840 |
| DCI 2K (CinemaScope cropped) | 2048 × 858 | ≈2.39∶1 | 1,757,184 |
| QXGA | 2048 × 1536 | 1.33∶1 (4∶3) | 3,145,728 |
| WUXGA | 1920 × 1200 | 1.60∶1 (16∶10) | 2,304,000 |
| Full HD | 1920 × 1080 | 1.77∶1 (16∶9) | 2,073,600 |
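The pixel counts in the table are simply width × height, and the decimal ratio is width ÷ height. Here is a minimal Python sketch that reproduces both columns; the resolution list mirrors the table above:

```python
# Each entry is (format name, width, height), taken from the table above.
RESOLUTIONS = [
    ("DCI 2K (full frame)", 2048, 1080),
    ("DCI 2K (flat cropped)", 1998, 1080),
    ("DCI 2K (CinemaScope cropped)", 2048, 858),
    ("QXGA", 2048, 1536),
    ("WUXGA", 1920, 1200),
    ("Full HD", 1920, 1080),
]

for name, width, height in RESOLUTIONS:
    pixels = width * height    # total pixel count, e.g. 2048 * 1080 = 2,211,840
    ratio = width / height     # decimal aspect ratio, e.g. 2048 / 1080 ≈ 1.90
    print(f"{name}: {width} × {height} = {pixels:,} pixels (≈{ratio:.2f}∶1)")
```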
What is 2K resolution?

“2K resolution” is a generic, categorical term that can cover any number of specific resolutions, including those listed in the table above.
The DCI 2K standard is the dominant 2K resolution used in the cinema industry. However, the term “2K” itself was not coined by DCI, and does not refer specifically to the DCI 2K standard. Usage of the term “2K” predates the publication of the DCI standard, and is generally understood to be a generic term referring to resolutions ≈2,000 pixels in width.(p110)
Many other resolutions commonly used in computer displays, such as 2048 × 1536 (QXGA), 1920 × 1200 (WUXGA), and 1920 × 1080 (Full HD) also fall into the category of 2K resolutions and could be described as such.(p110) However, the terminology “2K”, which originated in the cinema industry, is not commonly used in reference to these resolutions. Most discussion in the context of cinematography uses the term “HD” rather than 2K when referring specifically to 1920 × 1080 resolution material to distinguish it from other 2K resolutions commonly used in the industry.
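Since “2K” is a generic bucket rather than a single standard, classifying a resolution comes down to checking its horizontal pixel count. The sketch below assumes an illustrative band of roughly 1,900 to 2,100 pixels; no official cutoff exists, so the bounds here are an assumption, not part of any specification:

```python
def is_2k(width: int) -> bool:
    """Loosely classify a resolution as 2K by its horizontal pixel count.

    The 1,900 to 2,100 band is an illustrative assumption: "2K" just means
    "approximately 2,000 pixels wide" and has no exact official cutoff.
    """
    return 1900 <= width <= 2100

print(is_2k(2048))  # True: DCI 2K and QXGA widths
print(is_2k(1998))  # True: DCI 2K flat cropped
print(is_2k(1920))  # True: Full HD and WUXGA widths
print(is_2k(3840))  # False: that is 4K (UHD) territory
```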
When high-definition TVs became the norm, manufacturers developed a shorthand to describe their display resolution. The most common numbers you see are 720p, 1080p, 1440p, and 4K. The “p” and the “i” tell you whether the display uses progressive scan or interlaced scan. These shorthand numbers are sometimes used to describe computer monitors as well, even though a monitor is generally capable of a higher-definition display than a TV. The number always refers to the number of horizontal lines on the display; a short parser after the list below makes the convention concrete.
Here’s how the shorthand translates:

- 720p = 1280 × 720 (progressive scan)
- 1080p = 1920 × 1080 (progressive scan, also called Full HD)
- 1440p = 2560 × 1440 (progressive scan, also called QHD)
- 4K = 3840 × 2160 in consumer media (UHD), or 4096 × 2160 in digital cinema (DCI 4K)
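Because the convention is so regular (digits for the line count, a trailing letter for the scan type), it can be captured in a few lines of code. This is a minimal sketch under that assumption; `parse_shorthand` is a hypothetical helper, and labels like “4K”, which follow a different naming scheme, are deliberately out of scope:

```python
def parse_shorthand(label: str) -> tuple[int, str]:
    """Split a label like '1080p' or '1080i' into (line count, scan type).

    Hypothetical helper for illustration; labels such as '4K' use a
    different naming scheme and are not handled here.
    """
    lines = int(label[:-1])  # leading digits: the number of horizontal lines
    scan = {"p": "progressive", "i": "interlaced"}[label[-1]]
    return lines, scan

print(parse_shorthand("720p"))   # (720, 'progressive')
print(parse_shorthand("1080i"))  # (1080, 'interlaced')
```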
The term aspect ratio was initially used in motion pictures, indicating how wide the picture was in relation to its height. Movies were originally in a 4:3 aspect ratio, and this carried over into television and early computer displays. Motion pictures moved to wider formats much sooner than television did, which meant that when movies were shown on TV, they had to be cropped or the image had to be manipulated in other ways to fit the TV screen.
[Image: the same picture shown in 16:9 vs 4:3 aspect ratio]
As display technology improved, TV and monitor manufacturers began to move toward widescreen displays as well. Originally “widescreen” referred to anything wider than the typical 4:3 display, but it quickly came to mean a 16:10 ratio and later 16:9. Nowadays, nearly all computer monitors and TVs are only available in widescreen, and TV broadcasts and web pages have adapted to match.
Until around 2010, 16:10 was the most popular aspect ratio for widescreen computer displays. However, with the rise in popularity of high-definition televisions, which used resolutions such as 720p and 1080p and made those terms synonymous with high definition, 16:9 became the standard high-definition aspect ratio.
Depending on the aspect ratio of your display, you can use only resolutions that match its proportions of width and height. Some of the most common resolutions for each aspect ratio are the following:

- 4:3 aspect ratio: 800 × 600, 1024 × 768, 1600 × 1200
- 16:10 aspect ratio: 1280 × 800, 1680 × 1050, 1920 × 1200
- 16:9 aspect ratio: 1280 × 720, 1920 × 1080, 2560 × 1440, 3840 × 2160
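An aspect ratio is just the width and height reduced by their greatest common divisor, which is why 1920 × 1080 and 3840 × 2160 are both 16:9. A small Python sketch of that reduction:

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a resolution to its simplest width:height form, e.g. 16:9."""
    divisor = gcd(width, height)
    return f"{width // divisor}:{height // divisor}"

print(aspect_ratio(1920, 1080))  # 16:9
print(aspect_ratio(3840, 2160))  # 16:9
print(aspect_ratio(1024, 768))   # 4:3
print(aspect_ratio(1920, 1200))  # 8:5, conventionally quoted as 16:10
```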