Video/TV Resolutions explained – (720p, 1080p, WUXGA, 2K, 4K, 8K…)
What do they all mean?
If you are considering buying a TV (LCD, LED, OLED or QLED) for your home or a computer monitor for your work, there are a few factors you’ll have to consider before you take that step. Often, people who are not tech savvy look out only for the price, the size (32, 40, 49, 55, or 60 inches), the resolution (though not that much) and the shape of the screen (flat or curved). Some people don’t pay much attention to the resolution simply because, in the past, the nomenclature used for the different resolutions was quite confusing. Well, this blog post intends to answer some of the questions you may have regarding the different display resolutions…
Let’s begin by looking at a few terminologies…
What is a Display Resolution?
The display resolution of a digital/analog television, computer monitor or any other display device is basically the number of distinct pixels (tiny dots on the screen) in each dimension (that is, in width and height) that can be displayed. For instance, a resolution of (1600 x 1200) pixels means 1,600 horizontal pixels and 1,200 vertical pixels.
What is a Display Aspect Ratio?
The display aspect ratio of any display device is basically the proportional relationship between its width and its height; in other words, the ratio between the width and the height of the display. The aspect ratio is often expressed as two numbers separated by a colon, that is (X:Y), where “X” represents units wide and “Y” represents units high. Some of the most common aspect ratios include (1:1), used by some social media sites; (4:3), common in most old analog TVs; and (16:9) and (16:10), which are common in most widescreen HD TVs.
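If you’re curious how those X:Y figures are derived from a pixel resolution, here’s a small Python sketch (not part of any TV spec, just the arithmetic): divide the width and height by their greatest common divisor.

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a pixel resolution to its simplest X:Y aspect ratio."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(640, 480))    # 4:3
print(aspect_ratio(1920, 1080))  # 16:9
print(aspect_ratio(1920, 1200))  # 8:5 (conventionally written 16:10)
```

Note that (1920 x 1200) reduces to 8:5 mathematically; the industry simply prefers to quote it as 16:10 so it reads alongside 16:9.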
What’s the difference between LCD and LED TVs?
LCD stands for “Liquid Crystal Display” whereas LED stands for “Light Emitting Diode”. So what’s the difference between the two? Well, LED TVs are basically LCD TVs, but not technically… wait, did I just confuse you? Let me explain. Older LCD monitors feature a layer of liquid crystal solution held between two pieces of polarized glass and are usually backlit by cold cathode fluorescent lamps (CCFLs). LED monitors feature the same liquid crystal display, but the backlighting is produced by an array of smaller, more efficient light emitting diodes instead of fluorescent lamps. Since this technology is better, virtually all LCD TVs now use LED backlights and are informally referred to as LED TVs.
So why use light emitting diodes instead of any other technology? Well, like we’ve said, LEDs are small and efficient: they use less power, they provide a brighter display with better contrast, and they dissipate less heat than the CCFLs in older LCDs. One thing you should note, however, is that the display of an LED TV is not an LED display but an LCD display, so technically we should call them “LED-backlit LCD televisions”. Since that sounds confusing, we’ll stick to LED TVs.
What’s the difference between LED TVs and Plasma TVs?
While LED TVs use light emitting diodes as backlights for the screen, plasma TVs light themselves using gas cells that emit ultraviolet light. LED TVs are slimmer and more widely available but more expensive, whereas plasma TVs are believed to have better picture quality but are less energy-efficient and usually available only in larger sizes.
What’s the difference between NTSC, PAL and SECAM?
These are all analog color encoding systems that affect the visual quality of content viewed on analog televisions and, to a much smaller degree, HDTVs. NTSC stands for “National Television Standards Committee”, PAL stands for “Phase Alternating Line” and SECAM stands for “Séquentiel couleur à mémoire” (Sequential Color with Memory). NTSC is mainly used in the US, Canada, Japan, South Korea, Mexico, Central and South America and some other countries. PAL is mainly used in the UK, most countries in Europe, Africa, Australia, the Middle East, India and some other countries, whereas SECAM is used in France and parts of Eastern Europe.
What’s the difference between OLED TVs and QLED TVs?
Currently (2022) the two technologies competing at the premium end of the TV market are OLED and QLED technologies from LG and Samsung respectively. So what exactly is the difference between these two? Well here is the difference:
OLED, which stands for “Organic Light Emitting Diode”, is a type of display technology that consists of a carbon-based film through which two conductors pass a current, causing it to emit light. OLED panels are lighter and thinner than a typical LCD/LED TV, with significantly wider viewing angles and quick response times. However, one disadvantage of OLEDs is that they are comparatively expensive to produce.
QLED, which according to Samsung stands for “Quantum-dot Light Emitting Diode”, is a type of display technology that uses quantum dots to create a broad color spectrum similar to the color and brightness we experience in real life.
In a nutshell, both of these technologies offer sensational picture quality with improved overall viewing experiences compared to standard LED TVs. OLEDs can control each individual pixel, which allows them to display the deepest blacks without any backlight bleed, whereas QLEDs offer a wider color spectrum and can achieve higher brightness levels.
That’s all about the different TV terminologies that I thought were useful for any tech enthusiast out there. Let’s now get into knowing what all these different types of resolutions mean.
What are the different types of Resolutions?
- 720p (HD);
This resolution is also referred to as High Definition or simply HD (the “p” stands for Progressive Scan). It measures (1280 x 720) pixels and has a display aspect ratio of (16:9), normally known as widescreen HDTV (1.78:1). This resolution (720p) is common on older TVs and many 32-inch flat-screen TV models on the market today.
- 1080p (FHD);
This resolution is also referred to as Full High Definition or simply FHD (the “p” stands for Progressive Scan). It measures (1920 x 1080) pixels and has a display aspect ratio of (16:9). This resolution (1080p) is common in recent and slightly larger consumer-grade TVs, typically 49 inches or smaller. Other applications of this standard include television broadcasts, smartphones, Blu-ray discs, computer monitors, projectors, and internet content such as YouTube videos and Netflix TV shows/movies.
- VGA, SVGA, XGA, WXGA, WUXGA;
VGA which stands for Video Graphics Array is an IBM display standard with a horizontal and vertical resolution of (640 x 480) pixels and a display aspect ratio of (4:3). This term is also often used to refer to the computer display standard and the 15-pin VGA connector. The VGA analog standard has been extended to support high-definition video up to resolutions of at least 1080p.
SVGA, which stands for Super Video Graphics Array, is a broad term that covers a wide range of computer display standards that extended the VGA specification. When used as shorthand for a resolution, it refers to (800 x 600) pixels with a display aspect ratio of (4:3).
XGA which stands for Extended Graphics Array is a subset of the broad range of resolutions covered under the “Super VGA” umbrella. As a display standard, it has a horizontal and vertical resolution of (1024 x 768) pixels and a display aspect ratio of (4:3) with square pixels.
WXGA which stands for Widescreen Extended Graphics Array is a set of non-standard resolutions derived from the XGA display standard by widening it to a widescreen aspect ratio. WXGA is generally understood to refer to a horizontal and vertical resolution of (1366 x 768) pixels and a display aspect ratio of nearly (16:9). Nearly two decades ago, in the consumer entertainment industry, this resolution was most popular on LCD TVs versus XGA on flat panels and plasma TVs. WXGA is also used to describe a resolution of (1280 x 800) pixels with a display aspect ratio of (16:10) when referring to laptop displays (with diagonal screen size between 12 and 15 inches) and projectors intended primarily for use with computers.
WUXGA, which stands for Widescreen Ultra Extended Graphics Array, is a wide version of UXGA (Ultra Extended Graphics Array) and has a display resolution of (1920 x 1200) pixels with a display aspect ratio of (16:10). This resolution can be used for viewing content that uses a 16:9 aspect ratio at 720p, 1080i or 1080p. WUXGA was described as the standard screen resolution for modern computing in 2019.
- 2K Resolution;
Display devices or content with a horizontal resolution of approximately 2,000 pixels are often referred to as 2K resolution devices. In cinema, the Digital Cinema Initiatives (DCI) specification is the dominant standard for 2K output; it defines 2K as (2048 x 1080) pixels with a display aspect ratio of nearly (17:9).
In television and consumer media, (1920 x 1080) is the most common 2K resolution if we consider the horizontal pixel count, since it’s approximately 2,000 pixels, but this is normally referred to as 1080p or Full HD (FHD). Note that 1080p has the same vertical resolution as DCI 2K (1080 pixels), but its horizontal resolution is about 6% smaller, which places it below the range of 2K resolution formats.
You’ll often find 2K monitors with a resolution of (2560 x 1440) pixels, shortened to 1440p (a family of video display resolutions with a vertical resolution of 1440 pixels). However, this resolution is officially referred to as Quad HD (QHD) or Wide Quad HD (WQHD), with a vertical resolution double that of 720p, and it sits between 1080p (Full HD) and 4K (Ultra HD). It is often used in smartphone displays, laptop computers and gaming consoles.
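To make these 2K comparisons concrete, here’s a quick Python sketch of the arithmetic, using only the resolution figures quoted in this section:

```python
# Resolutions discussed in the 2K section (width, height in pixels).
DCI_2K = (2048, 1080)   # Digital Cinema Initiatives 2K
FHD    = (1920, 1080)   # 1080p / Full HD
QHD    = (2560, 1440)   # 1440p / Quad HD (WQHD)

# 1080p shares DCI 2K's vertical resolution but is narrower horizontally.
shortfall = (DCI_2K[0] - FHD[0]) / DCI_2K[0]
print(f"1080p is {shortfall:.1%} narrower than DCI 2K")  # 6.2% narrower

# The "Quad" in Quad HD: exactly four times the pixel count of 720p.
print((QHD[0] * QHD[1]) / (1280 * 720))  # 4.0
```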
- 4K (UHD);
The term “4K” is a generic term that refers to any resolution with a horizontal resolution (pixel count) of approximately 4,000 pixels. Just like 2K, 4K has several different resolutions for digital television and digital cinematography that have been standardized by various organizations.
In television and consumer media, (3840 x 2160) or (4K Ultra HD) is the dominant 4K standard resolution, and the 4K television market share increased in 2014 and 2015 as prices fell dramatically. In the movie projection industry, (4096 x 2160) or (DCI 4K) is the dominant 4K standard, as stipulated by the Digital Cinema Initiatives, a prominent standards organization in the cinema industry.
Now this is where the confusion comes in: when we talk about digital cinema resolutions, we specify only the horizontal resolution, and yet TVs have historically been described by their vertical resolution; 1080p, for example, is a vertical resolution. Now, all of a sudden, we’re talking about 2K and 4K TVs, which refer to the horizontal resolution. This confusion started a long time ago and still exists; we don’t know why the industry keeps interchanging the two, but we do think it will eventually settle on something uniform. We also suspect it’s part of the reason the terms “2K”, “4K” and “8K” are used at all: they sound high-tech and impressive.
At the moment, most high-end TVs on the market are branded 4K or Ultra HD, and you can find 4K content in many places, such as major streaming services like Amazon, YouTube and Netflix, and gaming consoles like the PS4 and the Xbox One.
- 8K Resolution;
The term “8K” is a generic term that refers to any resolution with a horizontal resolution (pixel count) of approximately 8,000 pixels. 8K UHD has a resolution of (7680 x 4320) pixels with a display aspect ratio of (16:9), and it’s the highest resolution defined in the Rec. 2020 (UHDTV) standard.
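It’s worth seeing just how fast the pixel counts grow across the 16:9 resolutions covered in this post; a short Python sketch shows that doubling both dimensions quadruples the total pixel count:

```python
# Total pixel counts for the main 16:9 consumer resolutions in this post.
resolutions = {
    "720p (HD)":   (1280, 720),
    "1080p (FHD)": (1920, 1080),
    "4K (UHD)":    (3840, 2160),
    "8K (UHD)":    (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name:12} {count:>10,} pixels")

# Each step doubles width and height, so the pixel count quadruples:
print(pixels["8K (UHD)"] // pixels["4K (UHD)"])     # 4
print(pixels["8K (UHD)"] // pixels["1080p (FHD)"])  # 16
```

So an 8K panel packs roughly 33 million pixels, sixteen times as many as Full HD; that is why 8K content needs so much more bandwidth and storage than 1080p.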
This display resolution is the successor to 4K, and the first 8K TVs were unveiled in 2019 at the Consumer Electronics Show (CES 2019), an annual trade show organized by the Consumer Technology Association (CTA) that typically hosts presentations of new products and technologies in the consumer electronics industry.
At the moment (2022), only a few cameras can shoot video in 8K, with Japan’s NHK being the first to have created a broadcasting camera with an 8K image sensor. Red Digital Cinema is another company that released 8K cameras, in 2018. Until major content sources become available, 8K adoption will remain limited, though it has been speculated to become a mainstream consumer display resolution by 2023. In the meantime, filmmakers are using 8K capture to produce better 4K footage, which is pushing demand for 8K cameras.
Conclusion;
We should all note that resolution is not the most important factor when it comes to picture quality. Just because one TV has a higher resolution than another, it doesn’t necessarily mean its picture will look better. It might, but not always, because other factors contribute to better picture quality, including better High Dynamic Range (HDR) performance, better overall contrast ratio and better color. A TV with those specifications might look better than one that simply has more pixels.