WHAT IS THE DIFFERENCE BETWEEN 4K, FULL HD, ETC.

Stella

In recent years screens have advanced so quickly that a high-quality picture at Full HD resolution (1920 x 1080 pixels) with a 120 Hz refresh rate no longer surprises anyone. It would seem that nothing more is needed to enjoy viewing. But Full HD is no longer the ultimate dream: the market is already full of models with 4K and Ultra HD resolution.



4K is a designation for the resolution in digital cinema and computer graphics, corresponding to approximately 4000 pixels horizontally.


A quick note about refresh rate, which applies to screens of any resolution. What does it give you? Roughly and simplistically speaking, it is how often the screen redraws the image, and it determines the delay between frames. The lower the refresh rate, the choppier motion appears: frame changes look "ragged". The higher it is, the shorter the pause between frames and the smoother the image looks.
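As a rough, hypothetical illustration (not from the article), the pause between frames is simply the reciprocal of the refresh rate:

```python
def frame_interval_ms(refresh_hz: float) -> float:
    """Time between consecutive frames, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (30, 60, 120, 144):
    print(f"{hz:>3} Hz -> {frame_interval_ms(hz):.2f} ms between frames")
# 30 Hz -> 33.33 ms, 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms
```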


THE DIFFERENCE BETWEEN HD, UHD, 4K AND 8K

Grid of modern video screen resolutions


High-definition television (HDTV) has been the standard for the past decade. Nowadays it is difficult to walk into a store and buy a TV that is not at least HD Ready, meaning "capable of displaying 1280 x 720 pixels (720p)". But the vast majority of modern TVs you can buy are Full HD, meaning "capable of displaying 1920 x 1080 pixels (1080p)".


The letter "p" in both variants means progressive, which means that the entire image draws each line of the frame sequentially, and such lines are 720 or 1080, respectively. An alternative is the letter “i”, which stands for interlaced (1080i is the HDTV standard). Odd and even lines are displayed alternately in each frame, resulting in a slight degradation in image quality.


The term 4K refers to any display format with a horizontal resolution of approximately 4000 pixels. This is a bit confusing, since lower resolutions are named after the number of vertical pixels, e.g. 1080i or 720p.


UHD, or Ultra HD, is almost the same thing as 4K, with one difference: it is the consumer and television variant, and its resolution of 3840 x 2160 pixels is slightly lower than DCI 4K.


In summary:


DCI 4K (Digital Cinema Initiatives) is the professional production standard and the most common format for digital cinema production, at 4096 x 2160 pixels.

UHD-1, often referred to as 4K UHD or simply 4K, and sometimes as 2160p, has become the standard for TVs with a resolution of 3840 x 2160 pixels, which is four times the number of pixels in Full HD (a quick arithmetic check follows this list).

Most modern UHD-1 TVs do not use the wider DCI 4K aspect ratio, as it is not suitable for most TV content.
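A quick arithmetic check of the pixel counts behind these names, using the figures quoted above:

```python
# Pixel counts for the formats named above (dimensions from the text).
full_hd = 1920 * 1080

for name, (w, h) in {
    "Full HD (1080p)": (1920, 1080),
    "UHD-1 / 4K UHD":  (3840, 2160),
    "DCI 4K":          (4096, 2160),
}.items():
    total = w * h
    print(f"{name:<16} {w} x {h} = {total:>10,} px  ({total / full_hd:.2f}x Full HD)")
# UHD-1 works out to exactly 4.00x the pixel count of Full HD.
```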


There is also Full Ultra HD, usually referred to as 8K, with a resolution of 7680 x 4320 pixels. A few TVs on Yandex Market already boast such a screen resolution and an impeccable picture, but it is recommended to choose an 8K TV with a diagonal of at least 85 inches: on a smaller screen you simply will not see all the detail that 8K makes possible.
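As a back-of-the-envelope illustration of why the diagonal matters (my own arithmetic, not a figure from the article), pixel density can be estimated from the resolution and the screen diagonal:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch for a given resolution and screen diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Hypothetical diagonals chosen only for comparison.
for diag in (55, 65, 85):
    print(f'{diag}": 8K = {ppi(7680, 4320, diag):.0f} ppi, '
          f'UHD = {ppi(3840, 2160, diag):.0f} ppi')
```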


4K VS. UHD

The simplest way to tell the difference between 4K and UHD is that 4K is the professional production standard and UHD is the consumer display and broadcast standard. To find out why they are so often confused, let's take a look at the history of the two terms.


The term “4K” originally comes from the DCI (Digital Cinema Initiatives) consortium, which standardizes specifications for digital cinema production. In this case, 4K is a resolution of 4096 by 2160 pixels, which is four times more than the previous digital processing and projection standard (2K or 2048 x 1080). 4K is not only a resolution standard, it also defines how content is encoded. DCI 4K is compressed using JPEG 2000, can be up to 250Mbps and uses 12-bit 4: 4: 4 color depth.

Ultra High Definition, or UHD for short, is the next step up from Full HD, the official name for a 1920 x 1080 display. UHD raises the resolution to 3840 x 2160. This is not quite the same as 4K, since DCI 4K is slightly wider, but nonetheless almost every UHD TV or monitor is advertised as 4K. There are panels with a resolution of 4096 x 2160 and an aspect ratio of 1.9:1, but the vast majority are 3840 x 2160 with a ratio of 1.78:1.
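The quoted aspect ratios follow directly from the pixel dimensions:

```python
for name, (w, h) in {"DCI 4K": (4096, 2160), "UHD": (3840, 2160)}.items():
    print(f"{name}: {w}/{h} = {w / h:.2f}:1")
# DCI 4K: 1.90:1,  UHD: 1.78:1 (i.e. ordinary 16:9)
```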


Why not 2160p?

Of course, manufacturers know the difference between 4K and UHD, but probably for marketing reasons they stick to the term 4K. To avoid conflicting with the real DCI standard, they often use the phrase 4K UHD, although some write just 4K.


To make things even more confusing, UHD is actually split into two tiers: 3840 x 2160 and a significantly higher resolution of 7680 x 4320, which is also called UHD. The terms 4K UHD and 8K UHD are used to distinguish them.

To be more precise, 8K UHD should arguably be called QUHD (Quad Ultra HD). The cleaner solution would be to abandon the 4K name and use the designation 2160p, since broadcast and display standards have always used the vertical pixel count with the letters "i" or "p": 720p, 1080i, 1080p, and so on.


However, it is probably too late to change the naming convention, since 4K TVs are already sold worldwide. The more important issue is the lack of content at such resolutions. So far, only a few services, such as Netflix and Amazon Instant Video, offer it. Without high-quality and affordable video, 4K and UHD are unlikely to gain popularity in the near future.


IS IT WORTH GOING ABOVE FULL HD?

It is worth it only if you have enough 4K or UHD video sources. These can be either video files or providers that offer such content, and it should be noted that such providers are still few.


Apple has already begun adding 4K HDR content to iTunes; it has started appearing in the US and several other countries.
