The Definition of High Definition

You're probably all too familiar with high definition jargon: HD, 4K, high resolution, 1080p, pixel scaling -- the list goes on. These terms are always thrown at us when shopping for a TV, but what do they actually mean? What's the science behind high definition?

Hi Def Means Hi Res

It's important to understand that high definition has everything to do with resolution. Resolution is a measure of the number of pixels displayed on the screen. A resolution of 640 x 480 means the display has 640 pixels from left to right and 480 pixels from top to bottom, for a total of 307,200 viewable pixels. High definition typically starts at 1280 x 720, also known as 720p. Resolutions below that are considered standard definition. Full HD refers to 1080p, or 1920 x 1080, which was the highest widely available resolution for years, until TV manufacturers developed Ultra HD, or 4K, which we'll discuss later on.
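
If you want to check the math yourself, here's a quick Python sketch (my own illustration, not anything built into a TV) that totals up the pixels for a few common resolutions:

```python
# Total viewable pixels for some common display resolutions.
resolutions = {
    "SD (640 x 480)": (640, 480),
    "HD / 720p (1280 x 720)": (1280, 720),
    "Full HD / 1080p (1920 x 1080)": (1920, 1080),
    "Ultra HD / 4K (3840 x 2160)": (3840, 2160),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width * height:,} pixels")
# 640 x 480 works out to 307,200 pixels, matching the figure above;
# 1920 x 1080 is 2,073,600, and 3840 x 2160 is 8,294,400 (4x Full HD).
```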

1080i vs 1080p and Refresh Rate

1080i and 1080p both refer to a resolution of 1920 x 1080, so what's the difference? The 'i' stands for interlaced scan and the 'p' stands for progressive scan, which are two different ways of refreshing the pixels in the video image. With interlaced scan, the screen refreshes half of the pixels (the even-numbered lines) in one pass and the other half (the odd-numbered lines) in the next. This means half of the pixels are always one field out of date (about 1/60 of a second on a 60 Hz display), which can cause a subtle blurring or combing effect during motion. Progressive scan refreshes all of the pixels at once, making it the preferred option over interlaced scan.
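
Here's a rough way to picture the difference in code (a simplified sketch of the idea, not how a real display driver actually works):

```python
def progressive_refresh(frame_lines, new_lines):
    """Progressive scan: update every line of the picture on each refresh."""
    frame_lines[:] = new_lines

def interlaced_refresh(frame_lines, new_lines, field):
    """Interlaced scan: update only the even or only the odd lines per field."""
    start = 0 if field == "even" else 1
    for i in range(start, len(frame_lines), 2):
        frame_lines[i] = new_lines[i]

# Toy 6-line "image": one interlaced field leaves half the lines stale.
frame = ["old"] * 6
interlaced_refresh(frame, ["new"] * 6, field="even")
print(frame)   # ['new', 'old', 'new', 'old', 'new', 'old']
# The odd lines stay out of date until the next field arrives, which is
# what causes the subtle blur/combing on fast motion.

progressive_refresh(frame, ["new"] * 6)
print(frame)   # every line updated at once
```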

Video signals can vary not only in how they are refreshed, but also in how often they are refreshed, which is referred to as the refresh rate and is measured in hertz (Hz). The Hz figure is the number of times the pixels are refreshed per second, so a signal labeled 1080p30 has a refresh rate of 30 Hz, or 30 refreshes every second.
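
For example, here's the same idea worked out in a couple of lines of Python (again, just an illustration):

```python
width, height, refresh_hz = 1920, 1080, 30    # "1080p30"

frame_interval_ms = 1000 / refresh_hz             # time between refreshes
pixel_updates_per_sec = width * height * refresh_hz

print(f"Frame interval: {frame_interval_ms:.1f} ms")           # 33.3 ms
print(f"Pixel updates per second: {pixel_updates_per_sec:,}")  # 62,208,000
```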

Interfaces

The resolution you're able to achieve depends on the interface. There are four interfaces widely used by consumers: Component, VGA, DVI, and HDMI.

Component: This interface is largely becoming obsolete, but it's still found on some equipment. Component splits the analog video signal across three RCA connectors, color-coded green, blue, and red (in consumer gear these carry one luminance signal and two color-difference signals rather than literal red, green, and blue). Component can achieve resolutions up to 1080p, but most modern HDTVs no longer include component inputs.

VGA: VGA, or Video Graphics Array, is another analog interface, using a single connector with 15 pins. It has been used in computers since 1987 and still offers some advantages over digital interfaces, such as tolerating longer cable runs. Later extensions of the VGA family, such as QXGA, can carry resolutions as high as 2048 x 1536.

DVI: DVI, or Digital Visual Interface, carries a digital signal rather than an analog one, transmitting discrete color and brightness values for each individual pixel instead of a continuous waveform. DVI is a video-only interface, meaning it does not transmit audio at all.

HDMI: HDMI, or High-Definition Multimedia Interface, is probably the most commonly used interface today. It functions similarly to DVI but also carries audio, making it a more versatile connector. Advancements in HDMI v1.3 allow for clock rates of 340 MHz, Deep Color support, and high-definition audio.
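
As a rough back-of-the-envelope calculation (my own figures for the link details: HDMI carries video on three TMDS data channels, with each 8-bit value encoded as a 10-bit symbol), that 340 MHz clock translates into bandwidth like this:

```python
tmds_clock_hz = 340e6   # HDMI 1.3 maximum TMDS clock
channels = 3            # video travels on three TMDS data channels
bits_per_clock = 10     # each 8-bit value is sent as a 10-bit TMDS symbol

raw_bandwidth = tmds_clock_hz * channels * bits_per_clock   # 10.2 Gbit/s
video_payload = tmds_clock_hz * channels * 8                # 8.16 Gbit/s

print(f"Raw link bandwidth: {raw_bandwidth / 1e9:.2f} Gbit/s")
print(f"Video payload:      {video_payload / 1e9:.2f} Gbit/s")
```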

Beyond Full HD

For a while, Full HD was the practical ceiling on resolution for hi-def displays. However, there are other ways to produce a more detailed and lifelike image. One of them is frame rate. Film and most cinematic video is traditionally recorded at 24 fps, or frames per second. Higher frame rates such as 48 fps produce noticeably smoother motion.
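
To put numbers on that (my own quick arithmetic), here's how long each frame stays on screen at a few common frame rates:

```python
for fps in (24, 48, 60):
    print(f"{fps} fps -> each frame lasts {1000 / fps:.1f} ms")
# 24 fps -> each frame lasts 41.7 ms
# 48 fps -> each frame lasts 20.8 ms
# 60 fps -> each frame lasts 16.7 ms
```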

We have also developed Ultra HD resolutions, the most common being 4K. Consumer 4K (3840 x 2160) has roughly 4,000 pixels across, which works out to four times the total pixel count of Full HD. Using 4K at this point is a bit more complicated than Full HD for a number of reasons. First, the screen has to be big enough, and you have to sit close enough, for 4K resolution to even be noticeable. A 42-inch 1080p screen is already considered "retina" quality when viewed from 10 feet away, meaning the eye can no longer distinguish its individual pixels at that distance. So 4K won't look perceptibly different from Full HD unless the screen is large enough and you're seated close enough. Additionally, only some media sources offer 4K content, so when you buy a 4K TV, you may still be watching upscaled 1080p (or lower) quality much of the time, depending on the source.
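
To see why, here's an illustrative calculation (my own sketch, using the common rule of thumb that the eye resolves detail down to about one arcminute, or 1/60 of a degree):

```python
import math

def pixel_arcminutes(diagonal_in, horiz_px, vert_px, distance_in):
    """Angular size of one pixel, in arcminutes, as seen by the viewer."""
    aspect = horiz_px / vert_px
    width_in = diagonal_in * aspect / math.sqrt(aspect**2 + 1)
    pixel_pitch_in = width_in / horiz_px
    angle_rad = 2 * math.atan(pixel_pitch_in / (2 * distance_in))
    return math.degrees(angle_rad) * 60

# 42-inch screens viewed from 10 feet (120 inches)
print(f"1080p: {pixel_arcminutes(42, 1920, 1080, 120):.2f} arcmin")  # ~0.55
print(f"4K:    {pixel_arcminutes(42, 3840, 2160, 120):.2f} arcmin")  # ~0.27
# Both are already below the ~1 arcminute limit of normal vision at this
# distance, so the extra 4K pixels can't be individually resolved.
```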

Pixel Scaling

Every video is produced at a specific resolution, and every screen is built with a specific native resolution. Pixel scaling occurs when the video's resolution doesn't match the TV's native resolution. A standard-definition video, for example, is enlarged to fill the screen of an HDTV; otherwise it would occupy only a small portion of the display. Virtually every computer, television, or other device with a visual display performs pixel scaling. Scaling itself only resizes the image; some video processors then apply more complex algorithms to the video signal to sharpen the image after rescaling.
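
To give a sense of what the simplest form of scaling looks like, here's a minimal nearest-neighbor upscaler in Python (a bare-bones sketch of the idea; real TV scalers use smarter filters such as bilinear or bicubic interpolation):

```python
def nearest_neighbor_scale(image, new_width, new_height):
    """Resize a 2D list of pixel values by repeating the closest source pixel."""
    src_height, src_width = len(image), len(image[0])
    scaled = []
    for y in range(new_height):
        src_y = y * src_height // new_height    # nearest source row
        row = []
        for x in range(new_width):
            src_x = x * src_width // new_width  # nearest source column
            row.append(image[src_y][src_x])
        scaled.append(row)
    return scaled

# A 2 x 2 "image" blown up to 4 x 4: each source pixel becomes a 2 x 2 block.
tiny = [[1, 2],
        [3, 4]]
for row in nearest_neighbor_scale(tiny, 4, 4):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```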