Refresh Rate: Everything You Need to Know (And More)

If you're shopping for a TV or a projector for your living room or a home theatre setup, you're probably aware that refresh rate is an important specification to look at. Those who game on a console or monitor also tend to pay close attention to refresh rate. Outside of definition, refresh rate probably has the biggest impact on the viewing experience a device can deliver. So, what does refresh rate mean, and how is it different from frame rate? What constitutes a "good" refresh rate?


What is Refresh Rate?

Refresh rate refers to the frequency at which a display (whether it be a TV, monitor, or projector) is able to refresh or update the pixels being displayed. When your TV receives video from a media source, refresh rate determines how quickly the screen displays each frame.

When it comes to the visual experience, while definition affects picture clarity, refresh rate affects the fluidity of motion. If the display is updated more times per second, then the transition between frames is going to look smoother and not so choppy.

Refresh rate is commonly confused with frame rate, which describes the quality of video content rather than the capabilities of the display. Frame rate refers to the number of frames that are sent to a display, and in the case of gaming, the number of frames that are generated by a GPU.

Refresh rate is measured in Hz, whereas frame rate is measured in FPS, or frames per second. The most common frame rates are 24 fps, 30 fps, and 60 fps. The standard for cinema is 24 fps, so if your home theatre is strictly for movie-watching, your refresh rate needs won't be very high. Higher frame rates are common for other online content, and when it comes to modern gaming, 60 fps has quickly become the standard.

Similarly to FPS, Hz describes the number of times the display is updated per second. So, a 30 Hz refresh rate means the screen refreshes 30 times every second. Common refresh rates on TVs and projectors are 60 Hz, 120 Hz, 144 Hz, and 240 Hz. So, how do these two measurements interact?

Well, you will only be able to see as many frames per second as your TV or projector can display. So, if you're watching 60 fps content on a screen with a 30 Hz refresh rate, you will effectively be watching 30 fps. The most important thing to note about refresh rate is that it needs to be equal to or higher than the frame rate of whatever you're viewing to get the full visual experience.
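This rule is simple enough to write down as a tiny Python sketch (illustrative only; this is just the arithmetic, not how any display actually negotiates with a source):

```python
def effective_fps(content_fps: int, refresh_hz: int) -> int:
    """The display can only show as many frames per second as it can refresh,
    so the frame rate you effectively see is the smaller of the two numbers."""
    return min(content_fps, refresh_hz)

# 60 fps content on a 30 Hz display is effectively 30 fps.
print(effective_fps(60, 30))   # → 30
# 24 fps cinema on a 120 Hz display still shows all 24 frames.
print(effective_fps(24, 120))  # → 24
```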

The next important interaction to understand is the divisibility of frame rate and refresh rate. You may have noticed that many of the common refresh rates are evenly divisible by common frame rates -- and that is for a good reason. If a display's refresh rate doesn't line up with the frame rate, new frames can arrive partway through a refresh, resulting in an artefact called screen tearing, in which the screen displays information from two or more frames simultaneously.

While it's not harmful to your device, screen tearing makes the image look torn, distorted, and choppy. We don't want that. So, you always want to be viewing content on a display whose refresh rate is evenly divisible by the content's frame rate (e.g. 30 fps @ 60 Hz, or 60 fps @ 60 or 120 Hz).
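The divisibility rule amounts to a one-line check, shown here in Python for illustration with the common rates mentioned above:

```python
def divides_evenly(content_fps: int, refresh_hz: int) -> bool:
    """True if each frame can be held on screen for a whole number
    of refresh cycles, which is what avoids tearing."""
    return refresh_hz % content_fps == 0

print(divides_evenly(30, 60))   # → True: each frame shown for exactly 2 refreshes
print(divides_evenly(60, 120))  # → True: each frame shown for exactly 2 refreshes
print(divides_evenly(24, 60))   # → False: 60 / 24 = 2.5 refreshes per frame
```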

Although screen tearing does occur, most people rarely have to deal with it. Luckily, modern technology lets us view content at many different frame rates without worrying about compatibility with our display's refresh rate. This tech is called VSync, or vertical synchronisation. VSync caps a video's frame rate to one that is compatible with the display's refresh rate, ensuring that no screen tearing occurs.

For example, a video of 50 fps may be capped to 30 fps for a 60 Hz display. It's not typically an issue for most media; VSync becomes more important for gaming at high frame rates. Games running above 60 fps are often capped to 60 for monitors with 60 Hz refresh rates. VSync may be automatically enabled on your TV or monitor, or you may have to enable it manually in the display settings.
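One way to model this capping behaviour is sketched below in Python. Real VSync works by synchronising buffer swaps with the display's refresh rather than computing anything like this, but the resulting cap follows the same pattern: the largest rate that divides evenly into the refresh rate without exceeding the content's frame rate.

```python
def vsync_cap(content_fps: int, refresh_hz: int) -> int:
    """Find the largest frame rate that divides evenly into the refresh
    rate and does not exceed the content's own frame rate."""
    for candidate in range(min(content_fps, refresh_hz), 0, -1):
        if refresh_hz % candidate == 0:
            return candidate
    return 1  # unreachable for positive inputs; kept for safety

print(vsync_cap(50, 60))   # → 30, matching the 50 fps example above
print(vsync_cap(120, 60))  # → 60: frames above the refresh rate are dropped
```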

A related artefact occurs when watching 24 fps cinema on a 60 Hz display. Because 60 is not evenly divisible by 24, the display has to hold some frames on screen longer than others, making motion look subtly uneven. This effect is referred to as "judder", and since 60 Hz updates so frequently compared to 24 fps, it's often hardly noticeable. To combat judder, some movies are converted to 60 fps when released online to be more compatible with display devices.
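The uneven frame-holding behind judder can be sketched in Python. This is a simplified model (real displays use dedicated pulldown circuitry), but it shows why 24 fps on 60 Hz is bumpy while 30 fps on 60 Hz is perfectly even:

```python
def pulldown_cadence(content_fps: int, refresh_hz: int) -> list[int]:
    """How many refresh cycles each source frame is held for, spreading
    the display's refreshes as evenly as possible across the frames."""
    cadence = []
    shown = 0
    for frame in range(1, content_fps + 1):
        target = frame * refresh_hz // content_fps
        cadence.append(target - shown)
        shown = target
    return cadence

# 24 fps on 60 Hz alternates between holding a frame for 2 and 3 refreshes
# (the classic "3:2 pulldown"); the uneven hold times are what cause judder.
print(pulldown_cadence(24, 60)[:6])  # → [2, 3, 2, 3, 2, 3]
# 30 fps on 60 Hz holds every frame for exactly 2 refreshes: no judder.
print(pulldown_cadence(30, 60)[:6])  # → [2, 2, 2, 2, 2, 2]
```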

More recently, monitor synchronisation technologies such as FreeSync and G-Sync have been released. Rather than simply capping the frame rate, these allow for a variable refresh rate, giving greater compatibility with content at high frame rates on displays that refresh at 144 or 165 Hz. G-Sync, released by NVIDIA, uses a special chip located within the display that communicates with the computer's graphics card. This effectively lets the display refresh as soon as it receives the next frame, leaving no room for de-synchronisation across a wide range of frame rates.

FreeSync is AMD's version, and doesn't require a dedicated chip within the display device. A FreeSync-enabled AMD Radeon driver relies on the Adaptive-Sync spec in DisplayPort connections, an industry standard, to communicate with the display's firmware. FreeSync monitors tend to be more affordable, but can occasionally cause an effect known as "ghosting", so some gamers prefer G-Sync.
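As a toy model of the difference (Python, with made-up timings): on a fixed-rate display, a finished frame has to wait for the next scheduled refresh, while a variable-refresh-rate display can show it the moment it arrives.

```python
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.7 ms between fixed refreshes

def shown_at_fixed(frame_ready_ms: float) -> float:
    """On a fixed-rate display, a frame waits for the next scheduled refresh."""
    return math.ceil(frame_ready_ms / INTERVAL_MS) * INTERVAL_MS

def shown_at_vrr(frame_ready_ms: float) -> float:
    """With a variable refresh rate, the display refreshes on arrival."""
    return frame_ready_ms

# A frame finished rendering at 21 ms waits until the 33.3 ms refresh on a
# fixed 60 Hz panel, but is shown immediately on a VRR display.
print(round(shown_at_fixed(21), 1))  # → 33.3
print(shown_at_vrr(21))              # → 21
```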

If you're just here to learn what refresh rate is, that last bit was probably just a whirlwind of confusing information. Variable refresh rates largely only apply to gamers, so if you just plan on streaming, you don't have to worry about any of that *nonsense*.

What makes a "good" refresh rate?

At this point, you likely understand what refresh rate you need depending on the content you plan on viewing. For casual streamers and cinema-lovers, 60 Hz and 120 Hz are acceptable and reliable refresh rates. As such, almost all QHD and 4K displays have refresh rates of 60 Hz or higher.

One would think that a refresh rate equal to the frame rate would result in completely smooth video, since the display is updating at the same frequency as frames are being shown. This is technically true, and many consumers don't need a refresh rate any higher than 60 Hz. When it comes to gaming, however, some people see a difference between 60 Hz and higher refresh rates even when playing games at 60 fps. Some say higher refresh rates make motion buttery smooth, while others say they can't tell the difference.

Additionally, it's important to note the relationship between definition and frame rate. As QHD has nearly double, and 4K four times, the pixels of standard Full HD, running these definitions at high frame rates becomes increasingly taxing on a device's hardware. Specifically, when running games at high definitions like QHD or 4K, the GPU may not be able to deliver a steady 60 frames per second, even on a display capable of 60 Hz. When targeting 60 fps, this means screen tearing or being locked to 30 fps by VSync. A safer bet for consistent UHD gaming at 60 fps is a 120 Hz display or using G-Sync.
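To put concrete numbers on those pixel counts, here's a quick Python illustration (the resolutions are the standard ones for each name):

```python
# Standard pixel dimensions for each common definition.
resolutions = {
    "Full HD": (1920, 1080),
    "QHD":     (2560, 1440),
    "4K UHD":  (3840, 2160),
}

full_hd_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd_pixels:.2f}x Full HD)")
# Full HD: 2,073,600 pixels (1.00x Full HD)
# QHD: 3,686,400 pixels (1.78x Full HD)
# 4K UHD: 8,294,400 pixels (4.00x Full HD)
```

Every frame at 4K carries four times the pixels of a Full HD frame, so doubling the frame rate on top of that quadruples the pixel throughput the hardware must sustain compared to Full HD at the original rate.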

Many games are also capable of running at frame rates higher than 60 fps, such as 120 fps or even 240 fps. When it comes to first-person shooters played against other people in real time, a higher frame rate can be noticeable and can technically give you an advantage over other players. If you're looking for a high-performance gaming monitor or TV, a refresh rate higher than 60 Hz might be worth considering, provided your GPU can produce a correspondingly high frame rate.