Variable Refresh Rate (VRR) is a great feature that prevents the screen tearing and stutter caused by unstable frame rates in games. It has made it into the latest consoles, but for some players, turning it on actually makes the picture worse. Why?
We’ve written an in-depth explanation of how VRR works, but the short version is simple to understand: Your TV or monitor has a refresh rate. The most common refresh rate is 60Hz, which means the screen can display 60 unique frames of video every second. If you’re watching a video, the frame rate is fixed and pre-recorded. A 60Hz display that receives 30 frames per second of video can show them perfectly by displaying each frame twice in a row. Movies shot at 24 frames per second don’t divide evenly into 60Hz, but since it’s such a common cinematic frame rate, all televisions have some way of dealing with that content, with varying levels of success.
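To make that arithmetic concrete, here’s a quick Python sketch (our own illustration, not taken from the VRR explainer mentioned above) that simply divides a 60Hz refresh rate by the two frame rates in question:

```python
# Illustration: how many 60Hz refresh cycles each source frame occupies.
# A whole number means smooth playback; a fraction means some frames have
# to be held on screen longer than others.
REFRESH_HZ = 60

for source_fps in (30, 24):
    cycles_per_frame = REFRESH_HZ / source_fps
    print(f"{source_fps} fps -> each frame covers {cycles_per_frame} refresh cycles")

# 30 fps -> each frame covers 2.0 refresh cycles  (every frame shown exactly twice)
# 24 fps -> each frame covers 2.5 refresh cycles  (frames alternate between 2 and 3
#                                                  cycles, the familiar 3:2 cadence)
```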
Video games are very different from fixed video content. The GPU in the console or PC isn’t under a consistent load. For example, when there are lots of explosions and heavy effects on-screen, the GPU might only manage 40 frames per second, and that mismatch with the display’s fixed refresh rate can lead to all sorts of visual artifacts or choppy motion.
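Here’s a rough sketch of why that hurts (purely illustrative timings, not measurements from any real console): a steady 40fps feed shown on a fixed 60Hz display with v-sync enabled.

```python
# Illustration only: a steady 40 fps feed on a fixed 60Hz display with v-sync.
# Each frame appears at the first refresh after it finishes rendering, so
# frames end up on screen for an uneven mix of one and two refresh cycles.
REFRESH_MS = 1000 / 60   # ~16.7 ms between refreshes
FRAME_MS = 1000 / 40     # 25 ms to render each frame

next_refresh = 0.0
for frame in range(6):
    ready = frame * FRAME_MS
    while next_refresh < ready:          # wait for the next refresh tick
        next_refresh += REFRESH_MS
    print(f"frame {frame}: ready at {ready:5.1f} ms, shown at {next_refresh:5.1f} ms")
```

Frames alternate between two refresh cycles and one on screen, and that uneven cadence is exactly what reads as stutter.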
VRR technology lets the gaming system talk to the display and vary the refresh rate to match the number of frames the GPU is actually producing. There are several standards: HDMI VRR, NVIDIA G-SYNC, and AMD FreeSync. Both the source device and the display have to support the same standard for it to work, and each technology only operates within a specific refresh-rate range. If the frame rate drops below the bottom of that range, the display can no longer match it frame for frame, and tearing or stutter can creep back in.
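As a rough sketch of that idea in Python (the 48 to 120Hz window is just an assumed example, and real panels and standards differ in how they handle the out-of-range cases):

```python
# Assumed example VRR window; real displays advertise their own ranges.
VRR_MIN_HZ = 48
VRR_MAX_HZ = 120

def refresh_for(fps: float) -> float:
    """Pick the panel refresh rate for a given GPU frame rate."""
    if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
        return fps                # the panel refreshes in step with the GPU
    if fps > VRR_MAX_HZ:
        return VRR_MAX_HZ         # capped at the panel's maximum
    # Below the window, VRR alone can't follow the GPU. Some implementations
    # repeat frames (low framerate compensation); others fall back to a fixed rate.
    return VRR_MIN_HZ

for fps in (40, 60, 90, 130):
    print(f"{fps} fps -> panel refreshes at {refresh_for(fps)} Hz")
```

Inside the window the panel simply follows the GPU frame for frame; outside it, the behavior depends on the display and the standard in use.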
The PlayStation 5 and Xbox Series consoles support HDMI 2.1, which includes VRR as part of the specification.