HDMI 2.1 is becoming more common on TVs and monitors as users take advantage of higher resolutions, faster frame rates, and HDR content. So what's all the fuss about, and is it time to upgrade?
Most displays on the market currently support the HDMI 2.0b standard, which has a bandwidth cap of 18Gbps. That's enough to carry an uncompressed 4K signal at 60 frames per second in up to 8-bit color. This is adequate for the vast majority of uses, including watching UHD Blu-rays or playing games on a current or last-generation Xbox or PlayStation console.
The latest HDMI 2.1b standard raises that bandwidth cap to 48Gbps, which adds support for an 8K signal at 60 frames per second in 12-bit color (relying on 4:2:0 chroma subsampling to stay within that limit). Using Display Stream Compression (DSC), HDMI 2.1b can push a 10K signal at 120 frames per second in 12-bit color.
Some monitors and TVs that support HDMI 2.1 use ports capped at around 40Gbps. That's still enough to handle a 4K signal at 120 frames per second in 10-bit color with HDR, and enough to take full advantage of the 10-bit panels found in consumer-grade TVs.
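If you want to sanity-check those figures yourself, here's a rough back-of-the-envelope sketch in Python (the raw_rate_gbps helper is our own, not part of any HDMI tooling). It simply multiplies pixel count by frame rate and bit depth to get a raw data rate, ignoring blanking intervals and link-layer encoding overhead, so real signals need somewhat more headroom than these numbers suggest.

def raw_rate_gbps(width, height, fps, bits_per_channel, chroma="4:4:4"):
    # 4:4:4 carries three full-resolution color channels per pixel;
    # 4:2:0 subsampling averages out to 1.5 channels per pixel.
    channels = 3.0 if chroma == "4:4:4" else 1.5
    return width * height * fps * bits_per_channel * channels / 1e9

# 4K60 in 8-bit color: ~11.9Gbps, comfortably inside HDMI 2.0b's 18Gbps cap
print(raw_rate_gbps(3840, 2160, 60, 8))

# 4K120 in 10-bit color: ~29.9Gbps, fits a 40Gbps HDMI 2.1 port
print(raw_rate_gbps(3840, 2160, 120, 10))

# 8K60 in 12-bit color with 4:2:0 subsampling: ~35.8Gbps, inside 48Gbps
print(raw_rate_gbps(7680, 4320, 60, 12, chroma="4:2:0"))

Run 8K60 through the same math at full 4:4:4 12-bit color and you get roughly 72Gbps, well beyond even 48Gbps, which is why the highest-end modes lean on chroma subsampling or DSC.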
HDMI 2.1 is aimed mostly at gamers who own current-generation consoles and graphics cards. To take advantage of a 4K signal at 120Hz, you'll need a TV that supports the standard.
Both the Xbox Series X and PlayStation 5 support 4K output at up to 120 frames per second with HDR. NVIDIA's RTX 30 and 40 series graphics cards feature HDMI 2.1 support, as do AMD's Radeon RX 6000 and 7000 series GPUs. Intel's Arc GPUs have spotty HDMI 2.1 support that varies from card to card.
If your TV doesn't support HDMI 2.1, you'll have to make do with a 4K signal running at only(!) 60 frames per second. The majority of titles in the last console generation ran at 30 frames per second, and while many current-generation Xbox Series X and PlayStation 5 games offer performance modes that target 60fps, support for 120fps remains rare.