Some of 2023's most anticipated PC games have had their fair share of troubles. Hogwarts Legacy and The Last of Us Part 1 are just two that ran horribly on cards with insufficient VRAM. It seems like a problem that's here to stay. Even if you have enough graphics memory right now, will it be enough to handle the demands of games one, two, or three years from now?
There is some good news on the horizon thanks to Nvidia. It's working on a new compression technology it calls Neural Texture Compression, and like most of the tech coming out of Nvidia these days, it's thanks to AI.
According to Nvidia (via Hot Hardware), the new method allows material textures to store up to 16x more data in the same space as traditional block-based compression methods. This should allow developers to shrink the size of textures without any loss of quality. That means less need for huge amounts of graphics memory, which sounds good to me.
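To put that 16x figure in perspective, here's a back-of-the-envelope sketch of what it could mean for VRAM budgets. The bit rates below are illustrative assumptions (BC7 block compression at 8 bits per pixel is a common baseline), not figures taken from Nvidia's paper:

```python
# Rough illustration of what a 16x density gain could mean for texture
# memory. The bit rates are assumptions for the sake of the arithmetic,
# not numbers from Nvidia's paper.

def texture_size_mb(width, height, bits_per_pixel):
    """Size of a single texture layer in megabytes."""
    return width * height * bits_per_pixel / 8 / (1024 ** 2)

# A 4K material texture stored with BC7 block compression (8 bits/pixel).
block_compressed = texture_size_mb(4096, 4096, 8)

# The same texture if NTC packs 16x more data into the same space --
# an effective 0.5 bits/pixel for equivalent detail (assumption).
ntc_assumed = texture_size_mb(4096, 4096, 8 / 16)

print(f"BC7: {block_compressed:.1f} MB, NTC (assumed): {ntc_assumed:.1f} MB")
```

On those assumptions a single 4K texture layer drops from 16 MB to 1 MB, and modern materials stack many such layers, which is where the VRAM savings would add up.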
Nvidia claims the NTC algorithm offers superior image quality compared to modern algorithms including AVIF and JPEG XL. NTC can make use of general-purpose GPU hardware and the Tensor cores of current-gen Nvidia hardware, and can do so in real time. AVIF and JPEG XL, by contrast, require dedicated hardware and aren't designed for real-time decompression.
It's important to emphasize that this is a nascent technology.
Read more on pcgamer.com