The worst-kept secret that is the Nvidia GeForce RTX 4070 is just around the corner. So, here comes AMD trying to get its punches in first by trash talking the new GPU's VRAM allocation. Not overtly, mind, but by implication. And graphs.
AMD has a new blog post demonstrating how it sees the impact of 12GB of VRAM versus 16GB when gaming at 4K. Needless to say, AMD's published numbers for 12GB look ugly (via Tom's Hardware).
Among the GPUs AMD uses to make the comparison are its own RX 6800 XT and Nvidia's RTX 3070 Ti. The former runs 16GB, the latter just 8GB. But with the RTX 4070 supposedly days away from launch, and every leak under the sun expecting it too to offer 12GB of graphics memory, the timing of this critical salvo from AMD is unlikely to be a coincidence.
Intriguingly, and as is increasingly becoming clear, AMD shows how enabling ray tracing often results in a big jump in VRAM usage. The ironic consequence of that? Enabling what is traditionally a competitive advantage for Nvidia can push VRAM usage beyond 12GB and turn the tables, leading AMD GPUs to outperform Nvidia GPUs with ray tracing enabled.
Just to absolutely hammer home the point, AMD also shows advantageous numbers for its new 20GB RX 7900 XT versus the 12GB RTX 4070 Ti, plus a few other GPU combos where the AMD board delivers a lot more VRAM for the money.
The joker in all of this is, of course, upscaling. Rendering at a lower resolution and upscaling reduces VRAM usage, and if Nvidia's DLSS upscaling tech wasn't already very important, it could become absolutely critical in the near future, and not just for the RTX 4070.