Are you ready for yet another battlefield in the war between AMD and Nvidia? Well, it turns out you can get very different results when driving certain HDR monitors depending on whether you are using AMD or Nvidia graphics hardware.
As Monitors Unboxed explains, the reasons for this are complex. It's not necessarily that AMD or Nvidia is better. But it certainly adds a layer of complexity to the whole "what graphics card do I buy?" conundrum. As if things weren't already complicated enough, what with ray tracing, FSR versus DLSS, and all the other stuff you have to weigh up when choosing a new graphics card.
The investigation here centres on the Alienware 34 AW3423DWF. That's the slightly cheaper version of Alienware's 34-inch OLED gaming monitor which ditches Nvidia's G-Sync tech for more generic adaptive refresh and AMD FreeSync.
The G-Sync-equipped non-F Alienware 34 AW3423DW actually performs differently with HDR, which gives you an idea of how complicated this can all get. Anyway, the problem involves the AW3423DWF's performance when using the HDR 1000 mode.
That's the mode you need to use to achieve Alienware's claimed peak brightness of 1000 nits, as opposed to the True Black HDR mode, which tops out at just 400 nits. By default, the HDR 1000 mode simply ramps up the brightness of everything on screen.
That's not actually ideal. Instead, HDR 1000 should increase the brightness of only the brightest objects and leave alone darker content that's intended to sit below 400 nits. That's the point of HDR: to increase the contrast between bright and dark objects, not just ramp up the overall brightness.
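To make the distinction concrete, here's a minimal Python sketch of the two behaviours described above. It is purely illustrative and not based on Monitors Unboxed's measurements or any actual monitor or driver code; the boost factor, the 400-nit cutoff used as a "leave it alone" threshold, and the function names are all assumptions for the sake of the example.

```python
# Illustrative only: contrasts "ramp up everything" with "boost highlights only".
# Values and names are assumptions, not vendor behaviour.

PEAK = 1000.0        # claimed peak brightness of the HDR 1000 mode, in nits
MIDTONE_CAP = 400.0  # level below which content should ideally be left untouched

def ramped_everything(content_nits: float, boost: float = 1.5) -> float:
    """The problematic behaviour: all content gets brighter, so dark scenes
    lose their intended look. The boost factor is arbitrary."""
    return min(content_nits * boost, PEAK)

def highlights_only(content_nits: float) -> float:
    """The intended HDR behaviour: anything mastered below 400 nits is shown
    as-is, while brighter highlights are allowed to reach the 1000-nit peak."""
    if content_nits <= MIDTONE_CAP:
        return content_nits
    return min(content_nits, PEAK)

for nits in (50, 250, 600, 1200):
    print(f"{nits:>5} nits -> ramped: {ramped_everything(nits):6.1f}, "
          f"highlights only: {highlights_only(nits):6.1f}")
```

Running it shows the point of the complaint: a dark scene element mastered at 50 or 250 nits is displayed noticeably brighter under the "ramp everything" approach, while the highlight-only mapping preserves it and spends the extra headroom on genuine highlights.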
The difference between AMD and Nvidia GPUs comes when you attempt…
Read more on pcgamer.com