NVIDIA's monopoly over the AI market is now being questioned by several analysts, who claim that the increasing power consumption of its GPUs won't be sustainable in the future.
We are all well aware of how NVIDIA has played its cards in the AI segment, whether by winning over clients with its hardware arsenal or by attracting immense interest from startups.
Everyone eyes NVIDIA as the primary source of AI hardware, but some are growing worried about power consumption: not only has the firm's share price recently seen the largest drop in its history, but many analysts also view the current approach as unsustainable, and here is why.
Korean media reports that while NVIDIA has seen widespread adoption of its cutting-edge AI accelerators, what concerns analysts is the growing power consumption. While average consumers may find the idea of a data center packed with thousands of NVIDIA accelerators "amusing," the environmental effects are quite the opposite. Team Green's A100 and H100 AI GPUs are known to be power-hungry, and when millions of them are combined, imagine the power they would require.
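As a rough illustration of that scale, here is a minimal back-of-envelope sketch, assuming a purely hypothetical fleet of one million accelerators drawing roughly 700 W each (about the rated board power of an H100 SXM module) and ignoring CPUs, networking, and cooling overhead:

```python
# Back-of-envelope estimate of GPU-only energy use for a hypothetical fleet.
# Assumptions (illustrative, not from the report): 1,000,000 accelerators,
# ~700 W each, running around the clock; cooling, CPUs, and networking
# overhead are excluded.

num_gpus = 1_000_000
watts_per_gpu = 700
hours_per_year = 24 * 365

total_watts = num_gpus * watts_per_gpu               # 700 MW of continuous draw
twh_per_year = total_watts * hours_per_year / 1e12   # watt-hours -> terawatt-hours

print(f"Continuous draw: {total_watts / 1e6:.0f} MW")
print(f"Energy per year: {twh_per_year:.1f} TWh")    # ~6.1 TWh, GPUs alone
```

Even this GPU-only figure lands in the terawatt-hour range before accounting for the rest of the data center, which puts the projections cited below into perspective.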
A semiconductor analyst expects global data center power consumption to climb to 85 to 134 terawatt-hours annually by 2027, comparable to what nations such as the Netherlands, Argentina, and Sweden consume. The analyst believes that if NVIDIA keeps prioritizing such power-hungry GPUs into the future, low-power alternatives will eventually gain traction and could outrun NVIDIA, which is alarming for the company given its established monopoly.
In previous coverage, we disclosed how NVIDIA's existing AI accelerators are rumored to have been consuming up to