Tech giants are racing to ward off a carbon time bomb caused by the massive data centers they're building around the world.
A technique pioneered by Google is gaining currency as more power-hungry artificial intelligence comes online: Using software to hunt for clean electricity in parts of the world with excess sun and wind on the grid, then ramping up data center operations there. Doing so could cut carbon and costs.
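The load-shifting idea can be sketched in a few lines: sample each grid's current carbon intensity and route flexible batch workloads to the cleanest one. A minimal sketch, assuming hypothetical region names, intensity figures, and a `pick_greenest_region` helper — not any provider's actual API:

```python
# Carbon-aware load shifting, illustrative only: choose the region whose
# grid currently has the lowest carbon intensity (gCO2 per kWh), then
# schedule deferrable data center work there.

def pick_greenest_region(intensity_by_region):
    """Return the region whose grid has the lowest carbon intensity."""
    return min(intensity_by_region, key=intensity_by_region.get)

# Illustrative snapshot: excess wind/hydro in one grid, coal-heavy mix
# in another. These numbers are made up for the example.
snapshot = {
    "eu-north": 45,   # wind- and hydro-heavy grid
    "us-east": 390,   # mixed fossil grid
    "ap-south": 610,  # coal-heavy grid
}

print(pick_greenest_region(snapshot))  # -> eu-north
```

In practice a scheduler would poll live grid data and re-evaluate as sun and wind output shifts through the day, which is what makes the savings in carbon and cost possible.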
There's an urgent need to figure out how to run data centers in ways that maximize renewable energy usage, said Chris Noble, co-founder and chief executive officer of Cirrus Nexus, a cloud-computing manager tapping data centers owned by Google, Microsoft and Amazon.
The climate risks sparked by AI-driven computing are far-reaching — and will worsen without a big shift from fossil fuel-based electricity to clean power. Nvidia Corp. Chief Executive Officer Jensen Huang has said AI has hit a “tipping point.” He has also said that the cost of data centers will double within five years to power the rise of new software.
Already, data centers and transmission networks each account for up to 1.5% of global electricity consumption, according to the International Energy Agency. Together, they're responsible for emitting about as much carbon dioxide as Brazil does annually.
Hyperscalers — as the biggest data center owners like Google, Microsoft and Amazon are known — have all set climate goals and are facing internal and external pressure to deliver on them. Those lofty targets include decarbonizing their operations.
But the rise of AI is already wreaking havoc on those goals. Graphics processing units, which have been key to the rise of large language models, use more electricity than the central processing units that power other forms of computing. Training a single AI model uses more power than 100 households consume in a year, according to IEA estimates.
“The growth in AI is far outstripping the ability to produce clean power for it,” Noble said.
Moreover, AI's energy consumption is volatile.