It's easy to slate AI in all its manifestations—trust me, I should know, I do so often enough—but some recent research from Epoch AI (via TechCrunch) suggests we might be a little hasty if we're trashing its energy use (yes, that's the same Epoch AI that recently dropped a new, difficult math benchmark for AI). According to Epoch AI, a typical ChatGPT query likely consumes just 0.3 Wh of electricity, "10 times less" than the popular older estimate of about 3 Wh.
Given that a Google search amounts to about 0.0003 kWh (0.3 Wh) of energy consumption per search, and based on the older 3 Wh estimate, Alphabet Chairman John Hennessy said two years ago that an LLM exchange would probably cost 10 times more energy than a Google search. If Epoch AI's new estimate is correct, a typical GPT-4o interaction actually consumes about the same amount of energy as a Google search.
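The comparison above is simple unit arithmetic, and a quick sketch makes it concrete (the figures are the ones quoted in the article; the variable names are mine):

```python
# Figures quoted in the article
GOOGLE_SEARCH_KWH = 0.0003      # Hennessy's figure: energy per Google search, in kWh
OLD_CHATGPT_WH = 3.0            # older popular estimate for a ChatGPT query, in Wh
NEW_CHATGPT_WH = 0.3            # Epoch AI's revised estimate, in Wh

# Convert the search figure to watt-hours (1 kWh = 1000 Wh)
google_search_wh = GOOGLE_SEARCH_KWH * 1000   # 0.3 Wh

# Under the old estimate, an LLM exchange costs ~10x a Google search...
print(OLD_CHATGPT_WH / google_search_wh)      # ~10

# ...while under Epoch AI's figure it is roughly the same
print(NEW_CHATGPT_WH / google_search_wh)      # ~1
```

In other words, the revised estimate doesn't just shrink the number; it erases the "10x a Google search" gap entirely.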
Server energy use isn't something that tends to cross most people's minds while using a cloud service—the 'cloud' is so far removed from our homes that it seems a little ethereal. I know I often forget there are any additional energy costs at all, other than what my own device consumes, when using ChatGPT.
Thankfully I'm not a mover or a shaker in the world of energy policy, because of course LLM interactions consume energy. Let's not forget how LLMs work: they undertake shedloads of data training (consuming shedloads of energy), then once they've been trained and are interacting, they still need to pull from gigantic models to process even simple instructions or queries. That's the nature of the beast. And that beast needs feeding energy to keep up and running.
It's just that apparently that's less energy than we might have originally thought on a per-interaction basis: "For context, 0.3 watt-hours is less than the amount of electricity that an LED lightbulb or a laptop consumes in a few minutes. And even for a heavy chat user, the energy cost of ChatGPT will be a small fraction of the overall electricity
Read more on pcgamer.com