Google says it dropped the energy cost of AI queries by 33x in one year

So far this year, electricity use in the US is up nearly 4 percent compared to the same period the year prior. That comes after decades of essentially flat use, a change that has been associated with a rapid expansion of data centers. And a lot of those data centers are being built to serve the boom in AI usage. Given that some of this rising demand is being met by increased coal use (as of May, coal’s share of generation is up about 20 percent compared to the year prior), the environmental impact of AI is looking pretty bad.

But it’s difficult to gauge that impact for certain without access to the sorts of details you’d only get by running a data center, such as how often the hardware is in use and how often it’s serving AI queries. So, while academics can test the power needs of individual AI models, it’s hard to extrapolate from those tests to real-world use.

By contrast, Google has exactly that sort of real-world data. As such, its release of a new analysis of AI’s environmental impact is a rare opportunity to peer a tiny bit under the hood. The analysis also suggests that energy estimates are a moving target: the company says its data shows the energy drain of an AI query has dropped by a factor of 33 in just the past year.
