Worldwide energy consumption could rise if artificial intelligence (AI) were incorporated into everyday search engine use with currently available technology, according to research published Tuesday in the peer-reviewed scientific journal Joule.
The written commentary assesses how AI’s energy footprint could change as developers create new tools. It was authored by Alex de Vries, who is identified in the journal as a Ph.D. candidate at the Vrije Universiteit Amsterdam School of Business and Economics and the founder of the research company Digiconomist.
De Vries notes in the introduction that the last two years have seen “a period of rapid expansion and extensive, large-scale application” of AI, which “raises concerns about the electricity consumption and potential environmental impact of AI and data centers.”
While the amount of energy needed to power the kinds of data centers that make AI possible has been “relatively stable,” de Vries suggests increased AI use and interest in its development could upend that trend. As an example, de Vries raises the possibility of Google parent company Alphabet adding AI capabilities to every Google search conducted. De Vries points to a Reuters interview with Alphabet Chairman John Hennessy published earlier this year, in which Hennessy said engaging with AI large language models (LLMs) may cost 10 times more than conducting basic searches.
In contrast with a “standard” Google search, which de Vries says uses an estimated 0.3 watt-hours (Wh) of electricity, an AI-fueled Google search would thus use 10 times that amount, or 3 Wh.
Before AI’s recent rise in popularity, the technology accounted for an estimated 10 to 15 percent of Google’s electricity consumption in 2021, according to de Vries. Moving forward, de Vries writes: “The worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland (29.3 TWh per year), which is a significant increase compared to its historical AI-related energy consumption.”
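The arithmetic behind these estimates is straightforward to sketch. The snippet below uses the figures quoted in the commentary (0.3 Wh per standard search, roughly 10 times that for an LLM-assisted search) and a hypothetical daily search volume chosen only for illustration; it does not attempt to reproduce de Vries’s 29.3 TWh worst-case figure, which is derived differently.

```python
# Back-of-the-envelope scaling of per-search energy costs.
# Per-search figures are from de Vries's commentary; the daily
# search volume is a HYPOTHETICAL round number for illustration,
# not a figure from the commentary.

STANDARD_SEARCH_WH = 0.3   # Wh per standard Google search (de Vries)
AI_COST_MULTIPLIER = 10    # Hennessy's ~10x estimate for LLM queries
SEARCHES_PER_DAY = 9e9     # hypothetical daily search volume

ai_search_wh = STANDARD_SEARCH_WH * AI_COST_MULTIPLIER  # 3 Wh per AI search

def annual_twh(wh_per_search: float, searches_per_day: float) -> float:
    """Scale a per-search cost in Wh to an annual total in TWh."""
    wh_per_year = wh_per_search * searches_per_day * 365
    return wh_per_year / 1e12  # 1 TWh = 1e12 Wh

# Extra annual demand from the AI premium (2.7 Wh per search) alone:
extra_twh = annual_twh(ai_search_wh - STANDARD_SEARCH_WH, SEARCHES_PER_DAY)
print(f"AI search: {ai_search_wh:.1f} Wh; "
      f"extra annual demand: {extra_twh:.1f} TWh")
```

Even under this rough assumed volume, the 10x per-search premium compounds into terawatt-hours per year, which is the scale driving the commentary’s concern.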
Despite these concerns, de Vries is quick to add that the “worst-case scenario” he outlines would depend on “full-scale AI adoption” using currently available technology. In practice, widespread AI use is unlikely to be immediate, and as adoption grows, hardware and software are likely to evolve to support AI more efficiently.
“In summary, while the rapid adoption of AI technology could potentially drastically increase the energy consumption of companies such as Google, there are various resource factors that are likely to prevent such worst-case scenarios from materializing,” the commentary says. De Vries concludes by advising that AI developers and users keep energy consumption in mind while considering how they can and must use the technology, to avoid needless use.
What exactly AI will be capable of accomplishing is still largely unknown. Earlier this week, a computer scientist known as the “Godfather of AI” told 60 Minutes that AI could prove useful in health care and drug development, but could potentially pose employment and law enforcement bias threats, or even one day “take over” as a force more intelligent than humans.
Meanwhile, governments around the world are looking into strategies for AI regulation, with many technology leaders seemingly on board with the idea of introducing some guardrails for future development.
When it comes to energy consumption, the International Energy Agency (IEA) says on its website that it expects an increase in AI demand, “with potentially significant implications” for AI-related energy use in the near future. The IEA has predicted demand for AI is “likely to outpace” improvements in energy efficiency, and while some AI tools could help reduce energy use, “the rapid and mainstream adoption of AI chatbots like OpenAI’s ChatGPT and Google Bard are likely to accelerate growth in energy demand for AI.”
Newsweek reached out to the IEA by email on Tuesday for comment.