A new study forecasts that the energy consumption of the AI industry might equal that of the Netherlands by 2027.
However, if the growth of AI slows, its environmental impact could be “less than feared,” according to the research.
Alex De Vries, a PhD candidate at the VU Amsterdam School of Business and Economics, conducted the study. His projections assume that certain conditions hold, such as the current rate of AI growth and the continued availability of chips.
Given that Nvidia supplies about 95% of the AI hardware the sector requires, De Vries used power consumption data for Nvidia chips as his benchmark.
By assessing Nvidia’s expected deliveries by 2027, he estimated AI’s annual energy consumption at between 85 and 134 terawatt-hours (TWh). At the top end of that range, the consumption is comparable to that of a small nation.
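As a back-of-the-envelope sketch of how an estimate like this can be assembled (the 1.5 million server units and the 6.5–10.2kW per-server power draws below are illustrative assumptions, not figures quoted in this article):

```python
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

# Illustrative assumptions: ~1.5 million Nvidia AI server units delivered
# by 2027, each drawing roughly 6.5kW to 10.2kW when running at full
# capacity around the clock.
servers = 1_500_000
power_low_kw, power_high_kw = 6.5, 10.2

# kW x hours gives kWh; divide by 1e9 to convert kWh to TWh.
low_twh = servers * power_low_kw * HOURS_PER_YEAR / 1e9
high_twh = servers * power_high_kw * HOURS_PER_YEAR / 1e9
print(f"~{low_twh:.0f} to ~{high_twh:.0f} TWh per year")  # ~85 to ~134 TWh
```

Any such projection is only as good as its inputs: if deliveries slip or utilization is lower, the range shifts accordingly, which is part of why the findings are considered speculative.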
Many, including the study’s author, consider these findings speculative, but figures Microsoft disclosed this year pointed to extensive water usage at its data centers, and a 2019 study found that training a single large neural network can emit as much CO2 as five cars do over their lifetimes.
Microsoft’s and Google’s rising water usage may correlate with running resource-intensive AI workloads, which lends credibility to De Vries’ findings.
“You would be talking about the size of a country like the Netherlands in terms of electricity consumption. You’re talking about half a per cent of our total global electricity consumption,” he told BBC News.
This comes on top of the power already consumed by data centers, which is thought to exceed 1% of global electricity consumption, according to the International Energy Agency (IEA).
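A quick sanity check on the “half a per cent” claim (the ~26,000 TWh figure for global electricity consumption used here is an approximate assumption, not a number from the article):

```python
# Rough assumption: global electricity consumption of ~26,000 TWh per year.
global_twh = 26_000
ai_high_twh = 134  # upper end of De Vries' projected range

share_pct = ai_high_twh / global_twh * 100
print(f"~{share_pct:.1f}% of global electricity consumption")  # ~0.5%
```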
De Vries further stresses the study’s implications by stating that AI should be used only “where it is really needed.”
The surging demand for AI-centric computers translates into growing energy and water needs, and while Nvidia is working to reduce the energy consumption of its chips, there is a limit to how low it can go with current technology.
Danny Quinn of DataVita highlighted the disparity in energy consumption between conventional and AI server racks, stating, “A standard rack full of normal kit is about 4kW of power, which is equivalent to a family house. Whereas an AI kit rack would be about 20 times that, so about 80kW of power.”
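Taking the quoted figures at face value and assuming the racks run continuously, the gap over a full year looks like this (a simplified sketch; real-world utilization varies):

```python
HOURS_PER_YEAR = 365 * 24

standard_rack_kw = 4   # "a standard rack full of normal kit", per the quote
ai_rack_kw = 80        # roughly 20 times that, per the quote

# Annual energy if each rack draws its full rating all year round.
standard_mwh = standard_rack_kw * HOURS_PER_YEAR / 1_000
ai_mwh = ai_rack_kw * HOURS_PER_YEAR / 1_000
print(f"standard rack: ~{standard_mwh:.0f} MWh/yr")  # ~35 MWh/yr
print(f"AI rack:       ~{ai_mwh:.0f} MWh/yr")        # ~701 MWh/yr
```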
Will generative AI ‘cool down’ in 2024?
Increasing energy demands and monetization challenges are causing some to predict that generative AI will face a reality check in 2024.
Ben Wood, chief analyst at CCS Insight, emphasized that while AI promises profound societal and economic impacts, the “overhyped” narrative around generative AI might soon face challenges, especially for smaller developers.
“Just the cost of deploying and sustaining generative AI is immense,” Wood told CNBC.
“And it’s all very well for these massive companies to be doing it. But for many organizations, many developers, it’s just going to become too expensive.”
Wood also noted that Nvidia’s dominance of the AI hardware market creates a potential bottleneck.
Larger corporations like Amazon, Google, and Meta are now developing their own AI-specific chips to handle these immense computational needs, an undertaking that is phenomenally expensive.
For AI to truly fulfill futuristic visions, we must pull more energy and water from somewhere. If humanity can crack nuclear fusion and quantum computing soon, however, then that’s a different story…