A datacenter in Northern Virginia is just one of many seeking electricity supply from Dominion Energy Inc.
The rapid push to adopt artificial intelligence and machine learning will have far-reaching impacts that are only just starting to be understood. The electricity market, already undergoing dramatic change due to the rapid growth of renewable resources and broad efforts to electrify sectors that have traditionally run on fossil fuels, is now also expected to see significant net demand growth as a result of the new technology.
This article, the first in a series looking at potential impacts of AI adoption across the energy sector, explores some impacts on electricity demand.
The volume of electricity needed to power emerging artificial intelligence applications remains unclear, but it appears the technology will lead to a significant net increase in US power consumption from the already growing datacenter market, industry experts told S&P Global Commodity Insights.
"Regarding US power demand, it's really hard to quantify how much demand is needed for things like ChatGPT," David Groarke, managing director at consultant Indigo Advisory Group, said in an interview, referring to the AI-powered language model developed by OpenAI LLC. "In terms of macro numbers, by 2030, AI could account for 3% to 4% of global power demand. [Google LLC] said right now AI is representing 10% to 15% of their power use, or 2.3 TWh annually."
Google, the cloud computing and search giant owned by Alphabet Inc., could significantly increase its power demand if its own generative AI technology were used in every Google search, according to academic research conducted by Alex de Vries, a Ph.D. candidate at the VU Amsterdam School of Business and Economics.
Citing research by semiconductor research and analysis firm SemiAnalysis, de Vries estimated in a commentary published Oct. 10 in the journal Joule that using generative AI such as ChatGPT in each Google search would require more than 500,000 of Nvidia Corp.'s A100 HGX servers, totaling 4.1 million graphics processing units (GPUs). At a power demand of 6.5 kW per server, that would result in about 80 GWh of energy consumption per day, or 29.2 TWh annually.
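To make the arithmetic behind those figures explicit, here is a minimal sketch reproducing the commentary's estimate from the stated inputs. Round-the-clock operation is an assumption implicit in the quoted numbers, and the computed 78 GWh/day rounds to the roughly 80 GWh/day cited above.

```python
# Back-of-the-envelope reproduction of the Joule commentary estimate.
# Server count and per-server draw are from the article; continuous
# (24/7) operation is assumed.
servers = 500_000           # Nvidia A100 HGX servers
kw_per_server = 6.5         # power demand per server, kW

total_gw = servers * kw_per_server / 1e6    # ~3.25 GW of continuous draw
gwh_per_day = total_gw * 24                 # ~78 GWh/day (quoted as ~80)
twh_per_year = gwh_per_day * 365 / 1000     # ~28.5 TWh/yr (~29.2 at 80 GWh/day)

print(f"{total_gw:.2f} GW, {gwh_per_day:.0f} GWh/day, {twh_per_year:.1f} TWh/yr")
```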
But such widespread adoption with current hardware and software is unlikely, de Vries said in the commentary. That many Nvidia servers do not currently exist, and producing them could cost as much as $100 billion.
"In summary, while the rapid adoption of AI technology could potentially drastically increase the energy consumption of companies such as Google, there are various resource factors that are likely to prevent such worst-case scenarios from materializing," De Vries said in the commentary.
US datacenter trends
Close attention to datacenter geography will be increasingly important for grid operators as AI adoption progresses. Power demand from operational and planned datacenters in the US totals 30,694 MW, according to an analysis of 451 Research data. Investor-owned utilities are set to supply about two-thirds, or 20,619 MW, of that amount.
Since 2019, 81 datacenters with a combined capacity of 3.5 GW have connected to Dominion Energy Inc.'s power system, the utility said in a late June presentation to Mid-Atlantic grid operator PJM Interconnection LLC.
"From 2023 to 2030, we are looking at about an 80% increase in US datacenter power demand, going from about 19 GW to about 35 GW," Stephen Oliver, vice president of corporate marketing and investor relations at Navitas Semiconductor, said in an interview.
Initial power demand for training AI is high and more concentrated than for traditional datacenter applications. A typical rack of servers with AI processors consumes 30 kW to 40 kW, two to three times the consumption of a traditional rack, "so we need new technology in the power converters," Oliver said.
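As a rough illustration of what that rack-level jump means at facility scale, the sketch below compares the quoted 30 kW to 40 kW AI racks against a traditional rack. The 12 kW traditional-rack figure and the 1,000-rack facility size are illustrative assumptions, not figures from the article.

```python
# Illustrative rack-density comparison. Only the 30-40 kW AI range is
# quoted in the article; the other figures are assumptions.
TRADITIONAL_RACK_KW = 12.0      # assumed typical traditional rack
AI_RACK_KW = (30.0, 40.0)       # range quoted for AI-processor racks
RACKS = 1_000                   # hypothetical facility size

trad_mw = TRADITIONAL_RACK_KW * RACKS / 1000
ai_mw = tuple(kw * RACKS / 1000 for kw in AI_RACK_KW)
lo, hi = (kw / TRADITIONAL_RACK_KW for kw in AI_RACK_KW)
print(f"traditional: {trad_mw:.0f} MW; AI: {ai_mw[0]:.0f}-{ai_mw[1]:.0f} MW "
      f"({lo:.1f}x-{hi:.1f}x the traditional draw)")
```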
While familiar names such as Google, Amazon Web Services Inc. and Microsoft Corp. operate datacenters, they do not necessarily design and build the hardware, Oliver said, adding, "We need to use new technology without taking up space within the cabinets."
Different types of AI have different energy needs
It is useful to look at AI in two broad categories, Indigo Advisory Group's Groarke said: "Narrow AI" is more contained and not especially energy intensive, with use cases like load forecasting and predictive maintenance. "Inference usage," like running a prompt that returns an answer, adds to power consumption.
The swath of applications that is really energy intensive is the large language model side, which needs more memory and storage: neural networks that require thousands of GPUs, he said.
Constance Crozier, assistant professor at Georgia Institute of Technology's H. Milton Stewart School of Industrial and Systems Engineering, said that training something like ChatGPT uses roughly 1 billion times more energy than running a single query, but the aggregate energy consumed by queries can become significant over time.
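A simple break-even calculation makes Crozier's point concrete: at the roughly 1-billion-to-1 ratio she cites, cumulative query energy overtakes the one-off training cost once query volume passes a billion, a threshold a popular service can reach within days. The per-query energy and daily query volume below are hypothetical placeholders, not measured values.

```python
# Break-even between one-off training energy and cumulative query energy,
# using the ~1e9 ratio cited by Crozier. Per-query energy and query
# volume are hypothetical placeholders for illustration only.
ENERGY_PER_QUERY_KWH = 0.003        # assumed energy per query
TRAINING_MULTIPLE = 1e9             # training ~1e9x one query (cited ratio)
QUERIES_PER_DAY = 100e6             # assumed service volume

training_kwh = ENERGY_PER_QUERY_KWH * TRAINING_MULTIPLE    # ~3 GWh, one-off
days_to_match = training_kwh / (ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY)

print(f"training ~= {training_kwh / 1e6:.1f} GWh; inference matches it "
      f"after {days_to_match:.0f} days at {QUERIES_PER_DAY:.0e} queries/day")
```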
Power demand for AI comes from training these models, which are pulling in "huge amounts of data from the web, and that seems to be doubling every year," Groarke said.
Global data volumes double every few years, but it is challenging to isolate datacenter usage from the total, he said.
"The intensity of the training of the models is using the most power," Groarke said. "We need new ways of creating those models and a lot of this power demand is predicated on how AI is adopted."
AI's power usage is not yet on par with cryptocurrency mining, "but we could see that as companies adopt large language models," he said.
Many companies are working on embedding large language models in their own networks. Within the large language model industry there is an effort to reduce complexity and increase the efficiency of hardware. "There is an awareness of not just building these models for the sake of it," Groarke said.
Some AI applications are learning to control systems in ways that will reduce power demand, Crozier said, adding that Google's DeepMind is trialing AI control of room temperatures, including at its own datacenters.
"There are a lot of efficiency gains to be made in building energy efficiency," she said. "I have seen academic literature for managing datacenters and being smarter with how to allocate load to certain servers."
There are control problems that AI could be used to improve in the future, Crozier said, adding that there are also some non-AI methods that could do the same thing, but they are more complicated because they need more extensive modeling.
"There is interest in this area because you can start something, train it, and see if it can better control buildings," Crozier said.
Virtual power plants are more about shifting power demand than reducing it. Electric vehicle charging is another example of where demand could be shifted to times when there is a surplus of renewable-generated electricity.
This is also true of AI training algorithms, which can be paused. "I could imagine a situation where we train these algorithms at times when the grid has fewer constraints, although strong economic incentives would be necessary," Crozier said.
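Below is a minimal sketch of the kind of price-responsive scheduling Crozier describes, assuming a checkpointable training job: the job pauses whenever a wholesale-price signal suggests the grid is constrained, then resumes. The price feed, the $50/MWh threshold, and the training-step function are all placeholders, not a real utility or ML framework API.

```python
# Minimal sketch of price-responsive AI training: pause a checkpointable
# job whenever grid prices signal constraint. All names below are
# placeholders for illustration.
import time

PRICE_CEILING_USD_MWH = 50.0   # assumed threshold for "grid is constrained"

def get_wholesale_price() -> float:
    """Placeholder for a real-time wholesale electricity price feed."""
    return 42.0

def run_training_step() -> None:
    """Placeholder for one checkpointable training step."""

def price_aware_training(total_steps: int) -> None:
    done = 0
    while done < total_steps:
        if get_wholesale_price() > PRICE_CEILING_USD_MWH:
            time.sleep(300)    # job stays paused; resumes from checkpoint
            continue
        run_training_step()
        done += 1
```

As Crozier notes, the economics matter as much as the mechanism: operators would pause multimillion-dollar training runs only if prices or incentives made the delay worthwhile.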
S&P Global Commodity Insights reporter Jared Anderson produces content for distribution on Platts Connect and S&P Capital IQ Pro. 451 Research is part of S&P Global Market Intelligence. S&P Global Commodity Insights and S&P Global Market Intelligence are divisions of S&P Global Inc.