Published: November 14, 2024
AI has the potential to enhance companies' energy efficiency and resource use and, in some cases, improve climate risk management. These use cases, still supported largely by anecdotal evidence, will need to become the norm in carbon-intensive industries to offset AI's impact on greenhouse gas emissions.
The global electricity demand of AI-relevant datacenters is expected to double by 2029.
While major datacenter operators are sourcing much of their energy from renewables, the need for continuous power — and more of it, driven by AI usage — means it will be difficult to phase out fossil fuel-based generation completely.
The persistent use of fossil fuel power generation to meet datacenter demand will make reaching net-zero goals more difficult for the tech sector and electric utilities.
As artificial intelligence (AI) proliferates across economic sectors, the business world is weighing the costs and benefits for climate change and the energy transition. While AI technology demands significant amounts of electricity that cannot be entirely supplied by zero-carbon sources, it also has incredible promise for unlocking energy efficiencies across the economy.
In this research, we seek to shed light on the complexities of embracing AI and the datacenter infrastructure it relies on. S&P Global data shows that emissions from purchased electricity have risen sharply at major data processing firms, complicating these companies’ efforts to reach net-zero. Datacenter operators have contracted massive amounts of renewable energy years into the future, but their need for constant, reliable power means natural gas will continue to be a key energy source.
At the same time, initial use cases for AI point to significant efficiency gains, both within datacenters themselves and potentially in the emissions-heavy industries that need to decarbonize rapidly to keep the world on track to limit global warming. The key question facing AI adoption over the next several years will be how to ensure the technology unlocks enough climate benefit, whether in climate risk and scenario analysis, emissions reductions or other uses, to offset its own rising emissions and become net positive.
Datacenters are the backbone of the digital economy, and GenAI represents yet another transformative use for these facilities. Datacenter operators need to power the thousands of processors that provide the computing power behind AI and the cloud while also running industrial-scale air conditioning and other cooling systems. Cooling represents about 40% of a datacenter's energy use, according to McKinsey. Building a datacenter in a cooler climate can reduce the amount of cooling required, but location is only one of several factors an owner typically weighs during development.
The advent of AI as a major datacenter workload is a key driver of rising energy demand. AI workloads are more energy intensive: A single ChatGPT query consumes 2.9 watt-hours, or roughly 10 times the electricity of a traditional Google search, according to a 2024 white paper by the energy research organization EPRI. Daily visits to ChatGPT surpassed 100 million in May 2024, according to the web traffic analysis firm SimilarWeb. While that pales in comparison to Google's search engine volume (6.3 million searches per minute as of December 2023, according to Statista), Google has added generative AI to its search function, raising its energy consumption.
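As rough context for these per-query figures, the back-of-the-envelope sketch below scales the cited 2.9 watt-hours per query to daily and annual energy totals. The assumption of 100 million queries per day is purely illustrative and is not a sourced estimate; the SimilarWeb figure above counts visits, not queries.

```python
# Back-of-the-envelope sketch scaling the cited per-query energy estimate.
# The 2.9 Wh/query figure is the EPRI estimate cited above; the 100 million
# queries per day is an illustrative assumption, not a sourced figure
# (the SimilarWeb number above counts daily visits, not queries).

WH_PER_QUERY = 2.9               # watt-hours per ChatGPT query (EPRI, 2024)
QUERIES_PER_DAY = 100_000_000    # illustrative assumption

daily_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9            # 1 GWh = 1e9 Wh
annual_twh = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1e12    # 1 TWh = 1e12 Wh

print(f"Daily energy for query serving: ~{daily_gwh:.2f} GWh")    # ~0.29 GWh
print(f"Annual energy for query serving: ~{annual_twh:.2f} TWh")  # ~0.11 TWh
```

Even at this volume, query serving alone would account for only a small fraction of the hundreds of terawatt-hours of projected datacenter demand discussed below.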
Key terms and definitions

Datacenter: A facility that houses IT infrastructure for building, running and delivering digital applications and services, as well as storing and managing associated data.

Hyperscaler: The tier of large datacenter operators that provide cloud computing and data management services at a large enough scale to support technologies such as AI, internet of things, machine learning and big data analytics. Hyperscalers include Amazon Web Services, Google, Microsoft and Meta, among others.

Leased datacenter: A datacenter in which capacity may be leased to multiple customers in one facility. Internet application providers, large enterprises with significant IT requirements, social media companies and hyperscalers may lease capacity from these facilities.

Generative AI (GenAI): AI models capable of generating new content, including text, images, video, code, audio and synthetic data. GenAI differs from traditional AI, which is used for structured data analysis, forecasting, classification and process automation, among other applications.
Measuring the energy use and emissions profiles of two categories of datacenters — those owned directly by the hyperscalers and those that are leased by these major tech firms and others — allows us to estimate the climate impact of AI workloads.
Projections from S&P Global's 451 Research show that hyperscaler and leased datacenter power demand doubled from 2020 to 2024. This demand is set to grow even faster, by 137%, through 2029 as AI-driven computing expands, according to 451 Research data. Historically, most of this growth has occurred in North America and Asia-Pacific, and these regions are expected to account for the bulk of future growth given that many of the major players in cloud and AI computing are based there. Globally, AI-relevant datacenters are projected to add 716 TWh of net new power demand between 2024 and 2029, according to 451 Research.
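As a rough illustration of how these figures fit together, the sketch below infers the implied annual demand levels in 2024 and 2029. It rests on our reading, which is an assumption rather than something stated by 451 Research, that the 716 TWh figure is the increase in annual demand between 2024 and 2029 and that the 137% growth applies over the same window.

```python
# Rough sketch relating the 451 Research figures cited above.
# Assumption (ours): 716 TWh is read as the increase in annual power demand
# between 2024 and 2029, and the 137% growth rate covers the same period.

GROWTH_RATE = 1.37      # 137% growth in annual demand, 2024-2029
NET_NEW_TWH = 716       # net new annual demand over the period (TWh)

implied_2024_twh = NET_NEW_TWH / GROWTH_RATE              # ~523 TWh
implied_2029_twh = implied_2024_twh * (1 + GROWTH_RATE)   # ~1,240 TWh

print(f"Implied 2024 demand: ~{implied_2024_twh:.0f} TWh")
print(f"Implied 2029 demand: ~{implied_2029_twh:.0f} TWh")
```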
How does this electricity demand translate into climate impact?
In recent years, greenhouse gas (GHG) emissions from companies that operate or lease datacenters have been rising. Scope 2 indirect emissions, or emissions primarily derived from purchased energy, increased 48% from 2021 to 2023 at companies involved in data processing, according to S&P Global Trucost data. Scope 1 emissions, which are emitted directly by a company’s own operations and sites, have remained relatively stable over that time.
Looking forward, recent research from S&P Global Ratings estimates that US datacenters could generate an additional 40 million to 67 million metric tons of CO2 in 2030, about double their current emissions and equivalent to as much as 4% of 2023 US power-related emissions.
This figure would likely vary significantly by country due to differences in energy mix. In China, for example — the second-largest country in datacenter development after the US — 71% of domestic energy production is still based on coal, according to the International Energy Agency. With datacenter power demand expected to more than double in the Asia-Pacific region by 2029, this additional power will likely be more carbon intensive than in the US.
Using the high end of the projected increase in US datacenter emissions as a test case, however, efficiency gains from AI across sectors would need to deliver emissions reductions of 67 million metric tons just to offset AI's own emissions growth in the US. That is equivalent to about 7% of the 963 million metric tons of CO2 emissions generated by the US industrial sector in 2023, according to the US Energy Information Administration.
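The 7% comparison follows directly from the figures above, as the short calculation below shows.

```python
# Worked check of the offset comparison above: the high-end estimate of added
# US datacenter emissions in 2030 versus 2023 US industrial-sector emissions.

added_datacenter_mt = 67       # high end of additional US datacenter CO2 in 2030 (million metric tons)
us_industrial_2023_mt = 963    # US industrial-sector CO2 emissions in 2023 (million metric tons, EIA)

share = added_datacenter_mt / us_industrial_2023_mt
print(f"Offset required as a share of 2023 US industrial emissions: {share:.0%}")  # ~7%
```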
Industrial efficiency is one of the areas where AI optimists see the technology delivering the greatest benefit: Reducing the amount of electricity needed for energy-intensive industrial and manufacturing processes could help some industries lower their emissions. Smart manufacturing that uses AI to find efficiencies in factory processes can cut energy consumption, waste and carbon emissions by as much as 30% to 50% compared with traditional processes, according to a 2023 study published in the journal Environmental Chemistry Letters.
AI is also applied in a technique known as 'digital twin' analysis, in which vast amounts of real-time data on energy consumption and resource use are fed into a digital replica of a physical asset such as a building. An AI model can analyze this data to create more accurate and impactful emissions reduction strategies or generate emissions reports that ease the data collection requirements of climate reporting frameworks. The furniture retailer Ikea used a digital twin to analyze its retail stores' air conditioning systems and was able to reduce HVAC energy consumption by 30%, according to World Economic Forum research.
Using AI for energy efficiency gains may be one of the keys to limiting datacenters’ impact on climate change. In 2016, Google found that applying its machine learning technology to its datacenters reduced their energy needs for cooling by 40%. Chinese telecommunications giant Huawei, another major datacenter operator, said in 2020 that it expects its datacenter heat management system developed using machine learning to reduce the energy needs of just one datacenter by 6 million kilowatt-hours per year, according to the industry publication Intelligent CIO.
While these are promising examples and there is optimism about further adoption, AI is not yet widely used for sustainability in the corporate world. The 2024 cycle of the S&P Global Corporate Sustainability Assessment (CSA), an annual evaluation of corporate performance on a range of sustainability topics, included a voluntary question on the use of AI to improve various aspects of a company's environmental, social and governance performance.
About 4% of assessed companies, or 272 out of 6,351, responded to this question. Of these AI early adopters, 38% said they are using AI to improve energy consumption, 25% said they use it to improve their climate performance and 24% said they use it to develop sustainable products and services. In addition, 13% of these companies are using AI to improve water management, and 8% are using AI to improve their biodiversity performance in some way.
There is also optimism that AI models themselves will become significantly more energy efficient over time, in part by using AI to find efficiencies in datacenters. Datacenter power usage effectiveness (PUE), the ratio of a facility's total energy consumption to the energy used by its IT equipment, has improved significantly over the past 15 years, but gains have been slow since 2018, according to S&P Global Ratings. Hyperscale and cloud-focused datacenters are already among the most efficient.
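As a concrete illustration of the metric, the short sketch below computes PUE for a hypothetical facility; the figures are invented for illustration and do not come from the research cited above.

```python
# Illustrative PUE calculation. PUE = total facility energy / IT equipment energy,
# so lower is better and 1.0 is the theoretical floor (all power goes to computing).
# The figures below are hypothetical.

def pue(total_facility_mwh: float, it_equipment_mwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by IT energy."""
    return total_facility_mwh / it_equipment_mwh

it_load_mwh = 100_000   # hypothetical annual IT load: servers, storage, networking
overhead_mwh = 25_000   # hypothetical cooling, power conversion losses, lighting

print(f"PUE: {pue(it_load_mwh + overhead_mwh, it_load_mwh):.2f}")  # 1.25
```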
Aside from efficiency gains, many tech companies are seeking alternative energy sources for their datacenters, including renewables such as wind and solar as well as other carbon-free sources such as nuclear. About 62% of the energy used in datacenters comes from renewable sources, a figure that has risen sharply in the last four years, according to data collected in the 2024 CSA.
Looking forward, operators are laying claim to future clean energy supplies where they can. Power purchase agreements made between tech companies and electric utilities show roughly 47 gigawatts of carbon-free generation lined up by 2030, according to S&P Global Commodity Insights.
However, these agreements represent only some of the energy that datacenters underpinning AI workloads will require. Total demand for hyperscaler-owned and leased datacenters could reach 528 terawatt-hours in North America alone by 2029. While the energy mix of electric utilities in North America is evolving away from fossil fuels and toward renewables, recent research from S&P Global Ratings estimates that datacenter demand growth will require billions of cubic feet of additional natural gas in the US by 2030. That’s because datacenters require uninterrupted, consistent power, and renewables like wind and solar generate electricity based on variable weather factors like wind speed and cloud cover. Matching constant electricity demand to variable power generation from renewables makes the goal of a zero-carbon datacenter difficult to achieve.
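The time-matching problem can be illustrated with a toy hourly profile: even on a day when total renewable output exceeds a datacenter's total consumption, hours with little wind or sun leave a gap that firm generation, today often gas-fired, has to fill. All numbers below are hypothetical.

```python
# Toy illustration of the time-matching challenge described above: a flat
# datacenter load against variable hourly renewable output. Hours where
# renewables fall short must be covered by firm generation, even though
# renewables over-produce at other times. All figures are hypothetical.

FLAT_LOAD_MW = 100  # constant datacenter demand in every hour of the day

# Hypothetical hourly renewable output (MW): low overnight, solar-driven midday peak.
renewables_mw = [30, 25, 20, 20, 25, 40, 80, 140, 200, 260, 300, 320,
                 310, 280, 230, 170, 110, 60, 40, 35, 30, 30, 25, 25]

shortfall_mwh = sum(max(FLAT_LOAD_MW - r, 0) for r in renewables_mw)  # gap filled by firm power
surplus_mwh = sum(max(r - FLAT_LOAD_MW, 0) for r in renewables_mw)    # excess renewable output

print(f"Daily load: {FLAT_LOAD_MW * 24} MWh, daily renewable output: {sum(renewables_mw)} MWh")
print(f"Shortfall met by firm generation: {shortfall_mwh} MWh")
print(f"Excess renewable output at other hours: {surplus_mwh} MWh")
```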
Nuclear power may be one solution to the challenge of time-matched power supply and demand. In September 2024, Microsoft and power producer Constellation reached an agreement under which Microsoft will purchase power from the planned restart of one of the reactors at the Three Mile Island nuclear facility in Pennsylvania. The reactor will add about 835 megawatts of generating capacity to the grid, according to the companies. In March 2024, Amazon Web Services acquired a 960-megawatt datacenter directly powered by an adjacent nuclear plant.

While using existing nuclear power could help address operators' need for clean power unrestricted by weather conditions, new large-scale nuclear facilities can take many years to plan, permit and build: in the US, up to five years for permitting and five years or more for construction, according to the US Energy Information Administration. A new generation of reactors called small modular reactors (SMRs), intended to be smaller, more flexible and more easily deployed, could be another solution. SMRs typically have a capacity of around 300 megawatts per unit and could be well suited to individual datacenter sites. SMRs are not expected to be commercially available in the near term, however. In the US, the first SMR with a design certified by the US Nuclear Regulatory Commission is expected to be fully operational by 2030.
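To give a sense of scale, the rough conversion below translates the 835 MW Three Mile Island reactor into annual energy output and compares it with the projected North American datacenter demand cited earlier. The roughly 90% capacity factor is our assumption, broadly typical of the US nuclear fleet, not a figure from the agreement.

```python
# Rough scale check: converting the 835 MW reactor capacity cited above into
# annual energy. The ~90% capacity factor is an assumption (broadly typical of
# US nuclear plants), not a figure from the Microsoft-Constellation agreement.

CAPACITY_MW = 835
CAPACITY_FACTOR = 0.9         # assumed
HOURS_PER_YEAR = 8760
NA_DEMAND_2029_TWH = 528      # projected North American datacenter demand by 2029 (451 Research)

annual_twh = CAPACITY_MW * CAPACITY_FACTOR * HOURS_PER_YEAR / 1e6  # MWh -> TWh
print(f"Approximate annual output: {annual_twh:.1f} TWh")  # ~6.6 TWh
print(f"Share of projected 2029 North American demand: {annual_twh / NA_DEMAND_2029_TWH:.1%}")  # ~1.2%
```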
More broadly, nuclear is expected to make up a shrinking share of the US energy mix over the next three decades, according to the S&P Global Commodity Insights May 2024 Planning Case. Fossil fuel generation is expected to fall sharply over that time, replaced by wind and solar. As a share of the Lower 48 US states' energy mix, nuclear is projected to decline from 20% in 2020 to 13% in 2050.
How does the reality of power demand for AI impact progress on climate change?
The need for more energy to power the digital economy, including AI workloads, has the potential to put tech firms and electric utilities at odds with the scale and speed of decarbonization required by the Paris Agreement on climate change. Hitting the Paris Agreement’s goal of limiting global warming to “well below” 2 degrees C by century-end will mean cutting global GHG emissions by 42% by 2030, according to the UNEP Emissions Gap Report 2024. Those emissions reductions are a steppingstone on the way to hitting net-zero, or cutting emissions as close to zero as possible and storing or offsetting the remainder, by 2050.
Corporate commitments to reaching net-zero remain the exception rather than the rule among the largest US companies. And while many major tech firms have made these commitments, including companies leading the AI charge such as Microsoft, Alphabet and Meta, net-zero pledges are rare across the tech industries that operate or use datacenters. Data from the 2023 CSA shows that only 15% of software companies and 9% of companies in the interactive media, services and home entertainment industry have made a net-zero pledge.
Net-zero targets are more common among electric utilities, but achieving net-zero while also meeting the higher electricity demand expected across economic sectors will be a challenge. CSA data also shows that while 45% of assessed utilities have made net-zero pledges, only 14% of those companies also have specific long-term emissions reduction targets. That gap could be a sign that utilities are committing to net-zero without mapping out how to get there through emissions reductions. It may also signal that utilities with net-zero plans could rely more on carbon capture or carbon credits to offset the GHGs they generate than on cutting emissions in the first place.
While AI workloads have the potential to drive up GHG emissions, they can also contribute to various sustainability goals. Companies across sectors are using AI to improve their energy and resource efficiency, improve their risk management processes — including on climate — and help collect and report sustainability data. There is significant promise in using AI to achieve new efficiencies in carbon-intensive industries, with the ultimate goal of lowering these industries’ emissions.
Companies relying on AI to help solve their decarbonization challenges will need to be wary, however, of the possibility that higher efficiency produces what economists call the rebound effect: counterintuitively, a more efficient technology can increase energy use instead of reducing it. The effect has roots in the 19th century Industrial Revolution, when economist William Stanley Jevons found that as technological improvements made coal use more efficient, coal consumption rose. A similar dynamic appears in highway design, where adding lanes can ultimately increase traffic rather than relieve congestion. In that vein, a cement manufacturing plant that uses AI to produce the same amount of cement with 20% less energy might choose to use those gains to produce even more, rather than bank the efficiency gains in the form of lower emissions. Resisting this impulse will be central to ensuring that AI becomes a net positive climate solution.
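The cement example can be made concrete with a little arithmetic: after a 20% cut in energy per ton, total energy use climbs back above the baseline as soon as output grows by more than 25%, since 1 / 0.8 = 1.25. The production figures in the sketch below are hypothetical.

```python
# Illustrative rebound-effect arithmetic for the cement example above.
# All production figures are hypothetical.

baseline_output_tons = 1_000_000
energy_per_ton = 1.0                              # arbitrary energy units per ton before AI
efficient_energy_per_ton = 0.8 * energy_per_ton   # 20% less energy per ton with AI

baseline_total = baseline_output_tons * energy_per_ton

for output_growth in (0.00, 0.25, 0.40):   # the plant uses the gains to expand output
    new_total = baseline_output_tons * (1 + output_growth) * efficient_energy_per_ton
    change = new_total / baseline_total - 1
    print(f"Output +{output_growth:.0%}: total energy {change:+.0%} vs baseline")

# Output +0%:  total energy -20% (efficiency gain fully banked as lower energy use)
# Output +25%: total energy +0%  (break-even: the rebound erases the gain)
# Output +40%: total energy +12% (more efficient per ton, yet higher total energy use)
```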
This piece was published by S&P Global Sustainable1 and not by S&P Global Ratings, which is a separately managed division of S&P Global.