AI sector touts dramatic efficiency gains but power demand spike still expected

NVIDIA Corp. founder and CEO Jensen Huang speaks at the company's annual artificial intelligence conference in San Jose, Calif., on March 18.
Source: Justin Sullivan/Staff/Getty Images News via Getty Images.

NVIDIA Corp. is pushing back against the narrative that artificial intelligence is a massive power drain, asserting that just a few years of innovation is poised to dramatically cut generative AI's energy use. However, AI's rising power consumption overall is not expected to reverse, company executives warned.

Earlier this year, the world's largest accelerated computing company unveiled new hardware that it says can run AI chatbots on as little as one-25th the electricity of its current systems. The company expects to launch the Blackwell graphics processing unit (GPU) in 2025, promising dramatic efficiency gains for datacenters.

Nevertheless, government agencies and independent analysts are projecting a marked increase in the annual power consumption of datacenters, propelled by the rise of generative AI. Meanwhile, US electric utilities are rapidly expanding their generation portfolios to meet datacenter demand.

This year, US datacenters are projected to consume 372 TWh of electricity, nearly twice as much power as they consumed in 2019, according to S&P Global Market Intelligence 451 Research. In five years, that figure is expected to double again.
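The doubling figures cited above imply a steady annual growth rate, which a few lines of arithmetic make explicit. This is a rough sketch: the 372 TWh figure and the doubling claims come from the article, while the assumption of an exact doubling (rather than "nearly twice") is mine.

```python
# Illustrative arithmetic only; the 372 TWh figure and the doubling
# claims are from the article, everything else is derived.
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by start -> end over `years`."""
    return (end / start) ** (1 / years) - 1

# Assumption: exact doubling, so 2019 consumption was ~186 TWh.
growth_2019_2024 = cagr(186, 372, 5)                 # ~14.9% per year
# Doubling again over the next five years implies the same rate holds.
projected_2029 = 372 * (1 + growth_2019_2024) ** 5   # ~744 TWh

print(f"Implied annual growth: {growth_2019_2024:.1%}")
print(f"Projected 2029 consumption: {projected_2029:.0f} TWh")
```

At roughly 15% a year, datacenter load compounds fast enough to double every five years, which is why small differences in assumed growth rates produce the wide forecast spreads discussed later in the article.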

Many large language models, such as OpenAI's ChatGPT, rely on NVIDIA's accelerated computing technology for their speedy response times. Accelerated computing works by running multiple workloads simultaneously, requiring much higher horsepower than traditional computing.

But on a task-for-task basis, accelerated computing requires several orders of magnitude less electricity than traditional computing, NVIDIA believes. From this viewpoint, AI is more energy-efficient than its alternatives, according to Josh Parker, senior director of the tech company's corporate sustainability program.

"If you use ChatGPT to write a paper on a topic, it's going to do it in 15 seconds," Parker said. "If you were to sit at your computer and type that paper out over the course of an hour or so, it would use a lot more energy than you're using with ChatGPT."

Such assertions come as the electric utility industry presses the tech sector for clues on AI's growth trajectory to help inform load forecasts. Some consumer advocates — and even datacenter owners such as Microsoft Corp. — have argued that utilities are overestimating these figures to justify new generation.

"The input of this new industry requires your output. What goes into this new intelligence generator is the output of your AC generator," Jensen Huang, NVIDIA's CEO, told a room full of investor-owned utility executives at the Edison Electric Institute's conference in June. "This is going to be the next driver of fairly significant energy consumption around the world."

The Jevons paradox

In the late 1700s, James Watt invented a steam engine that required half as much coal as the model it replaced, lowering the cost of the technology and encouraging more trades to swap manpower for coal-fired power. After that, UK coal consumption skyrocketed. The events led English economist William Stanley Jevons to warn in 1865 that fuel efficiency innovation was depleting the UK's coal resources.

Today, economists use "the Jevons paradox" to describe how more efficient technologies can sometimes increase total demand for a resource. The effect may explain why, even as accelerated computing makes datacenters more efficient, datacenters' overall power demand keeps rising.
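The rebound mechanism behind the Jevons paradox can be sketched with a toy model. The constant-elasticity demand curve and the elasticity values below are assumptions for illustration, not empirical estimates about the AI market.

```python
# A toy rebound-effect model. The constant-elasticity demand assumption
# and the elasticity values are illustrative, not empirical.
def resource_use(efficiency_gain: float, elasticity: float) -> float:
    """Relative total resource consumption after an efficiency gain.

    Efficiency cuts the resource cost per task by `efficiency_gain`x.
    If demand for tasks responds to that cheaper cost with constant
    elasticity, tasks scale as efficiency_gain ** elasticity, so total
    resource use scales as efficiency_gain ** (elasticity - 1).
    """
    return efficiency_gain ** (elasticity - 1)

# Inelastic demand (elasticity < 1): a 25x efficiency gain cuts total use.
print(resource_use(25, 0.5))   # 0.2 -> consumption falls 80%
# Elastic demand (elasticity > 1): the same gain raises total use.
print(resource_use(25, 1.5))   # 5.0 -> consumption rises fivefold
```

Whether a 25-times-more-efficient GPU shrinks or swells datacenter load thus hinges on how elastic demand for AI computation turns out to be, which is exactly the uncertainty Parker describes next.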

"So that's what makes this so difficult, coming up with a number," Parker said. "And that's why you see forecasts for energy consumption have such a wide variability between them."

The puzzle gets even more complicated for grid planners when considering the potential impacts of AI on other industries' electricity use. For example, NVIDIA says AI can help a factory optimize for electricity by building a "digital twin," or computer model, to test different scenarios using constantly updated data.

AI enthusiasts also tout the technology's potential to improve the power sector's performance. Later this year, Siemens Energy AG plans to start selling software called Omnivise that uses AI to help operators predict and respond to weather- and market-related events.

"Often AI is just brought as the tool. This is the hammer, and this is what we are going to use," Katie Hanley, global head of Omnivise Performance at Siemens Energy, said in an interview. "But we always must start with the problem. What is the problem? Is AI the right application for this?"

Hanley said tools like Omnivise can help power plants minimize their fuel consumption or help a solar-powered electrolyzer optimize its assets, to name a few examples. But even while advocating for AI's use in the power sector, Hanley said that on a broader level, the technology must be weighed against its trade-offs.

"NVIDIA is rolling out this GPU that's supposed to be 25 times more efficient. But as chips get more efficient, do we actually decrease our needs, or do we just fill that with something else?" Hanley asked. "Because, OK, if a datacenter gets more efficient because they're using this 25-times more efficient processing unit, they're just going to add capacity to the datacenter."

NVIDIA's carbon footprint

Parker joined NVIDIA in 2023 to lead the company's environmental, social and governance program, which existed at a much smaller scale at the time. Since then, NVIDIA has surpassed Microsoft as the world's most valuable public company.

By the end of this year, NVIDIA expects to meet its goal of powering its operations with 100% renewable electricity, Parker said. The target does not encompass the emissions of the datacenters that use NVIDIA's products, though many of those companies have their own climate targets.

Some of these consumers, including Microsoft and Google, have recently recorded increases in their emissions due to their investments in AI and growing datacenter fleets.

"ChatGPT kind of came out of nowhere two years ago. And the companies, the economy are still reacting to that," Parker said. "But you would expect we're in a period of churn because it's such a rapid deployment and a huge inflection point in technology that we're living through."

Utilities should still expect AI's load to increase, Parker added. But soon, that increase will be easier to forecast as industries pinpoint where the technology is best applied.

"Because we're at this inflection point with this fourth industrial revolution here, you expect there to be a lot of exploration that maybe isn't efficient," Parker said. "We're not hyperfocused in the first year of a new technology on efficiency; we're focused on, 'Let's see what it can do. Let's test the boundaries, see where it's useful, see where it's not.'"

'There should be concern'

Over the past year, the tech sector has stepped up efforts to address concerns over AI and datacenter power demand, according to Boris Gamazaychikov, senior manager of emissions reduction at Salesforce Inc., which specializes in cloud-based sales software.

"There should be concern," Gamazaychikov said in an interview. "This is definitely a situation, if done wrong, that could have negative impacts long term." In some regions, aging fossil fuel-fired power plants are operating longer than anticipated or coming out of retirement to serve datacenter demand, he noted.

A recent Salesforce survey of roughly 500 sustainability professionals found widespread anxiety over AI's energy demand. Almost 40% of respondents worried that AI could undermine their enterprise's environmental efforts. However, 57% also felt optimistic about balancing AI's benefits against its impacts, and 65% said AI has transformed their sustainability initiatives through energy efficiency gains, carbon emissions modeling, regulatory compliance and other uses.

Gamazaychikov counts himself among the optimists.

"The work we've done at Salesforce shows there is a huge difference in sustainable AI versus not," he said. "AI is not one thing, it's not monolithic. There's a wide spectrum of efficiency and a lot of things that can be done to prevent those worst predictions from coming true."