Big Tech ramps up capex as 1st signs of AI returns emerge

Big Tech companies significantly boosted their capital expenditures in the third quarter, reflecting a strategic focus on expanding AI infrastructure.

Amazon.com Inc. led its peers in capital expenditure growth with a 44% year-over-year increase to $22 billion. It was followed by Alphabet Inc., which reported capex growth of 35%; Microsoft Corp., with 34%; and Meta Platforms Inc., at 21%. All told, capex from the four companies totaled $58.86 billion in the third quarter, up from $36.95 billion a year earlier.
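As a rough check on these figures, the combined growth rate implied by the reported totals works out to roughly 59% year over year. The short Python sketch below uses only the numbers cited above; Amazon's prior-year figure is backed out from the stated 44% increase and is an estimate rather than a reported value.

```python
# Quick arithmetic check on the capex figures cited above (all in US$ billions).
# The derived values are estimates implied by the reported numbers.

q3_total_latest = 58.86    # combined Big Tech capex, latest quarter
q3_total_prior = 36.95     # combined capex a year earlier

combined_growth = (q3_total_latest / q3_total_prior - 1) * 100
print(f"Combined capex growth: {combined_growth:.1f}% year over year")   # ~59.3%

amazon_latest = 22.0       # Amazon capex, latest quarter
amazon_growth = 0.44       # reported 44% year-over-year increase
amazon_prior = amazon_latest / (1 + amazon_growth)
print(f"Implied Amazon capex a year earlier: ~${amazon_prior:.1f}B")      # ~$15.3B
```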

Amazon CEO Andy Jassy said during an earnings call that the company is experiencing strong demand for generative AI, prompting investments in its cloud business, Amazon Web Services.

"[GenAI is] growing triple-digit percentages year-over-year, and it's growing 3x faster at its stage of evolution than AWS did itself. We thought AWS grew pretty fast," Jassy said.

Microsoft shared a similar sentiment, saying its AI business is on track to reach $10 billion this year. CEO Satya Nadella said most of this revenue comes from inference usage in its products GitHub Copilot and M365 Copilot rather than from raw GPU demand. Inference is the stage in which a trained AI model is fed inputs, such as prompts, to generate new output. Notably, inference comes after AI training, which is a more computationally intensive process.
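To illustrate the distinction in general terms — this is a toy sketch, not a depiction of Microsoft's systems — the Python snippet below fits a tiny one-parameter model with many gradient-descent passes over data (training), then answers a new input with a single forward pass (inference). The gap in compute between the two stages is the point.

```python
# Toy illustration of training vs. inference (unrelated to any vendor's stack).
# Training: many passes over data to fit a parameter. Inference: one cheap forward pass.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # inputs x and targets y, where y = 2x

# --- Training: iterative and compute-heavy ---
w = 0.0                        # model parameter to learn
learning_rate = 0.05
for epoch in range(200):       # repeated passes over the data
    for x, y in data:
        prediction = w * x
        gradient = 2 * (prediction - y) * x   # d(squared error)/dw
        w -= learning_rate * gradient

print(f"learned weight: {w:.3f}")    # ~2.0 after training

# --- Inference: a single forward pass with the frozen, trained model ---
def infer(x: float) -> float:
    return w * x                     # no gradients, no parameter updates

print(infer(10.0))                   # ~20.0
```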

Microsoft is among the first Big Tech players to successfully monetize GenAI demand at the application layer, in which companies combine third-party foundation models with their own data or applications to create new use cases or target specific vertical markets.
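In practice, the application-layer pattern typically means wrapping a third-party model behind a product feature: proprietary data is retrieved and folded into the prompt before the model is called. The sketch below is a generic, hypothetical illustration of that pattern; call_foundation_model is a stand-in for whatever hosted model API a company actually uses, not a real endpoint.

```python
# Generic sketch of an application-layer GenAI feature: a third-party foundation
# model is combined with a company's own data. The model call is a hypothetical
# placeholder, not a real vendor API.

COMPANY_KNOWLEDGE = {
    "refund policy": "Refunds are issued within 14 days of purchase.",
    "shipping": "Standard shipping takes 3-5 business days.",
}

def retrieve_context(question: str) -> str:
    """Pull internal documents whose topic appears in the question."""
    return "\n".join(
        text for topic, text in COMPANY_KNOWLEDGE.items() if topic in question.lower()
    )

def call_foundation_model(prompt: str) -> str:
    """Hypothetical placeholder for a hosted foundation-model API call."""
    return f"[model answer based on a prompt of {len(prompt)} characters]"

def answer_customer(question: str) -> str:
    context = retrieve_context(question)
    prompt = (
        "Answer using only the company documents below.\n"
        f"Documents:\n{context}\n\nQuestion: {question}"
    )
    return call_foundation_model(prompt)

print(answer_customer("What is your refund policy?"))
```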

Alphabet reported a 35% increase in cloud revenue to $11.4 billion, with requests to its generative AI app, Gemini, increasing fourteenfold. Alphabet did not disclose specific revenue details from its GenAI applications. However, NotebookLM, a free GenAI app enabling users to create podcasts from uploaded documents and notes, has gained popularity, and the company is launching the app for business use.

Meta Platforms, which open-sourced its Llama model for free consumer and business use, does not directly generate revenue from its AI products. However, the company said enhancements in its AI-driven feed and video recommendations have increased user time on Facebook and Instagram. Meta has been one of the largest AI spenders relative to its revenue, despite not seeking direct monetization of the technology. CEO Mark Zuckerberg said the company is training Llama 4 on over 100,000 NVIDIA Corp. H100 AI GPUs.

The GenAI wave has presented Big Tech with both the threat of disruption and the opportunity for profit. Microsoft was an early collaborator with OpenAI LLC. The Redmond, Washington-based tech juggernaut operates in the infrastructure layer, offering cloud and software services, and in the application layer, primarily for enterprise customers. This early partnership, combined with an established enterprise distribution network, has enabled Microsoft to swiftly roll out GenAI enterprise products and monetize them.

Amazon appears to focus more on providing cloud infrastructure, with its partnership with Anthropic PBC primarily aimed at enhancing consumer products such as Alexa.

Like Microsoft, Alphabet operates in both the infrastructure and application layers, though its applications are unlikely to generate substantial revenue at present. Once a leader in AI, Alphabet missed the initial AI hype, losing early ground to OpenAI and other startups — though it seems to be catching up.

Meta's strategy involves sacrificing short-term profits to establish Llama as a standard foundation model in the emerging GenAI sector.

"Part of why we are leaning so much into open source here in the first place is that we found counterintuitively with Open Compute that by publishing and sharing the architectures and designs that we had for our compute, the industry standardized around it a bit more," Zuckerberg said.