Soon after Chinese AI company DeepSeek rattled the stock market, as investors saw its novel, affordable artificial intelligence models as an ominous sign for US tech stocks, a wave of Big Tech companies reported quarterly earnings in recent weeks, including six of the so-called Magnificent 7 firms.
The US tech industry was so shaken by DeepSeek’s innovations that AI darling Nvidia saw $600 billion cut from its market cap in a single day, the largest loss in stock market history, as investors worried that Big Tech would slow down its investment in AI hardware.
Meanwhile, some US lawmakers are pushing to ban DeepSeek’s app from government-owned devices, according to the Wall Street Journal.
On earnings calls with analysts, executives across these companies either praised DeepSeek’s AI models, dismissed them, or avoided the topic entirely.
Their reactions to the sell-off, which hit most US tech stocks, ranged from sober to enthusiastic. Most agreed that the DeepSeek news was a signal that AI costs will gradually fall, but they reaffirmed their commitments to significant capital expenditures and additional investments in AI infrastructure in 2025, despite the lack of clarity about when those investments will pay off.
Here’s what they said.
Microsoft CEO Satya Nadella was quick to embrace DeepSeek, mentioning the firm in his opening remarks on the company’s post-earnings call on Jan. 29.
Nadella pointed to Microsoft’s move to put DeepSeek’s latest AI model on its developer platforms, Azure AI Foundry and GitHub, adding that it went through “automated red teaming, content safety integration, and security scanning.” He said consumers will soon be able to run DeepSeek’s models directly on Microsoft’s AI PCs.
“I think DeepSeek has had some real innovations,” Nadella said, adding that he sees AI getting “commoditized.”
“For a hyperscaler like us, a PC software company like us, this is all good news as far as I’m concerned.”
Meta, which one tech analyst recently described as well positioned given its advertising business, saw its stock rise on DeepSeek’s debut of its new AI model called R1, with shares gaining nearly 2% on the day of the news.
CEO Mark Zuckerberg was unconcerned about the DeepSeek craze.
Asked on a call following the company’s most recent quarterly earnings whether the potential for lower-cost AI models might change Meta’s capital expenditures, Zuckerberg said, “I don’t know — it’s probably too early to really have a strong opinion on what this means for the path around infrastructure and expenditure and things like that.”
However, DeepSeek also competes with Meta, which has sought to make its open-source Llama AI models the industry standard. DeepSeek’s models are also open source.
“I also just think in light of some of the recent news, the new competitor, DeepSeek from China… one of the things that we’re talking about is there’s going to be an open-source standard globally,” Zuckerberg said.
“And I think for our kind of own national advantage, it’s important that it’s an American standard.”
Asked for his perspective on “the DeepSeek situation,” Apple CEO Tim Cook said on a post-earnings call Jan. 30 that “innovation that drives efficiency is a good thing” and noted that the iPhone maker takes a “very prudent and deliberate approach to our expenditure.”
Apple shares rose more than 3% the same day DeepSeek released its most recent R1 model.
Asked what DeepSeek’s low-cost model means for Google, CEO Sundar Pichai said that while DeepSeek’s team has “done very, very good work,” he thinks Google’s Gemini Flash models are better.
“I would say both our 2.0 Flash models, our 2.0 Flash Thinking models, they are some of the most efficient models out there, including comparing to DeepSeek’s V3 and R1.”
“And I think a lot of it is our strength of the full-stack development [Google makes its own custom AI chips as well as AI models and the software that runs on them], end-to-end optimization, our obsession with cost per query,” Pichai added.
“I think part of the reason we are so excited about the AI opportunity is because we know we can drive extraordinary use cases because the cost of using it consistently falls, making more use cases feasible,” he said.
Amazon CEO Andy Jassy said he thinks DeepSeek’s models won’t spark a downturn in AI investment.
“Sometimes people make the assumptions that if you’re able to decrease the cost of any type of technology component, in this case, we’re really talking about inference [running AI models], that somehow it’s going to lead to less total spend in technology. And that is never the case, in our opinion.”
Jassy cited the company’s aggressive spending on building its cloud infrastructure in the early 2000s, even as costs decreased.
He’s right, at least for now.
Meta, Alphabet, Amazon, and Microsoft said on their earnings calls that, despite the expectation that AI training and inference costs will come down, they will spend a cumulative $325 billion in 2025, a 46% increase from the prior year. Amazon accounts for the largest share of that spending.
Investors weren’t pleased: Amazon’s stock dropped 4% Friday following executives’ commentary that they expect to hike capital expenditures by 35% to more than $100 billion.
AMD CEO Lisa Su is likewise convinced that innovations like DeepSeek’s models won’t lower AI investment, citing the recently announced $500 billion Stargate AI infrastructure project backed by SoftBank, Oracle, and OpenAI.
“All of these initiatives require a lot of new compute power and offer AMD unmatched growth opportunities across our businesses.”
Qualcomm CEO Cristiano Amon was perhaps the most excited about DeepSeek’s innovations, arguing that these developments could shorten the time it takes for AI to go mainstream and spark a new wave of demand for smartphones and PCs.
“DeepSeek-R1 and other similar models recently demonstrated the AI models are developing faster, becoming smaller, more capable and efficient, and now able to run directly on device,” Amon said.
“As we enter the era of AI inference, we expect that while training will continue in the cloud, inference will run increasingly on device, making AI more accessible, customizable, and efficient. This will encourage the development of more targeted, purpose-oriented models and applications, which we anticipate will drive increased adoption, and, in turn, demand for Qualcomm platforms across a range of devices.”
Rene Haas, the CEO of Arm, also argued that less expensive AI would play to the UK-based chip designer’s strengths in consumer devices, even taking a jab at Nvidia.
“As wonderful a product as [Nvidia’s] Grace Blackwell is, you’d never be able to put it in a cell phone, you’d never be able to put it into earbuds. You can’t even put it into a car. But Arm is in all those places.”
Laura Bratton is a reporter for Yahoo Finance. Follow her on Bluesky @laurabratton.bsky.social. Email her at laura.bratton@yahooinc.com.