Alphabet has replaced Nvidia as the AI market darling; The numbers don't add up for massive AI spending; Mike Burry's bearish views on AI and Nvidia's response
1) It can be overwhelming trying to follow the latest developments in artificial intelligence ("AI") and the bull-bear debate over the leading companies. But it's what's driving the markets and the economy these days, so let's dive in...
This article from yesterday in the Wall Street Journal highlights how one of my favorite stocks, Alphabet (GOOGL), has replaced Nvidia (NVDA) as the market darling:
Worries about the AI trade that buffeted markets recently have weighed particularly heavily on Nvidia, a bellwether of investors' enthusiasm for big tech. At the same time, Alphabet has defied the trend, with investors rewarding the company for both its AI advances and its strong core advertising and search businesses.
Alphabet shares climbed as much as 3.2% in early trading after The Information reported on talks with Meta about using Google chips to run its data centers. Alphabet last year increased production of its semiconductors, an effort that potentially reduces reliance on outside vendors.
After tracking each other closely for most of the year, the two stocks have diverged sharply in the past month:
Another related story in yesterday's WSJ shows how Alphabet is "defying fears of an AI bubble" and outperforming the other big tech stocks:
Someone forgot to tell Google about the whole "AI bubble" thing.
In a brutal month for tech stocks – especially those closely associated with the artificial-intelligence race – the internet-search giant has solidly bucked the trend. Parent company Alphabet's stock has jumped around 16% since the Nasdaq peaked on Oct. 29, adding to a run that began in early September when the company won a court ruling that effectively ended worries about a government-imposed breakup.
Meanwhile, Microsoft, Oracle, Nvidia and Meta Platforms have seen double-digit declines since the Nasdaq high...
In the chart above, I'm also glad to see my other favorite Magnificent Seven stock, Amazon (AMZN), not falling like the rest since the Nasdaq Composite Index's peak.
Lastly on the topic, my friend Bryan Lawrence of Oakcliff Capital made a post on social platform X about the Google-Nvidia rivalry. He concluded:
Nvidia's strategy is to support its customers' efforts to raise investor capital until their businesses capture enough of that 99.9% spread to be self-sustaining, and to hope that its own 75% gross margins are sustainable. A loss of investor confidence impairs this strategy, which [is] why the rebuttal was issued.
Google's strategy is to drive costs down across the full stack, to protect Search's advertising business and to generate profits wherever they end up landing. Its founders have said they are willing to bankrupt the company rather than lose in AI.
Investors chasing winner-take-all outcomes across the AI stack may be disappointed. Based on how unit economics look today, much of AI looks like a commodity business in which low cost wins.
Nvidia's and Google's strategies are in obvious conflict. Some investors will make a lot of money and others will lose a lot. More will be revealed as this unfolds. Pass the popcorn.
I continue to recommend Alphabet but not Nvidia. Though, as I've explained in many previous e-mails, if I held it and had big gains, I'd let this winner run.
2) If you want to better understand the differences between AI chips, I recommend this 16-minute CNBC video:
I don't claim to be an expert on these various chips, but I have enough understanding to feel confident that Google's tensor processing unit ("TPU") chips are winners.
3) A major concern is whether leading companies are spending too much on AI. This Financial Times article questions whether the numbers add up for OpenAI, maker of ChatGPT:
Based on a total cumulative deal value of up to $1.8 [trillion], OpenAI is heading for a data centre rental bill of about $620 [billion] a year – though only a third of the contracted power is expected to be online by the end of this decade...
Squaring the first total off against the second leaves a $207 [billion] funding hole...
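The FT's underlying totals aren't shown in this excerpt, but the shape of the math is simple to see. Here's a rough back-of-the-envelope sketch: the only figure taken from the quote is the roughly $620 billion annual rental bill; the revenue and funding inputs are purely hypothetical assumptions of mine, just to illustrate how a gap of that order emerges.

```python
# Back-of-the-envelope funding-gap sketch.
# Only the ~$620B/year rental figure comes from the FT excerpt;
# everything else is a hypothetical assumption for illustration.

annual_rental_bill = 620e9       # ~$620B per year (FT excerpt)
assumed_annual_revenue = 100e9   # hypothetical revenue assumption
assumed_annual_funding = 100e9   # hypothetical new investor capital per year

gap = annual_rental_bill - (assumed_annual_revenue + assumed_annual_funding)
print(f"Hypothetical annual gap: ${gap / 1e9:.0f} billion")
# Hypothetical annual gap: $420 billion
```

The point isn't these particular numbers – it's that rental obligations of that size dwarf any plausible near-term revenue and fundraising, which is how you end up with a funding hole in the hundreds of billions.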
Along the same lines, this WSJ article uses Intel (INTC) as a cautionary tale on too much AI spending:
It has become a tech-industry truism: Spending too little on chips and other computing infrastructure for artificial intelligence is riskier than spending too much...
But investors have begun questioning this logic, fretting that the spending spree might be inflating a bubble that will inevitably pop. Indeed, spending too much can be very bad. Just ask Intel...
The result has been nothing short of a disaster. Manufacturing projects have been put on hold or canceled. The company has been bleeding cash, with free cash flow in negative territory in all but three of its past 14 quarters.
I think it's highly unlikely that OpenAI will be able to fund its projected growth plans and will have to scale them back. That would impact Nvidia, Oracle (ORCL), and Microsoft (MSFT).
4) My old friend Mike Burry of The Big Short fame has been among the most vocal commentators calling the AI craze a bubble.
He has launched a new Substack page, Cassandra Unchained, and posted two in-depth essays outlining his concerns (a paid subscription is required to read them):
In the first essay, he compares recent AI capital expenditures, as a percentage of GDP, with those of past bubbles: the Internet boom of the late 1990s, the housing boom leading up to 2008, and the oil shale revolution from 2012 to 2014:
He concludes:
And once again there is a Cisco at the center of it all, with the picks and shovels for all and the expansive vision to go with it. Its name is Nvidia.
In his second essay, Burry questions whether "today's five public horsemen" – referring to Alphabet, Amazon, Oracle, Microsoft, and Meta Platforms (META) – are properly accounting for their massive AI spending.
In particular, they appear to be extending the claimed useful life of the chips they're buying (mostly from Nvidia), as Burry shows in this table:
He argues:
Extending useful life decreases depreciation expense and increases apparent profits. It is one of the more common frauds of the modern era and results in overvalued assets and overstated profits.
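To see the arithmetic Burry is pointing to, here's a minimal sketch of straight-line depreciation using purely hypothetical numbers (a $10 billion GPU purchase and $5 billion of pre-depreciation operating income – illustrative figures, not any company's actual disclosures). Stretching the assumed useful life from three years to six cuts the annual depreciation charge in half, and every dollar of depreciation that disappears from the income statement shows up as an extra dollar of reported profit.

```python
# Minimal sketch of how extending useful life changes reported profit.
# All figures are hypothetical, not any hyperscaler's actual numbers.

def straight_line_depreciation(cost: float, useful_life_years: int) -> float:
    """Annual depreciation expense under straight-line depreciation."""
    return cost / useful_life_years

gpu_capex = 10e9          # hypothetical $10 billion of GPU purchases
operating_income = 5e9    # hypothetical pre-depreciation operating income

for life in (3, 6):
    dep = straight_line_depreciation(gpu_capex, life)
    profit = operating_income - dep
    print(f"{life}-year life: depreciation ${dep / 1e9:.2f}B -> reported profit ${profit / 1e9:.2f}B")

# 3-year life: depreciation $3.33B -> reported profit $1.67B
# 6-year life: depreciation $1.67B -> reported profit $3.33B
```

In this toy example, doubling the assumed useful life doubles reported profit – which is exactly the effect Burry is flagging.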
Nvidia is so concerned about Burry's critique that it took the unusual step of releasing a seven-page rebuttal – but only to Wall Street analysts. (You can read it here.)
Burry highlights this paragraph from the rebuttal:
NVIDIA's customers depreciate GPUs over 4-6 years based on real-world longevity and utilization patterns. Older GPUs such as A100s (released in 2020) continue to run at high utilization and generate strong contribution margins, retaining meaningful economic value well beyond the 2-3 years claimed by some commentators.
And he responds, concluding:
The implication that useful life for depreciation is longer because chips from 4-6 years ago are "fully utilized" confuses physical utilization with value creation. Just because a widget is used does not mean the widget is profitable to a degree that it is worth more than residual value...
I estimate every one of these hyperscalers will overstate earnings by double digits, and each one will have tens of billions of overstated assets vulnerable to write down.
As a final note, Burry warns about the risk AI poses to private credit:
Quick PSA: Friends don't let friends' parents buy private credit offerings. Just say no to that wealth manager.
Private credit has exploded post-GFC and is non-bank financing. The lenders are generally private credit funds sponsored by the big private equity firms, and the space has grown like a weed because it is lightly regulated. Not usually a good sign.
The big target for private credit now is AI data centers and anything inside them or in the immediate vicinity.
The big hyperscalers are richer than many countries, so why are they using so much private credit for the development of data centers? That is a very good question.
I'll give you a hint, as it ties into the thesis of this whole post – there is a duration mismatch of catastrophic proportions between the asset and the loan.
I agree with Burry on these risks and share his concerns about overspending in the AI sector.
However, I remain bullish on my longtime favorites, Alphabet, Amazon, and Meta. They're incorporating AI into their already fantastic businesses in ways that will make them even more profitable.
Best regards,
Whitney
P.S. I welcome your feedback – send me an e-mail by clicking here.
P.P.S. Our offices will be closed tomorrow and Friday in observance of Thanksgiving, so look for my next daily e-mail on Monday, December 1. Have a wonderful holiday!





