We're Crawling Toward an AI Truth
Dear subscriber,
Markets are an unstoppable march toward truth, all telegraphed through the mechanism of price.
That's what I love about them.
Right now, the market is nearing a truth about artificial intelligence ("AI")... It's figuring out just who will actually get rich from this technology.
When AI exploded with the launch of OpenAI's ChatGPT in November 2022, the tech world lost its mind with excitement. It looked like the company's AI breakthroughs would lead to billions of dollars in revenue and wealth.
But since the entire concept was so new, it was hard to understand exactly what the products would be... and what parts of the "value chain" would actually benefit.
It doesn't seem like there will be much value in the AI models themselves.
Take large language models ("LLMs"), like ChatGPT. These tools provide (mostly) coherent answers to almost any question you could ask. They do things you'd have deemed impossible before you saw them with your own eyes.
But there's little secret to it... Google's Gemini, Anthropic's Claude, X's Grok, and other LLMs do the same exact things with just about the same level of intelligence, depending on what metric you use to compare them.
There's also Meta Platforms' (META) recently released Llama 3.1. It matches or beats OpenAI's models on most measures. And Meta's entire model is free, open-source, and available for download.
With all the competition in a commodity product, it doesn't look like the money will be made by making LLMs.
It also doesn't look like chatbots will be the true use of AI... I'd be surprised if pleading with a computer via written commands – cajoling it into completing tasks for you while sorting through plenty of wrong answers – is the best use of this advanced technology.
Indeed, earlier this week, Morgan Stanley reported that a chief information officer from an unnamed pharmaceutical company canceled a contract to use Microsoft's (MSFT) AI tool, Copilot.
With Copilot, you can search for information, generate e-mails and summaries, and even create images and presentations. The pharma company's executive said the price to use the AI feature is double the normal Microsoft 365 cost and that slides it generated were akin to "middle school presentations."
Other AI applications in the real world seem frivolous...
Travel-booking company Expedia (EXPE) is touting a chatbot that will help you find out where you want to go on vacation.
I don't know about you... but I already have a whole list of places I want to travel. (Perhaps if AI leads to true productivity gains, I'll get a chance to go there.)
So look for riches elsewhere in the value chain.
Most investors have already figured this part out... by investing in semiconductor giants like Nvidia (NVDA) and other AI-technology plays.
They understand that AI requires massive amounts of computing power just to run – no matter what form this tech eventually takes. That means more spending on power plants, data centers, semiconductor chips, and just about everything else that drives AI.
And it's this spending that we're getting insights into this week...
In particular, earnings from Microsoft and Meta – announced on July 30 and 31, respectively – are helping us understand the true value of AI.
Microsoft's earnings were just fine, but the stock still fell.
Revenue growth in its closely watched Azure cloud division came in at 29%. But that's slower than the 31% growth in the previous quarter.
More shocking, though, are Microsoft's capital expenditures ("capex"), which totaled $19 billion – nearly all of it on cloud- and AI-related investments. About half of that went to building and leasing data centers, with the remainder used to buy the servers, central processing units ("CPUs"), and graphics processing units ("GPUs") to fill them.
Yes, Microsoft earns a lot of money. But this is a huge ramp-up in spending...
The next day, Meta announced that it deployed $8.5 billion in capex over the quarter for investments in servers, data centers, and network infrastructure. It expects capex to total $37 billion to $40 billion for the year... which would be up from $28 billion in 2023.
Microsoft's earnings announcement sent its stock down a bit, while Meta's announcement sent shares up a bit.
However, both reports led to a surge in Nvidia's stock... which finished up more than 12% on July 31.
That's because those huge expenses for the cloud providers turn into revenues for Nvidia.
The Big Tech companies don't want to ease off spending. They believe that if they cede the ground to their competitors now, they'll miss out on AI.
So they're pursuing AI dominance with religious fervor. Obsolescence means death in tech.
Investors, on the other hand, are getting antsy. They aren't seeing any kind of revenue from AI... nor are they seeing any real AI products.
At a certain point, the AI spending has to produce results or even the mega-profitable mega-caps will start feeling pressure from investors.
Meta's own pursuit of the metaverse is a perfect example of this...
The company thought an online world was the future, spent billions trying to build it, and only reversed course when investors punished its stock.
I'd bet investors give Big Tech two or three more quarters to generate real business results from AI before they start pressuring companies to shut off the spending spigot.
It's taking a while, but the market will eventually find out what AI is really worth.
What Our Experts Are Reading and Sharing...
If Expedia's AI travel bot sounds useless, a proposed AI tool written about in the MIT Technology Review is just scary. A bioethicist from the National Institutes of Health is working on an AI tool that reads your medical data, personal messages, and social media... and creates your "digital psychological twin" to help family members make end-of-life decisions if you're incapacitated. The tool has yet to be built, and rolling it out would demand extreme care and ethical consideration, but the thought is a bit chilling.
This Bloomberg Odd Lots podcast has a great interview on how AI chips work... and how someone could possibly rise to compete with Nvidia. The interview features Reiner Pope and Mike Gunter, founders of a chip startup called MatX – which aims to build the "ultimate chip." They say it'll take three to five years for the company to launch its own chips. And the details are just the right level of wonky. You can find the interview on Apple Podcasts and Spotify, too.
If you want to see how Silicon Valley justifies these tens of billions of dollars in capex, read The Innovator's Dilemma by Clayton M. Christensen. Everyone in the Valley has. The lesson of the book is that every tech company becomes obsolete... and you can't afford to miss the thing that puts you out of business. Here's a good synopsis.
New Research in the Stansberry Investor Suite...
Back to those billions being spent on data centers...
Did you know that a typical data center consumes 10 to 50 times as much energy per square foot as an ordinary office building?
The money Big Tech firms are spending on data centers pales in comparison to the cost of supplying them with power over the years they'll operate. Projections show that over the next six to seven years, new power usage from data centers will be equivalent to the power consumption of 17 Seattles.
Stansberry Investor Suite subscribers have a great piece of AI-power-related research waiting for them...
In this month's issue of Stansberry's Investment Advisory, editor Whitney Tilson and his team reveal an energy play that could drive one company's free cash flow much higher in the next few years.
Thanks to an exciting decision made by management, this company stands to see major upside – in its business and its stock – as power needs soar.
What's more... this research draws on the long history of Stansberry Research. This company operates in a field that was unheard of a decade ago – except by Stansberry Research readers.
Now, it's the technology that's likely powering the lights over your head.
Our Investment Advisory team knows this industry like no one else. Investor Suite subscribers can read their new issue here.
Until next week,
Matt Weinschenk
Director of Research