New bombshell about Tesla's Autopilot system; Debate over Autopilot; Negative headlines can be a 'contra-indicator' for Tesla's stock
1) Regular readers know I always keep an eye on developments at Tesla (TSLA) and what they mean for the stock...
Whatever your opinion is on the stock, there's no question that Tesla is one of the most important companies in the world. And love him or hate him (I seem to be one of the few people in between) − the company's CEO Elon Musk is one of the most fascinating characters of all time.
The stock fell 4% yesterday – no doubt in part due to this bombshell 11-minute video the Wall Street Journal posted on Monday: The Hidden Autopilot Data That Reveals Why Teslas Crash. (I'll note that you need to be a subscriber to access it, but I'll give a summary below.)
The WSJ looked at more than 1,000 crashes that Tesla submitted to regulators (as required) and focused on 222. It found that 44 of them occurred when Teslas in Autopilot self-driving mode veered suddenly and 31 occurred when Teslas on Autopilot failed to stop or yield to an obstacle in front of them.
Then the WSJ did what nobody (including regulators) has done before...
Thanks to a hacker breaking through Tesla's robust encryption (the WSJ posted a video showing the process – it's very cool!), the WSJ was able to access the data the cars' onboard computers collected, which revealed what the cameras detected (and failed to detect) and what Autopilot did (and didn't) do.
Of course, Tesla has always had this data but has refused to release it, claiming it was proprietary.
The WSJ's analysis revealed deep flaws in the camera system – which Tesla relies on 100%, as it doesn't use light detection and ranging ("LiDAR") technology – and Autopilot, which led to dozens of accidents... many of them fatal.
Meanwhile, by coincidence, Bloomberg had a similar story on Monday: Tesla Analyst Nearly Crashes While Using 'Full Self-Driving'. Excerpt:
Elon Musk has said during Tesla's last two earnings calls that investors won't understand the company unless they're using the driver-assistance system marketed as Full Self-Driving ["FSD"].
William Stein, a Truist Securities analyst with a hold rating on Tesla's stock, took this as his cue to test-drive one of the carmaker's vehicles, and narrowly avoided a crash.
"The Model Y accelerated through an intersection as the car in front of us had only partly completed a right turn," Stein wrote in a report to clients Monday. "My quick intervention was absolutely required to avoid an otherwise certain accident."
Stein, who maintained his rating and $215 price target, emerged from the experience "befuddled at what Tesla might show" at an unveiling of robotaxi prototypes in October. Musk said last week that the company decided to delay the event by about two months, confirming an earlier Bloomberg News report.
And the article continues with more details about the test drives:
Both of Stein's drives were around New York suburbs in clear and dry conditions. He was impressed during his latest test how well FSD adapted to lane closures, potholes and traffic flows, and said the driving "felt more natural overall" than the prior trial.
What was surprising and went poorly, he said, was the system's permissiveness – he was no longer required to tug on the steering wheel to keep FSD engaged, and was able to continue using it even while taking his eyes off the road.
"I turned my head completely away from the road," Stein wrote, adding that his son kept a lookout for any danger. "The system continued for 20-40 seconds before issuing a warning"...
Stein concluded that the version of FSD he tested was "truly amazing, but not even close to 'solving' autonomy," alluding to language Musk has repeatedly used.
(Here's another Bloomberg article from back in May about issues with FSD: Tesla's 'Full Self-Driving' Struggled Soon After Leaving My Driveway... and here's another related article from yesterday: Tesla in Seattle-area crash that killed motorcyclist was using self-driving system, authorities say.)
2) After seeing the WSJ video, I wanted to get takes from two people I frequently turn to when it comes to Tesla...
First, I asked my friend Anton Wahlman for his thoughts. Longtime readers will likely recognize his name from previous daily e-mails when I've quoted him – he writes a blog on the automotive industry, focused on electric vehicles, and has long been bearish on Tesla. Here's what Anton sent me in a private text group:
This is a very powerful report by the WSJ... The problem is when [Tesla Autopilot] hurts someone else, whether inside your own car or another person in traffic, an innocent bystander, a bicyclist, etc.
This kind of Russian Roulette should not be allowed to be tested on public roads, where other people have not signed up to be killed by these 5,000-pound projectiles...
And as he continued:
Elon has skirted the long arm of the Department of Justice despite his false Autopilot/FSD promises. When you go to www.tesla.com/autopilot right now, there is a video, filmed in late 2016, that shows a car driving itself from the Safeway in Los Altos, through Los Altos Hills, past the home of Toyota's head of assisted-driver safety (I can see it in the video), and arriving at Tesla's headquarters in Palo Alto.
And just below the video, the webpage shows this:
For years up to this very day, in every possible way, Musk and Tesla have given the impression that Tesla [drivers] can engage the Autopilot, literally fall asleep, and the Tesla will navigate safely to the destination. This is incredibly dangerous because IT'S NOT TRUE!
And as Anton says, the consequences could be big:
This WSJ video may just be what finally wakes up the powers in Washington, D.C. Tesla can forget about having robotaxis approved now, for many years to come.
The real question is: will Musk and Tesla's board be held responsible for misleading the public? Will there be an indictment coming, not only against the company, but against Musk?
This is the kind of journalistic revelation that should cause a Congressional hearing. Has Musk ever had to testify under oath in front of Congress? The Senate has a Democratic majority and even in the House, where Democrats are in the minority, they could still request a hearing. Surely they could persuade at least a few Republicans to join them in calling for a hearing. After nearly a decade of Autopilot/"self-driving" promises that haven't panned out, why hasn't this happened yet?
The WSJ would never publish something like this without asking Tesla for comment. So Musk knew this story was coming. Perhaps that's why he sounded so defensive and uptight in his prepared remarks on last week's earnings call. This has clearly been brewing for months, maybe more than a year. I can smell some form of indictment coming...
My estimate from a decade ago remains: if full self-driving (driverless, robotaxi, call it whatever you want) is ever going to work properly, it will happen sometime after 2050. The technology for true self-driving – a vehicle without a steering wheel that can go anywhere, anytime, under any conditions – remains at least 30 or so years away.
3) In Tesla's defense, my analyst Kevin DeCamp – a longtime Tesla bull – wrote to Anton and me in our private text group:
The WSJ video seems like a bombshell, but it's not – though it's obviously worth discussing.
All the accidents the WSJ examined occurred with outdated Autopilot level 2 software, which has its flaws but has likely saved many lives.
A lot of auto accidents are due to texting, drinking, distracted and drowsy driving, etc. Autopilot solves almost all of these.
But saved lives don't make headlines...
Anton disagreed, writing:
If you are driving while texting, drinking, etc., you are not going to be able to keep Autopilot from driving you into oncoming traffic or rear-ending an 18-wheeler.
4) For a longer defense of Tesla, I came across this extended post on X – which argues that the WSJ story is "misinformed and out of date" and continues:
...Autopilot is a driver assistance system that cannot prevent every crash. It can prevent many crashes, including some a human might have missed – but it is not a 100% invincibility shield against anything that could happen. You still have to pay attention and be ready to [take over].
It is fully expected that the system will not be able to catch everything. That is why we have driver attention monitoring, which has recently been completely revamped.
That's the thing about these so-called "experts." They are not paying attention at all to the rapid advancements the system is making today. They show examples from years ago, almost all from the old Autopilot stack, despite the fact that it has been completely reengineered with FSD...
Absolutely, we need to raise awareness about the limitations of these technologies and not let people get a false sense of security. But blaming the technology when all the data we have suggests far fewer crashes and fatalities per mile when properly supervised is just misinformed.
5) Anton isn't buying these arguments, writing in our private text group:
There is so much to say in response to this tweet, I'd have to write an entire article. He seems upset that people have expectations that Tesla's systems are supposed to work. Tesla posted its "Paint It Black" video [in late 2016 and about Autopilot] – clearly setting the expectations – not on current hardware/software, but on what was available in the fourth quarter of 2016.
How many hundreds of people have died since 2016 because they became overconfident by watching Tesla's propaganda about how safe it is to use Autopilot?
6) But before seeing all this and rushing to short TSLA shares (which I've warned my readers for the past decade is a very dangerous short), be aware that big negative headlines can be a contra-indicator for the stock...
Consider this super-bearish article in the WSJ back on April 5: The Inside Tale of Tesla's Fall to Earth. Excerpt:
Tesla Chief Executive Elon Musk has spent years trying to build the automaker of the future. It's the electric-car company of the present that's now giving him trouble.
After a period of rapid expansion, the company has seen its sales fall and its once-enviable margins shrink. For the first time in years, the biggest question for Tesla is not whether it will be able to make enough cars, but whether people will buy them.
The company's stock, down 34% this year, has been the worst performer in the S&P 500 index. While Tesla remains the world's most valuable automaker by a wide margin, its market capitalization has tumbled by more than half since it peaked in 2021.
As the article notes, Tesla was facing several issues at once:
Consumer appetite for electric vehicles is cooling. The core of Tesla's lineup is dated. The company has been cutting prices to spur demand. Big, moonshot bets have not panned out as Musk predicted – at least not yet. And Chinese carmakers are now the ones that look like nimble, tech-savvy upstarts.
Additionally, Musk seems to have had his attention elsewhere...
Keep in mind that he had already bought Twitter (which he rebranded to X), had been selling Tesla stock, and had started an AI business. Meanwhile, as the WSJ article notes, he had "been picking fights with everyone from OpenAI to Disney Chief Executive Bob Iger."
And then, as the article continues:
This week, [Tesla] reported its first year-over-year decline in quarterly deliveries since 2020 – a result that badly missed Wall Street's expectations.
Surely the stock must have continued to fall in the nearly four months since then, right?
Not so fast...
As you can see in this chart, even after the past week's pullback, it's up 35% versus less than 5% for the S&P 500 since then through yesterday's close:

In summary, I would expect Musk and Tesla will face a lot of scrutiny for overpromoting the capabilities of FSD in a dangerously misleading way. But I still don't think Tesla is a good short – or a good long.
You don't have to be a hero by staking out a position in this battleground stock – just sit back, watch, and learn.
Best regards,
Whitney
P.S. I welcome your feedback – send me an e-mail by clicking here.