Tesla CEO Elon Musk is seen during a meeting with France's Finance Minister Bruno Le Maire outside Paris, France, on May 15, 2023. The French government is attempting to attract foreign industrial investment.
Back in 2016, Tesla CEO Elon Musk promised on Twitter that his self-driving technology would allow one of his cars to drive across the country unaided “by next year,” and that owners would be able to summon their cars from “anywhere connected by land & not blocked by borders, eg you're in LA and the car is in NY.” A legion of fanboys dreamed of monetizing their Teslas as robot taxis and being in the vanguard of the transportation revolution.
You might know by now that none of that ever happened. It was later discovered that Musk personally oversaw the production of a 2016 video that deliberately exaggerated the capabilities of Tesla’s Autopilot system. Nevertheless, Tesla has made Autopilot a standard feature in its cars and, more recently, rolled out a more ambitious “Full Self-Driving” (FSD) system to hundreds of thousands of its vehicles.
Now we learn from an analysis of National Highway Traffic Safety Administration (NHTSA) data conducted by The Washington Post that those systems, particularly FSD, are associated with dramatically more crashes than previously thought. Thanks to a 2021 regulation, automakers must disclose data about crashes involving self-driving or driver assistance technology. Since that time, Tesla has racked up at least 736 such crashes, causing 17 fatalities.
This technology never should have been allowed on the road, and regulators should be taking a much harder look at driver assistance features in general, requiring manufacturers to prove that they actually improve safety, rather than trusting the word of a duplicitous oligarch.
The primary defense of FSD is the tech-utopian assumption that whatever its problems, it cannot possibly be worse than human drivers. Tesla has claimed that the FSD crash rate is one-fifth that of human drivers, and Musk has argued that it’s therefore morally obligatory to use it: “At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it even though you’re going to get sued and blamed by a lot of people.”
Yet if Musk’s own data about the usage of FSD are at all accurate, this cannot possibly be true. On an investor call back in April, he claimed that 150 million miles had been driven with FSD, a reasonable figure given that it works out to just 375 miles for each of the 400,000 cars with the technology. Assuming that all of these crashes, and hence all 17 deaths, involved FSD—a plausible guess given that FSD has been dramatically expanded over the last year, and two-thirds of the crashes in the data have happened during that time—that implies a fatal accident rate of 11.3 deaths per 100 million miles traveled. The overall fatal accident rate for auto travel, according to NHTSA, was 1.35 deaths per 100 million miles traveled in 2022.
In other words, Tesla’s FSD system is likely on the order of ten times more deadly than the average human driver.
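For anyone who wants to check that arithmetic, here is a minimal back-of-the-envelope sketch in Python. It simply assumes, as above, that all 17 reported deaths occurred with FSD engaged, and it takes Musk’s 150-million-mile claim and NHTSA’s 1.35 figure at face value; the variable names are purely illustrative.

```python
# Back-of-the-envelope fatality-rate comparison (illustrative only).
# Assumes all 17 reported deaths happened with FSD engaged and takes
# Musk's claimed 150 million FSD miles and NHTSA's 2022 rate at face value.

fsd_fatalities = 17            # deaths in NHTSA-reported Tesla crashes
fsd_miles = 150_000_000        # Musk's claimed FSD mileage (April 2023)
us_rate_per_100m = 1.35        # NHTSA overall rate, deaths per 100M miles (2022)

fsd_rate_per_100m = fsd_fatalities / fsd_miles * 100_000_000
print(f"FSD: {fsd_rate_per_100m:.1f} deaths per 100M miles")              # ~11.3
print(f"Ratio vs. all US driving: {fsd_rate_per_100m / us_rate_per_100m:.1f}x")  # ~8.4x
```

Even if only half of those deaths actually happened with FSD active, the rate would still be several times the national average, which is the point the next paragraph makes.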
Now, this is just a preliminary analysis, so the exact figures will shift somewhat. But even if just half of the deaths NHTSA found were attributable to FSD, that would still make it dramatically more deadly than human drivers. Indeed, the true figure could be higher, since there is no way to know whether Tesla is reporting every crash accurately.
This result shouldn’t be surprising. When it comes to FSD, Elon Musk’s decision to forgo LIDAR (light detection and ranging) sensors and rely exclusively on cameras to see where the car is going has made the system demonstrably worse. That’s surely part of the reason why Teslas account for fully 91 percent of all self-driving-related crashes in the NHTSA data.
But there are strong reasons to be suspicious of any technology that can take full control of the car—as opposed to lane assist or automatic braking—while still needing human assistance on occasion. First, as any driving instructor in a car with a second set of controls knows, it is actually more difficult to serve as an emergency backup driver than it is to drive yourself. Instead of your attention being fully focused on driving the car, you are waiting on tenterhooks to see if you need to grab the wheel—and if that happens, you have to establish instant control over a car that may already be in motion, or in a dangerous situation.
Marques Brownlee, a prominent tech journalist on YouTube who is fairly positive about Tesla, published a video a few months ago exploring how the FSD system would work in taking his Model S Plaid—which starts at $94,990 in 2023, by the way—to the office. He had to intervene on a couple of occasions, once when the car missed an exit and again when it tried to take a toll lane blocked by a cone. “This is stressful,” he said. “This is not what you would hope, which is taking the stress out of driving … this is a lot more making sure the car doesn’t do weird things.” (In another video, he noted that the Plaid’s right turn signal, which as usual for Tesla is a touchscreen button instead of the traditional stalk, often fails to work.)
Second, despite Tesla’s insistence that drivers always be alert when using FSD, it is plain that many are not. Without the necessity of being in constant control at the wheel, attention tends to wander even when one is trying to remain vigilant. The Post report is full of crashes that a human driver should have been able to avoid, like running into a motorcycle that the automatic system apparently failed to recognize. (Motorcyclists killed in such crashes account for nearly a quarter of the fatalities.) Other drivers may be sleepy or drunk, hoping the system can get them home.
Others, perhaps taken in by Musk’s overheated hype about the quality of Tesla technology, simply trust the system mindlessly. Two men died in 2019 when they let their Tesla drive around with nobody in the driver’s seat and it crashed into a tree. Another was fiddling with his phone when his Tesla ran a stop sign and rear-ended another car, killing the driver.
Indeed, Tesla’s insistence that drivers always pay attention is as much protection from legal liability as it is anything else. All “state laws hold the human driver responsible for the operation of their vehicles,” an NHTSA spokesperson told the Post—a very convenient shield.
At any rate, it is long past time that NHTSA and other regulators took a much harder line with automakers. We need more regulations to protect pedestrians from oversized trucks, stricter ways to control rampant speeding, and a great deal more skepticism about whiz-bang computer gimmicks. And Elon Musk in particular needs to be taken down a peg. For far too long he has barged ahead with dangerous, half-baked technology in Teslas, lying out of both sides of his mouth, relying on his inflated Tony Stark reputation and regulatory timidity to get away with it. People are dead as a result, and it needs to stop.