A federal report released today reveals that Tesla's Autopilot system was involved in at least 13 fatal crashes in which drivers misused the system in ways the automaker should have anticipated and done more to prevent. The report also calls Tesla an “industry outlier” because its driver assistance features lacked some of the basic precautions taken by its competitors. Regulators are now asking whether a Tesla Autopilot update designed to fix these fundamental design problems and prevent fatal incidents has gone far enough.
Those crashes left 14 people dead and 49 injured, according to data collected and published by the National Highway Traffic Safety Administration (NHTSA), the United States' federal road safety regulator.
At least half of the 109 “frontal plane” crashes scrutinized by government engineers (those in which a Tesla struck a vehicle or obstacle directly in its path) involved hazards visible five seconds or more before impact. That was enough time for an attentive driver to have avoided the crash or at least reduced its severity, government engineers concluded.
In one of these crashes, a March 2023 incident in North Carolina, a Model Y traveling at highway speed struck a teenager who was exiting a school bus. The teen was airlifted to a hospital for treatment of his serious injuries. NHTSA concluded that “the bus and pedestrian would have been visible to an attentive driver and would have allowed the driver to avoid or minimize the severity of this crash.”
Government engineers wrote that, throughout their investigation, they “observed a pattern of preventable crashes involving hazards that would have been visible to an attentive driver.”
Tesla, which disbanded its public affairs department in 2021, did not respond to a request for comment.
Damningly, the report calls Tesla an “industry outlier” in its approach to automated driving systems. Unlike other automakers, the report said, Tesla let Autopilot work in situations it wasn't designed for and failed to pair it with a driver engagement system that required its users to pay attention to the road.
Regulators concluded that even the Autopilot product name was problematic, encouraging drivers to rely on the system rather than collaborate with it. Competing automakers tend to use language like “assist,” “sense” or “team,” the report said, precisely because those systems are not designed to drive the vehicle on their own.
Last year, California state regulators accused Tesla of falsely advertising its Autopilot and Full Self-Driving systems, alleging that the company misled consumers into believing its cars could drive themselves. In a filing, Tesla said the state's failure to object to the Autopilot branding for years amounted to an implicit endorsement of the automaker's advertising strategy.
NHTSA's investigation also concluded that, compared with competing products, Autopilot resisted drivers' attempts to steer the vehicle themselves, a design that, the agency wrote in the summary of its nearly two-year investigation into Autopilot, discourages drivers from staying involved in the task of driving.
A new Autopilot probe
These crashes occurred before Tesla recalled its Autopilot software and pushed a fix via an over-the-air update earlier this year. In addition to closing that investigation, regulators have opened a new one into whether the updates Tesla released in February did enough to prevent drivers from misusing Autopilot, misunderstanding when the feature was actually engaged, or using it in places where it was not designed to operate.
The review comes after a Washington state driver said last week that his Tesla Model S was in Autopilot mode, and that he was using his phone, when the vehicle struck and killed a motorcyclist.