Highway crashes involving Teslas in Autopilot mode have long been controversial, with the company attributing most of the accidents to operator error.
But a new investigation by the Wall Street Journal tells another story. While the crash data Tesla has submitted to federal regulators is heavily redacted, the Journal matched those reports to data collected by the states. The Journal says the reports call into question the safety of Tesla’s camera-based system.
Teslas operating in Autopilot mode have been involved in hundreds of crashes since 2016. Tesla has maintained all along that the technology is safe.
The National Highway Traffic Safety Administration (NHTSA) opened a new investigation in 2022 after a Tesla crash in Newport Beach, Calif., killed three people.
In that accident, the Tesla reportedly hit a curb and ran into construction equipment, killing all three of the vehicle’s occupants and sending three construction workers to the hospital with non-life-threatening injuries. At the time, local police declined to say whether Autopilot was involved.
In April, NHTSA issued a report that examined hundreds of accident reports and concluded there was one main reason for the crashes: drivers were putting too much faith in the technology and weren’t paying enough attention to the road.
“Throughout the [two] investigations, ODI (Office of Defects Investigation) observed a trend of avoidable crashes involving hazards that would have been visible to an attentive driver,” the report’s authors wrote. “Before August 2023, ODI reviewed 956 total crashes where Autopilot was initially alleged to have been in use at the time of, or leading up to, those crashes.”
Other issues
Some Tesla owners also report “phantom braking” when the car is in cruise control, not Autopilot. Dane, of Las Vegas, told ConsumerAffairs that his Tesla will suddenly brake for no apparent reason on rural highways.
“The highway can be perfectly striped, and with no car in sight,” Dane wrote in a review. “It can also happen when you're passing cars or semi-trucks. I would say 85% to 90% of the time it is a soft brake. But sometimes it can be a dangerously sudden hard brake.”
In 2021, NHTSA ordered automakers to report all crashes involving semi-autonomous driving systems. The Journal maintains that much of the data submitted by Tesla remains under wraps because the company says it is proprietary information.
Of the roughly 1,000 crash reports Tesla has submitted, the Journal said it was able to connect more than 200 to state reports. Of those, the Journal reports that 44 crashes occurred when a car in Autopilot veered suddenly, and 31 occurred when the car failed to stop or yield to an object in the roadway.