The National Highway Traffic Safety Administration (NHTSA) is investigating a crash in Detroit involving a Tesla vehicle that may have been in Autopilot mode. The vehicle reportedly drove beneath a semitrailer, critically injuring two people. Local police say they are still investigating whether the driver was using Autopilot.
The agency called the incident a “violent crash,” one eerily similar to two previous crashes in Florida in which Teslas drove beneath tractor-trailers, each resulting in a fatality. In one of those cases, regulators found that Autopilot was not at fault.
In both Florida crashes, the vehicles were being driven with the assistance of Tesla’s Autopilot feature, a partially automated driving system. When the company first released Autopilot, it said the feature was designed to “give you more confidence behind the wheel, increase your safety on the road, and make highway driving more enjoyable.”
However, that was six years and several accidents ago. Now, the company has scaled back its language to say that Autopilot is “a hands-on driver assistance system that is intended to be used only with a fully attentive driver,” adding that it “does not turn a Tesla into a self-driving car nor does it make a car autonomous.”
No such thing as a “self-driving car”
In its investigations of previous Tesla Autopilot-related crashes, the National Transportation Safety Board (NTSB) cited the shortcomings of these new automotive technologies. “This tragic crash clearly demonstrates the limitations of advanced driver assistance systems available to consumers today,” said NTSB Chairman Robert Sumwalt.
“There is not a vehicle currently available to U.S. consumers that is self-driving. Period. Every vehicle sold to U.S. consumers still requires the driver to be actively engaged in the driving task, even when advanced driver assistance systems are activated. If you are selling a car with an advanced driver assistance system, you’re not selling a self-driving car. If you are driving a car with an advanced driver assistance system, you don’t own a self-driving car,” Sumwalt said.
Sumwalt recently went after the NHTSA for its “continued failure to recognize the importance of ensuring that acceptable safeguards are in place” when it comes to testing advanced driver assistance systems. He said factors such as roadway type, geographic location, roadway markings, weather, speed range, and lighting conditions can all limit the ability of these systems to operate safely.
Despite these limitations, Sumwalt pointed out in a February 1 letter to the Department of Transportation that vehicle manufacturers can operate and test vehicles virtually anywhere. He said that’s only possible because the NHTSA has no requirements in place.
Sumwalt cited Tesla’s recent beta release of its Level 2 Autopilot system, which the company describes as having full self-driving capability, as an example.
“By releasing the system, Tesla is testing on public roads a highly [automated vehicle] technology but with limited oversight or reporting requirements. Although Tesla includes a disclaimer that ‘currently enabled features require active driver supervision and do not make the vehicle autonomous,’ NHTSA’s hands-off approach to oversight of AV testing poses a potential risk to motorists and other road users,” he said.