Lawsuit accuses Tesla of misleading consumers about the safety of its Autopilot feature


Before Joshua Brown died, Gao Yaning crashed into a street-sweeper in his Model S

The death of Florida Model S owner Joshua Brown in May 2016 is widely regarded as the first known fatality that occurred while a car’s autonomous technology was engaged.

But a new report suggests that a death linked to Tesla’s hyped Autopilot feature may have occurred six months earlier. The death happened in China, where Tesla has reportedly sold over $1 billion worth of vehicles.

Gao Yaning, 23, died on his way home from a wedding reception in January 2016 while driving his father’s Model S in the province of Hebei. Surveillance footage shows the car crashing into the back of a road-sweeping truck on the highway.

In the two years since, his father Jubin Yaning has been fighting Tesla in court in an attempt to prove that the company is misleading customers about the safety and sophistication of its Autopilot feature.  

Jubin is asking for 5 million yuan, the equivalent of about $750,000, the site Jalopnik reports. Jubin told Jalopnik that if he wins the suit, he will fund a charity “to warn more Tesla owners not to use Autopilot.”

Failed to follow “operation rules”

When Gao’s father Jubin Yaning first filed his suit in 2016, Tesla claimed that the car was too damaged to determine whether Autopilot was in fact engaged.

But in a statement to Jalopnik reporter Ryan Felton, Tesla seemed to leave open the possibility, making claims about the safety of Autopilot without stating whether or not it was in use at the time of Gao’s crash.

Tesla said that Yaning failed “to drive safely in accordance with operation rules” but added that it has agreed to allow a third-party appraiser to review data from the vehicle.

“While the third-party appraisal is not yet complete, we have no reason to believe that Autopilot on this vehicle ever functioned other than as designed,” Tesla told Jalopnik.

Not considering the human element

Tesla has repeatedly claimed that its Autopilot feature improves vehicle safety, but the owner’s manual warns that the feature is in beta testing. Even when it is activated, drivers must pay attention and keep their hands on the wheel at all times.

After Joshua Brown died, Tesla told Wired magazine that “the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety.” But consumers and regulators say that, at the point of sale, Tesla may be selling drivers on the idea that the Autopilot feature is more sophisticated than it actually is.

In investigating Brown’s death, the National Transportation Safety Board determined last year that while Brown bore responsibility for not paying enough attention, Tesla had not adequately considered the “human element” of so-called self-driving technology.

“This crash is an example of what can happen when automation is introduced ‘because we can’ without adequate consideration of the human element,” the board wrote.

“In aviation, airline pilots know that even when the autopilot is controlling their airplane, the pilots still play a crucial role. Joe and Suzy Public, on the other hand, may conclude from the name ‘autopilot’ that they need not pay any attention to the driving task because the autopilot is doing everything.”

The car’s forward collision warning and automatic emergency braking systems also failed to activate shortly before the crash, the board’s investigation found.

Mounting complaints

A class action lawsuit filed against Tesla over its Autopilot technology last year claimed that the Model S and Model X cars made a number of dangerous maneuvers when Autopilot was activated, such as “lurching, slamming on the brakes for no reason, and failing to slow or stop when approaching other vehicles.”

The company responded to the lawsuit by telling Bloomberg News that “we have never claimed our vehicles already have functional ‘full self-driving capability’ ... The inaccurate and sensationalistic view of our technology put forth by this group is exactly the kind of misinformation that threatens to harm consumer safety.”

Complaints about Tesla cars making dangerous driving maneuvers by themselves have also surfaced from drivers who were not even using the car’s Autopilot technology. ConsumerAffairs reported earlier this month on five cases in which drivers described their car suddenly taking off by itself as the driver was slowly parking.
