On Wednesday, the Center for Auto Safety and Consumer Watchdog sent a letter to the Federal Trade Commission (FTC) requesting that the agency investigate how Tesla has marketed its controversial "Autopilot" assisted-driving technology.
The consumer groups called Tesla’s use of the name Autopilot “deceptive and misleading” and argued that advertising the enhanced cruise-control system under that name could lead consumers to believe the feature makes a Tesla vehicle self-driving.
"Tesla is the only automaker to market its Level 2 vehicles as 'self-driving,' and the name of its driver assistance suite of features, Autopilot, connotes full autonomy," the letter said.
"The burden now falls on the FTC to investigate Tesla's unfair and deceptive practices so that consumers have accurate information, understand the limitations of Autopilot, and conduct themselves appropriately and safely.”
Customers have misused Autopilot
Despite owner's manual disclaimers warning drivers to keep their hands on the wheel at all times while using Autopilot, two known deaths and one injury have occurred as a result of Tesla drivers relying on Autopilot to control their vehicles.
One of the most recent crashes occurred in March. The driver told police she had Autopilot engaged and was reading her smartphone rather than watching the road. In Great Britain, a driver had his driving privileges suspended for engaging Autopilot and then leaving the driver’s seat.
The consumer groups are calling on the FTC to examine Tesla's advertising practices surrounding the feature to determine whether the company bears responsibility for its customers' misuse of Autopilot.
Deceptive marketing
The letter cited statements made by Tesla CEO Elon Musk that seemed to suggest Autopilot is safer than it actually is.
“In an interview with CBS in April 2018, after Musk was asked what the purpose of Autopilot is if drivers still have to touch the steering wheel, he responded ‘because the probability of an accident with Autopilot is just less’,” the groups wrote.
Earlier this month, a woman in Utah crashed her Model S into a fire truck at 60 miles per hour while using Autopilot. Following the collision, Musk sent a tweet that said, “What’s actually amazing about this accident is that a Model S hit a fire truck at 60mph and the driver only broke an ankle. An impact at that speed usually results in severe injury or death.”
“Thus, Musk conflates Autopilot’s safety, or lack thereof, with the Model S’s ability to withstand a crash,” the letter concluded.
Tesla has said its Autopilot feature results in 40 percent fewer crashes. The National Highway Traffic Safety Administration (NHTSA) restated this claim in a 2017 report on the first driver fatality, which occurred in May 2016. Earlier this month, the agency said regulators had not actually evaluated the effectiveness of the technology.