Uber has reportedly concluded that sensors in one of its self-driving cars detected an Arizona pedestrian in the road, but that the vehicle failed to take action to avoid hitting her.
The March accident claimed the life of Elaine Herzberg and triggered a National Transportation Safety Board (NTSB) investigation into its cause. The Information, a technology publication, cites two people briefed on Uber's own investigation as saying that the sensors saw the pedestrian but didn't recognize her as an object to be avoided.
The sources say the car's object-avoidance software can be tuned so that the vehicle makes every effort to avoid hitting a solid object but does not swerve to miss, say, a paper bag floating in its path.
According to The Information, the Uber investigation concluded that the car's sensors saw the pedestrian pushing her bicycle in the roadway, but the on-board computer system “decided it didn't need to react right away.”
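Neither Uber nor the NTSB has described the software's internals, but the tuning the sources describe resembles a confidence-and-hazard threshold applied to detected objects. The sketch below is purely illustrative: the class names, hazard weights, and threshold values are invented for this article, not taken from Uber's system.

```python
# Illustrative only: classes, weights, and thresholds are hypothetical,
# meant to show the trade-off the sources describe, not Uber's actual code.

# Hypothetical hazard weights: how strongly each object class warrants a reaction.
HAZARD_WEIGHT = {
    "pedestrian": 1.0,
    "vehicle": 1.0,
    "bicycle": 0.9,
    "paper_bag": 0.1,   # drifting debris the car should not swerve for
}

def should_react(detected_class: str, confidence: float,
                 threshold: float = 0.5) -> bool:
    """Return True if a detection is worth braking or swerving for.

    Raising `threshold` makes the car ignore more detections (fewer
    needless stops); lowering it makes the car react to more of them.
    """
    weight = HAZARD_WEIGHT.get(detected_class, 0.5)  # unknown objects: middling weight
    return confidence * weight >= threshold

# A high-confidence paper bag is ignored, and a low-confidence pedestrian
# can also fall below the threshold -- the kind of failure the report suggests.
print(should_react("paper_bag", confidence=0.95))                  # False
print(should_react("pedestrian", confidence=0.30, threshold=0.5)) # False
print(should_react("pedestrian", confidence=0.90))                 # True
```

The point of the sketch is that a single tuning knob trades false alarms against missed hazards: set the threshold high enough to ignore floating debris, and a poorly classified pedestrian can slip below it too.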
“Top-to-bottom safety review”
Because investigations are still underway, neither Uber nor the NTSB will comment on the report. However, an Uber spokesman told TechCrunch that the ride-sharing company has begun “a top-to-bottom safety review” of its self-driving car program.
How an autonomous vehicle identifies objects in the roadway could play a critical role in determining when, and if, the technology is deemed safe for full deployment on U.S. highways. Right now, it's a controversial question.
Last fall, a coalition of safety and consumer groups lined up against bipartisan legislation in Congress to fast-track autonomous car testing on public roads. Joan Claybrook, a former administrator of the National Highway Traffic Safety Administration (NHTSA), said the legislation ignores recent history, including mistakes made by the auto industry.
"It puts auto and tech companies who basically wrote the bill in the driver's seat in the development and deployment of unproven autonomous vehicles," Claybrook said at the time. "It puts the federal auto safety agency in the back seat in terms of ensuring industry accountability."
In a November interview with ConsumerAffairs, automotive expert Scot Hall, CEO of Swapalease, expressed doubts that current technology is capable of making choices that drivers are sometimes required to make.
“What if a self-driving car encounters a dangerous situation where there are two alternatives, and neither of them is good?” Hall asked. “How does the computer make that choice?”
That's a question not only for technology company engineers, but also policymakers, who must decide when fully autonomous cars are safe to operate on public roadways without any human intervention.