By Michael Laris

The Washington Post

The self-driving Uber that struck and killed a pedestrian in March initially misidentified the woman as a vehicle and was deliberately put on the road without its emergency braking system turned on, federal investigators said Thursday.

“According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior,” the National Transportation Safety Board said in a preliminary report.

The human back-up driver did not begin braking until after 49-year-old Elaine Herzberg was hit crossing a dark Tempe, Arizona, thoroughfare, the NTSB said.

Federal safety investigators have not given a cause for the crash. But they described a series of initial findings that raise far-reaching questions about Uber’s decision making, engineering and approach to safety as it worked to develop a potentially lucrative driverless system on public roads.

Uber’s 2017 Volvo XC90, the SUV that killed Herzberg, comes factory-equipped with an automatic emergency braking function called City Safety, according to the NTSB. Uber disables that and several other safety features when the car is being controlled by the company’s self-driving system, but keeps them on when the car is being driven by a person.

Uber has indicated that the Volvo crash mitigation systems are designed to assist drivers, not to be part of a self-driving system.

In response to questions, an Uber spokeswoman provided a statement that did not address specific findings. It said the company has worked closely with the NTSB and is reviewing the safety of its program. “We’ve also brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture, and we look forward to sharing more on the changes we’ll make in the coming weeks,” the statement said.

According to the NTSB, the Uber SUV’s sensors first detected Herzberg “about six seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path.”

Then, at 1.3 seconds before Herzberg was hit, “the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision,” the NTSB said. But because emergency braking maneuvers were “not enabled” by Uber, it was up to the human safety driver to take over. The NTSB, without comment, outlined the inherent disconnect in Uber’s procedures.

“The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator,” according to the preliminary report.

One reason Uber would have disabled automatic emergency braking in its self-driving cars is to avoid what can be a herky-jerky ride when the cars’ cameras or sensors repeatedly flag potential hazards ahead and trigger braking, experts said. The NTSB said Uber was seeking to reduce “erratic vehicle behavior.”

Uber may have been seeking “to reduce the number of ‘false positives,’ where the computer potentially misclassifies a situation and the automatic emergency braking engages unnecessarily,” said Constantine Samaras, a robotics expert and assistant engineering professor at Pittsburgh’s Carnegie Mellon University. “False positives like that could also be dangerous, especially at higher speeds.”

“The car saw the pedestrian six seconds before impact but misclassified them until 1.3 seconds before impact. Even at that point, the computer determined that emergency braking was needed, but the function was disabled and there is no mechanism to alert the driver,” he said.

“We know that humans are a terrible back-up system. We’re easily distracted, and we have slower reaction times,” Samaras added. “Alerting the driver to these types of situations before the crash seems like a no-brainer.”

The safety driver, Rafaela Vasquez, is seen in a Tempe police video looking down several times just before the crash. She said she was looking at elements of Uber’s self-driving system, not her cellphones, which she said she did not use until she called 911. The NTSB is continuing to investigate that and other elements of the crash, and noted that “the operator is responsible for monitoring diagnostic messages that appear on an interface in the center stack of the vehicle dash and tagging events of interest for subsequent review.”

Vasquez was not tested for alcohol or drugs, but police said she showed no signs of impairment.

The victim, Herzberg, tested positive for marijuana and methamphetamine, according to the NTSB. She was dressed in dark clothes, walking outside the crosswalk. The bicycle she was pushing had reflectors, the NTSB said, but they were facing away from the oncoming Uber.

Data retrieved after the crash “showed that all aspects of the self-driving system were operating normally at the time of the crash,” investigators said, and there were no error messages.

Investigators said Uber’s system is not designed to warn the safety driver that he or she should stop the car in this situation. That decision echoes shortcomings seen in other fields over recent decades as people have increasingly relied on automation, according to Duke University robotics expert Missy Cummings.

“This lesson has been written in blood over and over and over again,” said Cummings, director of the university’s Humans and Autonomy Lab. She cited the Three Mile Island nuclear accident in 1979, as well as several airplane crashes, saying that in both cases engineers decided not to give human operators critical information they needed to try to prevent tragedies.
