National Transportation Safety Board investigators are examining the death of 49-year-old Elaine Herzberg, who was killed by an Uber self-driving SUV while crossing a Tempe, Arizona road on March 18. Just two weeks after Herzberg's death, her daughter and husband reached a settlement with Uber, allowing the company to avoid a trial.
As investigators examine what caused the first fatal accident involving a self-driving vehicle and a pedestrian, we take a look at who could be liable for this tragic event, and whether the self-driving vehicle's sensor software failed - or succeeded.
Uber Self-Driving SUV Kills Tempe Pedestrian
Around 10pm on Sunday, March 18, an Uber self-driving Volvo XC90 SUV traveling at 40 miles per hour hit and killed Elaine Herzberg as she was walking her bicycle across North Mill Avenue in Tempe, Arizona. Accident reports show the car made no attempt to brake.
The Volvo XC90 SUV was equipped with several types of sensors. Uber's Light Detection and Ranging (LiDAR) sensors create a 360-degree map of the Volvo's surroundings. In addition to LiDAR, the car is also outfitted with stereo cameras and radar sensors that detect objects around the vehicle.
Each of these sensor types should have detected Herzberg crossing the street in front of the vehicle. The fact that she was walking a bicycle should have increased her visibility.
Did Uber Program Autonomous Volvo to Ignore Pedestrians?
Experts agree that Uber's LiDAR sensors shouldn't have had any trouble seeing Herzberg crossing the road. Perhaps the sensors on the Volvo XC90 failed.
But there is another disturbing possibility. The sensors could have been programmed to ignore the pedestrian.
Self-driving vehicle manufacturers have to ask some unsettling questions when deciding what objects these cars should avoid, especially when it comes to pedestrians.
Just as human drivers must decide whether to swerve into a lamp post at 40 mph or hit a person crossing the road in front of them, self-driving vehicles must be programmed to decide whether to avoid a pedestrian or to hit them.
In other words, car manufacturers must decide whether to protect drivers or pedestrians.
Manufacturers usually avoid commenting on this ethical puzzle. Mercedes is the exception: in October 2016, the company announced that it would program its self-driving vehicles to sacrifice pedestrians and protect occupants. “If you know you can save at least one person, at least save that one. Save the one in the car,” said Christoph von Hugo, Mercedes’ manager of driver assistance systems. “If all you know for sure is that one death can be prevented, then that’s your first priority.”
Studies report that a majority of people feel it would be ethically correct to sacrifice the driver rather than pedestrians. After all, the driver chose to buy the car. But the same respondents said they wouldn't buy a car that protected pedestrians before occupants.
Other companies testing self-driving cars won’t even mention the dilemma, but it will be a factor in lawsuits surrounding future injuries and deaths caused by autonomous cars. Engineers must make the decision at some point. Avoid the tree, or what may be a child playing in the road?
Uber Operator Vasquez Had Eyes Off the Road During Crash
This ethical problem is only one of a flood of unprecedented legal challenges the American transportation scene faces as autonomous vehicles roll in.
Self-driving vehicle pedestrian accident cases must weigh the responsibility of the victim, the operator, the vehicle manufacturer, and sensor software and hardware manufacturers.
In this case, video footage shows that the operator of the car that hit Herzberg, 44-year-old Rafaela Vasquez, was repeatedly looking down at something in her lap (likely a mobile device) when the accident occurred.
The rules for Uber self-driving vehicle test operators? No digital devices while cars are in motion, keep your eyes on the road, and keep your hands hovering over the wheel at all times. Yes, Vasquez could likely be liable in part for the death of Herzberg.
In addition, Silicon Valley-based Velodyne LiDAR, which makes LiDAR sensors for Volvo, Ford, and Mercedes-Benz, could be held partly liable - whether the sensors were programmed to ignore pedestrians or to swerve around them.
In March 2017, though the self-driving Volvos were failing to meet expectations and having more problems than competitors' vehicles, Arizona officials agreed to open Phoenix-area public roads as testing grounds. Several reports suggest Arizona Governor Doug Ducey approved Uber's testing on public roads without requiring that Uber disclose how the cars were performing. This raises the possibility that the State of Arizona is also at fault.
Uber Suspends Self-Driving Vehicle Testing Indefinitely
The Herzberg case would have been the first to tackle the numerous liability challenges that lie ahead for injury and fatality cases involving autonomous vehicles. But so far, Uber has avoided having to address much of anything. Just two weeks after the fatal accident, the daughter and husband of Elaine Herzberg reached a settlement with the company.
Terms of the settlement have not been disclosed.
Tempe police and investigators from the National Highway Traffic Safety Administration and National Transportation Safety Board are currently analyzing the accident and contributing factors.
In the wake of this tragedy, Uber has suspended North American testing indefinitely. Both Arizona and California have withdrawn permission for public road testing of Uber self-driving vehicles.
Other companies involved in autonomous vehicle design, like Nvidia Corp. and Toyota Motor Corp., have also stopped testing on public roads until investigators determine what caused the tragic death.
To learn more about pedestrian safety, download our free book, Florida Pedestrian and Cyclist Injury Lawsuits. Contact Us at 954.522.6601 or Connect Online.
John Uustal is a Florida based trial lawyer with a national practice. His landmark cases against U.S. automakers have resulted in safer vehicles for consumers across the United States. Connect with John - [hidden email].