By Sydney Maki and Alexandria Sage
TEMPE, Ariz./SAN FRANCISCO (Reuters) – Police in Arizona on Wednesday released a short video of a fatal collision between an Uber self-driving vehicle and a pedestrian, as investigators probe the accident that has put new focus on the safety of autonomous vehicles.
The video, taken from inside the Volvo XC90 sport utility vehicle that Uber has used for testing, shows the vehicle driving along a dark road when an image of a woman walking a bicycle across the road suddenly appears in the headlights.
The woman, Elaine Herzberg, 49, later died from her injuries.
Police have released few details about the accident that occurred on Sunday night in Tempe, Arizona, a suburb of Phoenix, while the SUV was driving in autonomous mode. Uber suspended its self-driving testing in North America after the incident and federal safety regulators are conducting their own probe.
Fallout from the accident could stall the development and testing of self-driving vehicles, which are designed to perform far better than human drivers and to sharply reduce the number of motor vehicle fatalities that occur each year.
The video shows the vehicle traveling in the right-hand lane of a divided four-lane roadway. The vehicle’s headlights illuminate a woman directly in front of it who is crossing the SUV’s lane with her bike. The woman appears to be jaywalking as she is not in a crosswalk.
A photo released by safety regulators on Tuesday showed that the impact occurred on the right side of the vehicle.
The footage also shows a view of the vehicle’s interior and the driver at the wheel. The driver appears to be looking down, and not at the road, for two periods of about five seconds each. Just before the video stops, the driver looks upward toward the road and suddenly looks shocked.
“The video is disturbing and heartbreaking to watch, and our thoughts continue to be with Elaine’s loved ones,” Uber said in a statement. “Our cars remain grounded, and we’re assisting local, state and federal authorities in any way we can.”
The video is likely to be a key part of investigations of Uber’s self-driving car technology and whether it was ready for testing on public roads.
Although the specifics of Uber's technology are not publicly known, self-driving cars typically use a combination of sensors, including radar and laser-based Lidar, to identify objects around the vehicle, including potential obstacles coming into range. While cameras do not perform well in the dark, radar and Lidar can work at night.
One question on regulators’ minds will be why the sensors did not pick up on the presence of Herzberg, who would presumably have already crossed three lanes of traffic before arriving in the path of the Uber vehicle.
One self-driving car expert, Bryant Walker Smith, said his first impression was of “outrage” viewing the video.
“Although this video isn’t the full picture, it strongly suggests a failure by Uber’s automated driving system and a lack of due care by Uber’s driver (and by the victim),” said Smith, a professor of law at the University of South Carolina.
Another autonomous driving expert agreed with Smith’s assessment.
“The sensors should have detected the pedestrian in this case; the cameras were likely useless but both the radars and the Lidar must have picked up the pedestrian,” said Raj Rajkumar, a professor at Carnegie Mellon.
“Though no information is available, one would have to conclude based on this video alone, that there are problems in the Uber vehicle software that need to be rectified,” he said.
Uber did not immediately respond to a request for comment on its systems.
The video is likely to renew calls for more oversight in a nascent industry that lacks standardized testing or safety definitions. Lawmakers have had to juggle the need to encourage innovations that promise to dramatically improve safety on roads with current public safety concerns.
Companies including Uber and Alphabet’s Waymo have been testing self-driving vehicles on public roads.
On Tuesday, Arizona transportation officials said they saw no immediate need to tighten rules on the testing of self-driving cars in the state.
Although some within the self-driving industry have suggested agreeing on common testing and safety standards for autonomous technology, there has been no concerted effort to do so.
Timothy Carone, an associate teaching professor at Notre Dame University’s Mendoza College of Business whose research specialties include artificial intelligence and autonomous systems, said the question is whether Uber did enough testing before sending robot cars out onto streets alongside humans.
“Did they jump the gun?” he said. “If their testing is found to be inefficient, that cannot be allowed to happen again because these systems have to be ready for road tests.”
(Additional reporting by Paul Lienert and Nick Carey in Detroit; Editing by Rosalba O’Brien, Peter Cooney and Cynthia Osterman)