WASHINGTON (Reuters) - The head of the U.S. National Transportation Safety Board on Monday criticized electric carmaker Tesla Inc’s decision to provide new self-driving software to vehicle owners without addressing safety concerns that the agency raised after a series of fatal accidents.
NTSB Chair Jennifer Homendy wrote a letter to Tesla Chief Executive Elon Musk about his decision to let drivers request access to “Full Self-Driving Beta technology” before the automaker addressed “the very design shortcomings” implicated in fatal crashes in Florida and California involving its Autopilot driver-assistance system. Beta refers to a trial version of a product.
“If you are serious about putting safety front and center in Tesla vehicle design, I invite you to complete action on the safety recommendations we issued to you four years ago,” Homendy wrote.
The Full Self-Driving technology expands on the Autopilot software that helps with vehicle steering, accelerating and braking.
Tesla did not immediately respond to a request for comment.
The NTSB previously urged Tesla to add system safeguards to limit the use of automated vehicle-control systems to designated conditions and to develop applications to “more effectively sense the driver’s level of engagement.” The agency makes safety recommendations and has no regulatory authority.
Tesla has never officially responded to those recommendations.
Also on Monday, the National Highway Traffic Safety Administration, which regulates auto safety, disclosed that Tesla has given the agency a partial response (https://static.nhtsa.gov/odi/inv/2021/INME-PE21020-1022.pdf) to an information request issued as part of NHTSA’s formal safety investigation into the automaker’s Autopilot technology.
NHTSA said in a memo written last Friday and released on Monday that Tesla’s response “has been received and is being reviewed.” Tesla has asked NHTSA to keep its entire response confidential.
The agency in August sent Tesla an 11-page letter with questions it was required to answer by last Friday. NHTSA this month asked Tesla why it had not issued a recall to address software updates made to Autopilot to improve the vehicles’ ability to detect emergency vehicles.
NHTSA’s investigation into 765,000 U.S. vehicles came after a series of crashes involving Tesla models and emergency vehicles. NHTSA has identified 12 crashes between Tesla vehicles using advanced driver-assistance systems and emergency vehicles, with most of the incidents occurring after dark.
In a separate letter this month, NHTSA asked Tesla to provide answers by Nov. 1 on its semi-autonomous driving technology called “Autosteer on City Streets” that the company made available to vehicle owners last year. Tesla refers to this technology as FSD.
“Despite Tesla’s characterization of FSD as ‘beta,’ it is capable of and is being used on public roads,” NHTSA said.
Tesla on Sunday rolled back the latest version of this software, less than a day after its release, after users complained of false collision warnings and other issues.
NHTSA also raised concerns about nondisclosure terms that Tesla has compelled vehicle owners to accept, which limit public disclosure of safety issues.
(Reporting by David Shepardson; Editing by Will Dunham, Kirsten Donovan and Bernadette Baum)