Tesla recalls 2M vehicles over Autopilot crash concerns — as tech linked to fatal Virginia wreck
Tesla will recall more than two million vehicles, almost the entire fleet sold in the US, to fix a flaw in its “Autopilot” assisted-driving system – a move that came as Virginia officials found a car had the software enabled during a fatal crash last July.
The recall – reportedly the largest in Tesla’s history – emerged as part of an ongoing investigation by the National Highway Traffic Safety Administration.
The probe, which began more than two years ago and included reviews of 956 crashes that allegedly involved Autopilot, determined that its existing safeguards “may not be sufficient to prevent driver misuse” of the software.
“In certain circumstances when Autosteer is engaged, and the driver does not maintain responsibility for vehicle operation and is unprepared to intervene as necessary or fails to recognize when Autosteer is canceled or not engaged, there may be an increased risk of a crash,” the NHTSA said in a release.
The electric car maker said the recall would consist of an over-the-air software update that was expected to roll out beginning on Tuesday or slightly afterward. The update will be applied to Tesla Model 3, Model S, Model X and Model Y vehicles manufactured in certain years, including some dating back to 2012.
The vehicles will receive “additional controls and alerts” prompting drivers to pay attention when using Autopilot, including by keeping both hands on the steering wheel and watching the road.
Tesla shares sank more than 1.5% in trading on Wednesday.
The announcement emerged on the same day that authorities in Virginia revealed that Autopilot was in use when Pablo Teodoro III, 57, fatally crashed his Tesla into a tractor-trailer. Officials also determined that the Tesla was speeding before the wreck.
A spokesman for the Fauquier County Sheriff’s Office said it appeared that Teodoro took action in the second before the crash, though it was unclear what he did.
The investigation also found that the car’s system “was aware of something in the roadway and was sending messages.”
The NHTSA is still investigating the crash.
The recall also followed a scathing Washington Post report which alleged that Tesla was allowing the use of Autopilot in areas the software was not designed to handle.
The outlet alleged it had found “at least eight fatal or serious wrecks involving Tesla Autopilot on roads where the driver assistance software could not reliably operate,” such as roads with hills or sharp turns.
Tesla defended the safety of its Autopilot software in a lengthy X post in response to the article, asserting that statistics show cars are safer when it is engaged versus when it is not and that the company has a “moral obligation to continue improving our already best-in-class safety systems.”
“The data is clear: The more automation technology offered to support the driver, the safer the driver and other road users,” the company said.
Tesla boss Elon Musk has repeatedly said Autopilot is safe to use – and touted the company’s efforts to develop assisted-driving and fully automated driving features as a key part of its long-term plans.
With Post wires