Tesla's Autopilot, FSD systems under scrutiny in US: Here's why
Federal investigations in the US have raised serious concerns about the safety of Tesla's Autopilot and Full Self-Driving (FSD) systems, following their involvement in numerous accidents and fatalities. A significant incident in March 2023 involved a student being hit by a Tesla Model Y operating in Autopilot mode. The student was severely injured, highlighting potential deficiencies in Tesla's technology and issues with driver distraction.
Hundreds of similar cases under investigation
The North Carolina incident, in which a 17-year-old student was struck by a Tesla Model Y operating in Autopilot mode, has sparked an investigation into hundreds of similar cases. The National Highway Traffic Safety Administration (NHTSA) found that drivers using Autopilot or FSD were not adequately focused on driving, and that Tesla's technology failed to ensure drivers kept their attention on the road, contributing to the accidents.
NHTSA investigated 956 crashes involving Tesla cars
Between January 2018 and August 2023, the NHTSA investigated 956 crashes involving Tesla's Autopilot or FSD systems, which resulted in 29 fatalities. Additionally, there were 211 incidents in which a Tesla struck an obstacle or another vehicle head-on, causing 14 deaths and 49 injuries. These investigations were initiated after several instances of Teslas colliding with stationary emergency vehicles parked at the roadside.
NHTSA criticizes Tesla's Level 2 automation
The NHTSA expressed concern over Tesla's Level 2 (L2) automation features, which disengage when drivers attempt to adjust the steering themselves rather than allowing them to share control. The agency said this discourages drivers from staying involved in the driving task. The name "Autopilot" was also criticized for misleadingly suggesting that drivers do not need to remain in control, encouraging over-reliance on the technology and loss of focus among users.
Tesla initiated a voluntary recall last year
In response to the safety concerns raised, Tesla initiated a voluntary recall last year and released an over-the-air software update to enhance Autopilot's warnings. However, safety experts have questioned the adequacy of this update, leading the NHTSA to launch a new investigation. Despite these findings, CEO Elon Musk remains confident about Tesla's future as an artificial intelligence company on the verge of debuting a fully autonomous vehicle for personal use.
'Tesla vehicles safer than those driven by humans'
Despite the ongoing investigations, Musk asserts that his vehicles are safer than those driven by humans. He plans to introduce a robotaxi later this year, stating, "If you've got, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, let's say, half the accident rate of a human-driven car, I think that's difficult to ignore."