The NHTSA has opened an investigation into Tesla's automated driving mode, Full Self-Driving (FSD), to evaluate its capabilities in low-visibility conditions.
The investigation began after at least four reports of crashes, including one from November 2023 in Rimrock, Arizona, where a pedestrian was killed by a Tesla Model Y.
As reported by TechCrunch, the crashes all arose from road conditions where visibility was reduced:
“In these crashes, the reduced roadway visibility arose from conditions such as sun glare, fog or airborne dust. In one of the crashes, the Tesla vehicle fatally struck a pedestrian. One additional crash in these conditions involved a reported injury.”
The NHTSA’s preliminary evaluation is looking into several Tesla vehicles that feature FSD, including the 2016-2024 Model S and Model X, the 2017-2024 Model 3, the 2020-2024 Model Y and the 2023-2024 Cybertruck.
According to the NHTSA report, the federal agency will look into three main areas:
- The ability of FSD’s engineering controls to detect and respond appropriately to reduced roadway visibility conditions
- Whether any other similar FSD crashes have occurred in reduced roadway visibility conditions and, if so, the contributing circumstances for those crashes
- Any updates or modifications from Tesla to the FSD system that may affect the performance of FSD in reduced roadway visibility conditions. In particular, this review will assess the timing, purpose, and capabilities of any such updates, as well as Tesla’s assessment of their safety impact.
The agency’s notice comes a week after Elon Musk revealed the Tesla Cybercab, a driverless robotaxi without a steering wheel or pedals. During that reveal, Musk admitted that Tesla does not currently have approval to run automated cars on public streets, which is why the event took place on a locked-down Warner Bros. lot. Musk did not offer concrete details on how Tesla will get approval to operate without supervision.
It doesn’t help that Musk has insisted that FSD’s camera-only system is better than sensor-based setups like those on Waymo’s self-driving vehicles. Waymo uses a bevy of sensors, including lidar, cameras and radar, and, unlike Tesla, can operate on public roads without supervision.
The crashes that piqued the NHTSA’s interest include the November 2023 incident, a January 2024 crash in California where a Model 3 hit another car on the highway in the midst of a dust storm, a March 2024 Model 3 crash in Virginia on a cloudy day, and a May 2024 incident in Ohio where another Model 3 hit a stationary object in foggy conditions.
Typically, NHTSA investigations like this are completed within about eight months, so we should find out the results around June of 2025.
Other Tesla investigations
This isn’t even the first NHTSA investigation that Tesla has been involved with this year.
In April, the agency closed a three-year probe into Tesla’s Autopilot, driver assistance software that has been involved in nearly 500 crashes. The investigation found that 13 of those crashes were fatal and that Tesla wasn’t tracking data properly, in some cases counting only narrow categories of crashes, such as those in which airbags deployed. TechCrunch reported that the agency almost immediately opened a new investigation into the recall Tesla issued to fix Autopilot.
The Department of Justice is also in the middle of an investigation into claims Tesla has made about FSD, while the California DMV has alleged that Tesla overstated the capabilities of its software.
It may be some time before Tesla can effectively and legally release a vehicle that drives on public roads using its FSD system.