The National Highway Traffic Safety Administration is investigating Tesla’s Full Self-Driving software following reported collisions in low visibility conditions, raising concerns about safety and the technology’s reliability.
The National Highway Traffic Safety Administration (NHTSA) is investigating the safety of Tesla’s Full Self-Driving (FSD) software in low-visibility conditions. The probe, announced on Friday, covers 2.4 million Tesla vehicles, including the Model S and Model X (2016-2024), Model 3 (2017-2024), Model Y (2020-2024), and Cybertruck (2023-2024). It was prompted by four reported collisions in which FSD was engaged in conditions such as fog, sun glare, or airborne dust.
One of these incidents was fatal: a pedestrian was struck and killed. Another resulted in injuries. The crashes raise questions about how FSD-equipped vehicles respond when visibility is impaired. The NHTSA’s investigation is a preliminary evaluation that could lead to a recall if a substantial safety risk is identified.
Despite its name, Tesla’s Full Self-Driving does not make the vehicle fully autonomous; the software requires constant driver supervision and readiness to take control. Tesla sells FSD as an optional upgrade for $8,000, or as a $199 monthly subscription, across all its models, including the recently unveiled Cybertruck.
The agency aims to determine whether FSD can adequately detect and respond to poor-visibility conditions, and whether recent software updates from Tesla affect its performance in such scenarios. It will also assess the timing, purpose, and capabilities of those updates, along with Tesla’s own evaluation of their safety implications.
Tesla, led by CEO Elon Musk, maintains that its self-driving systems are not a safety hazard. Rebutting a 2023 article in The Washington Post, the company argued that the technology demonstrably reduces accidents, citing internal data showing fewer collisions when FSD is engaged.
Notwithstanding these claims, Tesla’s self-driving technology has repeatedly come under scrutiny, including in earlier fatal collisions. A deadly 2018 crash led to a settlement by Tesla after a dispute over the system’s role, and a 2019 crash was confirmed to have occurred just seconds after the vehicle’s Autopilot was activated.
Meanwhile, the wider automotive industry is watching Tesla’s progress in autonomous driving with keen interest. Tesla relies on a “camera-only” approach, which may struggle in poor visibility without auxiliary sensors; most other companies in the autonomous vehicle sector use more comprehensive sensor arrays, including lidar and radar, to navigate varying conditions.
In December, Tesla preemptively recalled over two million vehicles in the United States to implement new safety measures within its Autopilot system, though the effectiveness of these measures is still under assessment by the NHTSA.
Elon Musk is doubling down on self-driving advancements, shifting the company’s focus towards this technology amid competitive pressures and fluctuating demand in its traditional car business. The recent introduction of the “Cybercab” robotaxi concept, an autonomous vehicle without a steering wheel or pedals unveiled at Tesla’s latest event, highlights this strategic pivot. However, such a vehicle will require regulatory approval to operate without conventional driving controls.
Tesla has not yet commented on the investigation, and its shares declined marginally in the wake of the announcement. The outcome of the NHTSA inquiry could significantly affect the future deployment and regulation of Tesla’s FSD technology.
Source: Noah Wire Services


