The National Highway Traffic Safety Administration (NHTSA) is intensifying its scrutiny of Tesla’s Full Self-Driving (FSD) software, citing a significant increase in reported safety issues. A recent communication from the agency to Tesla details at least 80 instances where the system allegedly violated traffic laws, including failing to stop at red lights and making incorrect lane changes. This escalation comes as Tesla CEO Elon Musk makes controversial statements about the software’s capabilities.
The findings, revealed in a letter sent this week, represent a rise from the approximately 50 incidents NHTSA initially flagged when launching the investigation in October 2025. These reports stem from 62 complaints filed by Tesla drivers, 14 submissions directly from the automaker, and four accounts from media outlets. The investigation focuses on whether FSD accurately interprets and responds to traffic signals, signs, and lane markings.
NHTSA Investigation into Full Self-Driving Capabilities
The core of the NHTSA investigation centers on the potential for driver assistance systems to create unsafe conditions. The agency’s Office of Defects Investigation (ODI) is specifically examining whether Tesla’s FSD provides adequate warnings to drivers when it encounters situations requiring intervention. This is crucial, as the system is currently labeled “Supervised,” meaning drivers are expected to remain attentive and ready to take control.
Growing Concerns Over System Reliability
The increase in reported incidents is particularly noteworthy because earlier complaints were concentrated around a single intersection in Joppa, Maryland. Tesla previously stated it had addressed the issues at that location. However, the new letter does not specify the geographic distribution of these latest reports, and Tesla frequently redacts details in its submissions to the agency. This lack of transparency has drawn criticism from safety advocates.
Meanwhile, the timing of the NHTSA letter coincides with a statement from Elon Musk on his social media platform, X, suggesting that the newest version of FSD will allow drivers to text while the system is engaged. The practice Musk describes would be illegal in nearly all jurisdictions, which prohibit texting while driving and require drivers to maintain full attention on the road. NHTSA has not yet publicly responded to Musk’s assertion.
Data Requests and Scope of the Probe
The letter initiates a formal discovery process, outlining a comprehensive set of information requests directed at Tesla. NHTSA is seeking data on the total number of vehicles equipped with FSD, as well as the frequency with which the software is actively used by drivers. Additionally, the agency has requested all customer complaints related to FSD’s performance, including those originating from fleet operators and legal proceedings.
This isn’t the first time NHTSA has investigated Tesla’s driver assistance technology. A separate probe, initiated in October 2024, is focused on FSD’s performance in challenging visibility conditions, such as fog or bright sunlight. This broader investigation highlights the agency’s ongoing concerns about the safety and reliability of automated driving systems.
The development of autonomous driving technology, including Tesla’s Full Self-Driving system, has been a subject of intense debate. Proponents argue that these systems have the potential to significantly reduce traffic accidents caused by human error. However, critics emphasize the risks associated with imperfect technology and the potential for drivers to over-rely on automation.
The current investigation isn’t limited to the reported incidents. NHTSA is also evaluating how effectively Tesla’s system detects and responds to emergency vehicles. This element of the probe underscores the potentially life-threatening consequences of errors made by driver assistance systems.
Tesla has consistently defended its technology, stating that FSD has a strong safety record and that the reported incidents are isolated cases. The company maintains that its software is continually improving through over-the-air updates and data collection. However, the rising number of complaints and NHTSA’s formal investigation indicate that concerns remain.
Beyond NHTSA, regulatory bodies in other countries are also closely watching Tesla’s autonomous driving capabilities. The European Union, for example, is developing a comprehensive regulatory framework for automated vehicles, which could impact Tesla’s operations in the region. The outcome of these investigations and regulatory efforts will likely shape the future of self-driving technology globally.
Tesla has been given until January 19, 2026, to respond to NHTSA’s information requests. The agency will then analyze the data and determine whether further action is necessary, potentially including a recall or other corrective measures. The progress of the ODI probe, along with NHTSA’s reaction to Musk’s statements, will be key indicators of the regulatory landscape surrounding Tesla’s automated systems in the coming months. The investigation’s findings will be closely monitored by the automotive industry, safety organizations, and consumers alike.

