Federal safety regulators have launched a major investigation into Tesla’s Full Self‑Driving (FSD) system after receiving numerous reports of traffic violations, crashes, and injuries tied to its use. The National Highway Traffic Safety Administration (NHTSA) is reviewing 58 incidents in which vehicles equipped with FSD reportedly ran red lights or made lane changes into oncoming traffic, resulting in at least 14 crashes (including fires) and 23 injuries. The investigation encompasses nearly 2.9 million Tesla vehicles that have the FSD (Supervised and Beta) software installed. Regulators are especially focused on whether drivers were alerted in time, how well FSD recognizes traffic signals, and how it behaves approaching intersections or railroad crossings.
Critics warn that Tesla’s marketing and deployment of FSD may encourage misuse, effectively turning public roads into a de facto testing ground. Tesla insists that FSD is not fully autonomous and requires constant driver supervision, but many users say they experienced unsafe or unexpected vehicle behavior without warning. This new probe builds on earlier investigations into Tesla’s Autopilot, delayed crash reporting, and issues with features such as the “summon” function. Meanwhile, concerns mount over Tesla’s ambitious plan to deploy autonomous robotaxis in major cities, a plan that now faces increased regulatory scrutiny.
Tesla’s stock dipped nearly 3 percent after the probe was announced, before recovering slightly by market close. Analysts and investors are watching closely: many believe Tesla’s ability to deliver on FSD — not hype — will determine confidence going forward. With accountability and oversight now in sharper focus, regulators, industry observers, and the public will be keen to see whether the investigation leads to recalls, stricter standards, or transformative changes in how advanced driver assistance systems are tested and governed.