NHTSA investigates Tesla's Full Self-Driving software for traffic violations

AI-generated image: photo illustration of a Tesla violating traffic rules at an intersection, under investigation by NHTSA officials, highlighting safety concerns with autonomous driving technology.

The U.S. National Highway Traffic Safety Administration has launched its sixth investigation into Tesla's Full Self-Driving software following reports of dangerous traffic violations. The probe examines incidents including running red lights and driving in wrong lanes, some of which led to crashes and injuries. It comes amid Tesla's push toward robotaxis and unsupervised driving.

Last week, on October 7, 2025, the National Highway Traffic Safety Administration (NHTSA) announced an investigation into 2.88 million Tesla vehicles equipped with Full Self-Driving (FSD) or other driver assistance features. This marks at least the sixth such probe, focusing on dozens of complaints about unsafe behaviors, including vehicles blowing through red lights, veering into opposing lanes, crossing double-yellow lines, and making incorrect turns. One reported incident involved a Tesla approaching an intersection with a red signal, continuing through, and crashing into other vehicles.

Sources report varying details on crashes: CNN cited three accidents resulting in five injuries, while FOX 4 News mentioned at least six crashes with four injuries. NHTSA's Office of Defects Investigation noted 18 complaints where the software failed to stop at red lights, recognize signals, or provide warnings for maneuvers like sudden lane changes into oncoming traffic. The agency will assess whether Tesla offered drivers adequate opportunities to intervene.

Existing investigations into FSD and Autopilot, including probes of fatal crashes, remain ongoing despite years of scrutiny. Bryant Walker Smith, a law and engineering professor at Stanford, described the process as 'regulatory whack-a-mole,' noting that investigations take years and align poorly with the pace of technological change. Under the U.S. self-certification regime, automakers attest to compliance with federal standards, but NHTSA has no specific rules for advanced systems like FSD, limiting its ability to vet them before they reach the market.

Tesla classifies FSD as partially autonomous, requiring active driver supervision, as stated on its website: 'when enabled, your vehicle will drive you almost anywhere with your active supervision, requiring minimal intervention.' The company launched a robotaxi pilot in Austin, Texas, earlier in 2025, initially with an employee in the passenger seat, later moved to the driver's seat under local rules. CEO Elon Musk envisions fully driverless operations and a 'Cybercab' without steering wheel or pedals. Tesla insists its technology is safer than human drivers but has not provided supporting data.

Smith warned of the risks of driver inattention: 'There’s great, great concern that humans... are going to lose attention if they are doing nothing but watching.' NHTSA could push for stricter standards, but Congress would have to change the law to grant pre-approval authority, which is unlikely soon given auto industry influence and public tolerance of road fatalities.
