BravenNow
Feds intensify investigation into Tesla’s Full Self-Driving (Supervised) software

#Tesla #FullSelfDriving #Investigation #FederalRegulators #AutonomousVehicles #Safety #Software #Supervised

📌 Key Takeaways

  • Federal regulators are expanding their probe into Tesla's Full Self-Driving (Supervised) software.
  • The investigation focuses on the safety and performance of the advanced driver-assistance system.
  • This escalation follows previous scrutiny and incidents involving Tesla's autonomous features.
  • The outcome could impact regulatory standards for self-driving technology across the industry.

📖 Full Retelling

The National Highway Traffic Safety Administration has upgraded its probe into Tesla's Full Self-Driving (Supervised) software to an "engineering analysis," its highest level of scrutiny, after finding more instances of the software struggling in low-visibility conditions.

🏷️ Themes

Regulatory Scrutiny, Autonomous Vehicles

📚 Related People & Topics

Tesla


Entity Intersection Graph

Connections for Tesla:

🏢 Nvidia 11 shared
👤 Elon Musk 7 shared
🌐 Electric vehicle 6 shared
🌐 Apple 4 shared
🌐 China 3 shared

Deep Analysis

Why It Matters

This investigation matters because it directly targets Tesla's core autonomous driving technology and could lead to regulatory restrictions or recalls affecting hundreds of thousands of vehicles. It affects Tesla owners who paid up to $15,000 for FSD capability, investors whose view of Tesla's valuation is tied to its claimed leadership in autonomy, and the broader autonomous vehicle industry that looks to Tesla as a market leader. The outcome could set important precedents for how regulators oversee increasingly advanced driver-assistance systems that blur the line between human and machine control.

Context & Background

  • The National Highway Traffic Safety Administration (NHTSA) opened its initial investigation into Tesla's Autopilot system in August 2021 following multiple crashes with emergency vehicles
  • Tesla's Full Self-Driving (FSD) software has been in 'beta' testing with customers since 2020 despite the name suggesting full autonomy
  • CEO Elon Musk has repeatedly promised fully autonomous Tesla vehicles for nearly a decade, with multiple missed timelines
  • The current 'FSD (Supervised)' version requires constant driver attention despite its name, creating potential confusion about capabilities
  • Previous NHTSA investigations have led to recalls of Tesla's Autopilot software in December 2023 affecting 2 million vehicles

What Happens Next

NHTSA will likely expand its data collection from Tesla and may issue preliminary findings within three to six months. Tesla could face pressure to modify FSD software features, implement additional safeguards, or potentially recall certain versions. The investigation may also gain urgency as Tesla works to get its robotaxi service off the ground in Austin, Texas, with regulators scrutinizing whether current FSD technology is sufficiently mature for such applications. Congressional hearings on autonomous vehicle regulation could reference these findings.

Frequently Asked Questions

What exactly is NHTSA investigating about FSD?

NHTSA is investigating whether Tesla's FSD (Supervised) software contains defects that pose unreasonable safety risks, particularly regarding its ability to respond to certain road situations and whether drivers properly understand its limitations. The investigation focuses on crashes, near-misses, and whether the system's capabilities match driver expectations given its 'Full Self-Driving' branding.

How could this affect current Tesla owners?

Current owners might see software updates that modify FSD behavior, potentially adding more restrictions or alerts. In a worst-case scenario, certain FSD features could be temporarily disabled via recall until fixes are implemented. Owners who paid for FSD won't receive refunds but might get improved versions once regulatory concerns are addressed.

Is Tesla's FSD actually fully self-driving?

No, despite the name, FSD (Supervised) requires constant driver supervision and isn't approved for unsupervised operation. The system is classified as a Level 2 driver-assistance system under SAE standards, meaning the human driver remains responsible for monitoring the environment and must be ready to take control immediately.

How does this compare to investigations of other automakers' systems?

NHTSA has investigated other driver-assistance systems like GM's Super Cruise and Ford's BlueCruise, but Tesla's investigation is notably broader due to its aggressive rollout approach and larger fleet size. Unlike Tesla, most competitors use more conservative naming (avoiding 'self-driving' terminology) and implement more robust driver monitoring systems.

Could this investigation stop Tesla from developing autonomous vehicles?

Unlikely, but it could force Tesla to slow its rollout, implement more safeguards, and rebrand its systems. The investigation may push Tesla toward clearer communication about system limitations and more gradual feature releases. However, Tesla will likely continue development while adapting to regulatory requirements.

Original Source
The U.S.' top automotive safety regulator is stepping up its investigation into the performance of Tesla's Full Self-Driving software in low-visibility conditions. The National Highway Traffic Safety Administration's Office of Defects Investigation said on Thursday that it has upgraded the probe it launched in October 2024 to what's known as an "engineering analysis," its highest level of scrutiny. It's a step that is often required before the agency tells a company to issue a recall.

This is one of two investigations that ODI is running on Tesla's Full Self-Driving software. The regulator is also probing more than 80 instances in which Tesla's driver-assistance software has violated basic traffic safety laws, like running red lights. The investigations come as Tesla has spent months trying to get a robotaxi service off the ground in Austin, Texas.

ODI opened this particular probe after four reported crashes in low-visibility situations, one of which involved the death of a pedestrian. The regulator has spent the last year and a half exchanging information with Tesla, and appears to have identified a handful more incidents in which the company's driving software proved insufficient in low-visibility settings.

ODI also said on Thursday that it has not gotten all the information it wants from Tesla in this process. The investigative office wrote that, while Tesla began "developing an update" to fix the low-visibility problems in June 2024 (before the probe was even opened), the company has still not told ODI whether that fix was deployed, or which vehicles received it. ODI also believes there could be under-reporting of similar crashes due to data collection and labeling limitations that Tesla reported to the safety agency.
“In the crashes that ODI has reviewed, the system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred,” the age...

Source

techcrunch.com
