Is Tesla's Full Self-Driving Software Too Risky?
October 19, 2024
Tesla's latest driver assistance system, "Full Self-Driving (Supervised)," is now under scrutiny by the National Highway Traffic Safety Administration (NHTSA). Why? Four crashes occurred in low-visibility conditions, one of which killed a pedestrian. NHTSA wants to know whether Tesla's software can handle fog, sun glare, or dust, and it suspects there may be additional crashes it doesn't know about.
This investigation comes after Tesla CEO Elon Musk unveiled the "Cybercab," a prototype robotaxi. He claims that by 2025, the Model 3 and Model Y will be able to drive without supervision in California and Texas. Sounds ambitious, but how will it happen?
Back in April, NHTSA closed a probe into Tesla's older Autopilot system, which was involved in nearly 500 crashes, 13 of them fatal. Instead of closing the book on Tesla, the agency opened another investigation into the recall fix issued for Autopilot.
Tesla's software is also facing legal heat. The Department of Justice is investigating the company's claims about its driver-assistance features, and California's DMV alleges that Tesla exaggerates the software's capabilities. Tesla even settled a lawsuit over an Autopilot crash that had been headed to trial.
The recent NHTSA probe focuses on four crashes from November 2023 to May 2024. In Rimrock, Arizona, a Model Y killed a pedestrian. In Nipton, California, a Model 3 crashed during a dust storm. In Red Mills, Virginia, a Model 3 crashed in cloudy conditions. And in Collinsville, Ohio, a Model 3 crashed in the fog, causing an injury.
NHTSA divides investigations into four levels: Defect Petition, Preliminary Evaluation, Recall Query, and Engineering Analysis. This probe is a Preliminary Evaluation, which NHTSA typically completes within eight months.
Questions
How does Tesla plan to improve the visibility detection capabilities of their Full Self-Driving software in response to the NHTSA investigation?
Is Tesla intentionally understating the capabilities of their Full Self-Driving software to avoid legal repercussions?
What are the ethical implications of releasing a self-driving system that cannot reliably detect and respond to low-visibility conditions?