Tesla's 'Full Self-Driving' Faces Safety Scrutiny After Fatal Crash

USA · Sat Oct 19, 2024
Tesla is in the hot seat again, this time with the National Highway Traffic Safety Administration (NHTSA) investigating its “Full Self-Driving” (FSD) system. The probe was sparked by a fatal collision involving a Tesla driver using FSD in reduced-visibility conditions such as fog or bright sun. The NHTSA wants to know whether FSD can handle such conditions, whether similar crashes have occurred, and what safety impact Tesla's software updates have had.

The investigation covers roughly 2.4 million Tesla vehicles in the U.S. that offer FSD, including the Model S, X, 3, Y, and the newly released Cybertruck. FSD is Tesla's paid premium feature, though it has been offered in free trials in the past. As of October 2024, the NHTSA had recorded 1,399 incidents involving Tesla's driver assistance systems, with 31 fatalities.

Tesla has not responded to requests for comment. CEO Elon Musk recently promised unsupervised FSD in Texas and California by next year. However, Tesla has not yet demonstrated a car that can be safely driven on public roads without a human ready to take control.
https://localnews.ai/article/teslas-full-self-driving-faces-safety-scrutiny-after-fatal-crash-cceadba2

questions

    How does Tesla's FSD system perform in low-visibility conditions such as fog, and what does the fatal crash suggest about its limits?
    Is Tesla accurately reporting the number of FSD-related incidents to regulators?
    What would it take for Tesla to demonstrate that unsupervised FSD is safe on public roads?

actions