US federal government opens probe into Tesla Autopilot crashes
Self-crashing technology
August 17, 2021
The US government has launched a formal probe into multiple crashes involving Tesla's Autopilot 'autonomous' tech.
The National Highway Traffic Safety Administration, part of the US Department of Transportation, will investigate 11 crashes of Tesla cars where first responder vehicles were present. Those crashes alone led to 17 injuries and one death, and all involved vehicles that had either the Autopilot or Traffic Aware Cruise Control modes switched on.
Deathrace 2021
“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first-responder vehicle lights, flares, an illuminated arrow board, and road cones,” the safety agency said.
The agency will look at all Model Y, X, S, and 3 cars sold in the US from 2014 to 2021 – around 765,000 vehicles in total.
The NHTSA has separately launched investigations into more than two dozen crashes involving Tesla cars and Autopilot. Eight of those crashes led to a total of 10 deaths.
The National Transportation Safety Board last year specifically called out Tesla's “ineffective monitoring of driver engagement” and said that it directly contributed to the 2018 crash that killed Model X driver Wei Huang.
Autopilot is not a fully autonomous driving system, yet Tesla and its combative CEO Elon Musk have repeatedly refused to change the name of the driver-assist feature or of the company's broader ‘Full Self-Driving’ package – neither of which is autopilot-like or fully self-driving.
Following a fatality back in 2016, Consumer Reports called on the company to change the name, arguing that the Autopilot branding led drivers to believe the cars offered a complete self-driving system.
Tesla continued to brand the systems as Autopilot and Full Self-Driving around the world – everywhere other than Germany. In 2020, Munich’s Regional Court banned the use of both terms, saying that Tesla had improperly claimed its cars had full self-driving features.
Unlike self-driving companies like Waymo, Nuro, and Cruise, Tesla has also eschewed LIDAR – the (expensive) pulsed laser technology that allows for detailed 3D scanning of the car’s surroundings.
The company said it could achieve the same results with cameras and radar sensors. Then, this year, it dropped radar from Autopilot entirely and claimed that eight 'Tesla Vision' cameras were enough.
The initial version of Autopilot ran without requiring any human interaction. After several deaths, it was updated to occasionally require drivers to show they were alert by touching the wheel – but only a slight nudge is needed, and YouTube videos show people tricking the system.
In May, a man was arrested for driving east on Interstate 80 across the Bay Bridge in San Francisco – from the back seat of a Tesla. The offender, Param Sharma, said that he would do it again and added that Musk “really knows what he’s doing and I think people are just tripping and they’re scared.”
Social media is rife with videos of Tesla owners sleeping, reading, playing games, or watching movies at the wheel. There are also videos of cars getting confused and driving into oncoming traffic, or believing that the Moon is a traffic light.
“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” the agency said on Monday.
“Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles.”