Reuters has brought up interesting numbers about Special Crash Investigations (SCIs). Performed by NHTSA (National Highway Traffic Safety Administration), they aim to provide “detailed investigations of emerging technologies.” According to Reuters, Tesla is the subject of 27 SCIs, 19 of which relate to crashes in which Autopilot may have been engaged. Four have already been concluded, including one in which Autopilot was on. And none of it makes any difference for customers.
The first reason is that these federal investigations do not seem to aim at penalizing manufacturers in any way. NHTSA selects “over 100 crashes per year” to study. According to NHTSA’s page about SCIs, the program is intended “to be an anecdotal data set useful for examining special crash circumstances or outcomes from an engineering perspective.”
NHTSA also states that “the benefit of the program is its ability to locate unique real-world crashes anywhere in the country and perform in-depth clinical investigations in a timely manner that can be used by the automotive safety community to improve the performance of its advanced safety systems.”
In other words, NHTSA wants to offer the data it gathers “to improve the safety performance of motor vehicles, notably passenger cars.” Regarding “emerging technology” specifically, “these anecdotal SCI cases are utilized by NHTSA and the automotive safety community to understand the real-world performance of emerging systems.” Not to fix issues, enforce safety procedures, or prevent accidents, mind you.
Despite that, we have contacted NHTSA to try to learn more about these ongoing investigations into Tesla’s Autopilot. Only finalized investigations are available on the NHTSA Crash Viewer website, and they are split between “current” cases and cases from 2004 through 2015. We asked NHTSA why, but the agency declined to answer.
Finding information about Tesla there is a challenge. In the Current SCI Crash Viewer, you have to know that Tesla cases are published under “Other Domestic Manufacturers” in the “Search By Filters” tool. In the 2004-2015 SCI Crash Viewer, Tesla is listed among the makes, although “Other Domestic Manufacturers” is still an option there.
The four completed cases are listed under the codes CR13029, DS14020, CR15025, and CR16016. None of them was fatal apart from the last one. Autopilot was suspected to have been active in case CR15025, but the driver did not even know the feature existed. It was definitely active in case CR16016, better known as the crash that killed Joshua Brown on May 7, 2016, in Williston, Florida.
NHTSA’s investigation into Brown’s crash lasted one year and eight months, or 20 months in total. The longest of the other investigations was DS14020, which took 11 months. We asked the agency why that case demanded so much time, and it also refused to comment.
The second and final reason not to expect anything from the 18 ongoing Autopilot investigations lies in the conclusions about Brown’s crash. The report shows NHTSA verified that the Tesla driver had enough time to brake or steer the car but did not. It also dismissed the idea that he was watching a DVD: the car contained “no polymer disc cases, compact discs, or any other evidence of home videos or movies.” He did, however, have a laptop and a laptop mount positioned so that he could use it from the driver’s seat.
NHTSA also verified that “Autopilot and FCW were functional at the time of the crash” and that “the ADAS system did not respond to an impending crash event.” FCW stands for forward collision warning; ADAS, for advanced driver assistance system. The report’s conclusion is this:
“(6) Regardless of the operational status of the Tesla’s ADAS technologies, the driver was still responsible for maintaining ultimate control of the vehicle. All evidence and data gathered concluded that the driver neglected to maintain complete control of the Tesla leading up to the crash.”
Despite a warning from the NTSB (National Transportation Safety Board) that “the driver’s overreliance on the Autopilot” has caused accidents, NHTSA decided to side with Tesla’s disclaimer that the driver is responsible at all times.
Another element in Tesla’s defense is that Autopilot is still in beta testing. Anyone who decides to use it should know it may fail, as it did in Brown’s case by not avoiding the collision with the semi truck.
That said, future investigations into Autopilot, regardless of what happens, will probably point in the same direction and reach the same conclusions: if you decide to trust Autopilot more than you should, you are responsible for that, not Tesla. If that was NHTSA’s response to a fatal crash involving the technology, there is no reason to expect it to be any different in the future.
We also asked NHTSA why it decided to investigate Tesla crashes in which Autopilot was not involved and what the oldest investigation it has into the company’s vehicles is. All the agency would say is that it does not comment on open investigations.
Source: Reuters