Lately, we've heard of more crashes possibly involving Autopilot than we would like to report. A guy in desperate need of attention was recently arrested for riding in the back seat of his Tesla, in a position similar to the one in which William Varner's body was found after a fatal crash in Texas. NHTSA is investigating that crash and apparently wants drivers to know what their cars can really do. Sadly, it did not address the fact that there is no autonomous car on the market yet.
NHTSA asked Jason Fenske to help create the educational videos. The host of the Engineering Explained YouTube channel starts all five of them the same way: by saying the agency does not want viewers to get involved in crashes. As that is a shared hope, Fenske said he partnered with NHTSA to clarify how some safety systems work.
Fenske covered blind spot intervention, lane-keeping assistance, rear automatic braking, automatic high beams, and driver assistance technology. That last video is where we expected to hear a few words about how there is no software currently capable of driving your car, let alone driving it better than you can, as some people seem to believe.
Autopilot and FSD combine many of these features in a single system. Yet, as Tesla itself has told regulators and states in its manuals, neither of them is more than a Level 2 driving aid. Trying to prove otherwise, or simply believing they are more than that, is what has led to many of the crashes involving the system so far. Apart from the two new cases that have emerged, NHTSA was already investigating 18. The agency currently has 25 SCIs (Special Crash Investigations) ongoing.
If it is confirmed that Autopilot or FSD was active in the Texas and Fontana crashes, 20 of the 25 cases the agency is investigating will be related to the driver assistance aids Tesla offers. It would have been a good idea to mention that in these videos and to stress what these systems really are. That could be a fantastic way to prevent new crashes.
Source: NHTSA via Automotive News