Lawyers Chime In On Tesla’s Autopilot Crash Cases
Lawyers seem to agree that simply warning drivers to take over when Autopilot fails would not hold up in a court of law. The name “Autopilot” alone suggests that the car is supposed to drive itself and that the driver’s hands can be off the wheel at times. Automotive liability lawyer Lynn Shumway explained to Automotive News:
“The moment I saw Tesla calling it Autopilot, I thought it was a bad move. Just by the name, aren’t you telling people not to pay attention?”
Three recent incidents have occurred with Autopilot reportedly engaged. Unfortunately, one was fatal.
Tesla has investigated the other two and found that in one incident Autopilot was, in fact, off. Musk went so far as to say that if the system had been on, the accident would not have occurred. In the other incident, the technology was being used on a two-lane, undivided highway, and the driver’s hands remained off the wheel. Both circumstances violate Tesla’s guidelines for safe operation of the system.
The latter is the root of the problem. People will not always follow the rules, and they will continue to use such technology beyond its intended limits. It is difficult to put something mind-blowing and potentially life-altering in someone’s grasp and then tell them to limit its use, to operate it with extreme care, and to never test its limits. Auto lawyer Tab Turner said:
“There’s a concept in the legal profession called an attractive nuisance. These devices are much that way right now. They’re all trying to sell them as a wave of the future, but putting in fine print, ‘Don’t do anything but monitor it.’ It’s a dangerous concept. Warnings alone are never the answer to a design problem.”
People are lazy, and most of us think we “know everything.” For some buyers, Autopilot is a top factor in the purchase decision. They are not going to buy the car and then leave the feature unused, or use it only rarely and with extreme caution; they might as well have opted for a different car. Regardless of any warnings, news, updates, or restrictions, people will continue to be people. Think of all the product warnings that people ignore daily, rarely with any consequence.
If a case such as the Tesla Model S fatality were to go to court, Tesla could insist that drivers were warned and that, in the end, the driver is responsible. However, plaintiffs’ lawyers need only find an issue with the technology. If it can be proven that the system was defective, could have worked better, or may have caused the accident, Tesla, or any other automaker, will have no leg to stand on.
Regarding the fatal accident in Florida, Tesla reported that the sensors failed to distinguish the white trailer against the bright sky. Lawyers could argue that the system surely should have detected it. Steve Van Gaasbeck, an auto products lawyer in San Antonio, Texas, commented:
“It’s great technology. I hope they get this right and put people like us out of business. There’s really no excuse for missing an 18-wheeler.”
Alternatively, lawyers for accident victims and their families could aim to prove that the system’s name is misleading, that it does not do as much as possible to explain and remind drivers of its limits, or that it does not check on the driver’s level of engagement often enough or effectively.
The National Highway Traffic Safety Administration set a five-level scale for vehicle automation in 2013. It ranges from zero to four, with zero meaning no automation and four meaning fully self-driving, with no human interaction.
Tesla’s vehicles fall in the Level 2 to Level 3 range. To lawyers and lawmakers, it makes a big difference what level a vehicle is, or claims to be, and what is expected of the vehicle as well as of the human driver.
Currently there are no set industry standards or guidelines from the U.S. government for autonomous vehicles. The process of establishing such rules has begun, and guidance is expected to be released soon.