Tesla to Face Lawsuit In Fatal Model X Autopilot Crash

APR 12 2018 BY STEVEN LOVEDAY

It comes as no surprise that the family of the late Walter Huang is pursuing legal action against Tesla for the recent deadly Model X Autopilot accident.

Huang’s lawyer, Mike Fong, insists that if Autopilot had not been activated, the crash would not have occurred. He plans to wait to file a lawsuit until the National Transportation Safety Board’s investigation is finished. His biggest concern appears to be that Tesla’s statements attempt to blame Huang for the crash. He said:

Its sensors misread the painted lane lines on the road, and its braking system failed to detect a stationary object ahead.

Unfortunately, it appears that Tesla has tried to blame the victim here. It took him out of the lane that he was driving in, then it failed to brake, then it drove him into this fixed concrete barrier. We believe this would’ve never happened had this Autopilot never been turned on.


ABC7NEWS recently aired an exclusive interview with Mr. Huang’s wife, Sevonne. Consistent with other reports, she said that her husband had been bothered by the Autopilot system’s behavior on multiple occasions before the accident. She explained that he had noticed it steering toward the concrete barrier while the semi-autonomous system was activated.

Sevonne went so far as to say that her husband wanted her to take a ride with him so he could show her the issue. That ride never happened, however. Instead, she watched the news report already fearing the crashed car was his. Walter’s brother suggested that she check the app to see his location, which confirmed her fear, not because the vehicle was showing at the crash site, but because the app was displaying an error.

The NTSB’s lengthy investigation is currently underway. According to Teslarati, recent talks between Tesla CEO Elon Musk and NTSB Chairman Robert Sumwalt have been constructive.

Tesla’s recent statement to the family via ABC7NEWS reads:

We are very sorry for the family’s loss.

According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.

The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang’s drive that day.

We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive.

Sources: Teslarati, ABC7NEWS

52 Comments on "Tesla to Face Lawsuit In Fatal Model X Autopilot Crash"

I am sure that, legally, Tesla is on firm ground. Huang signed an agreement acknowledging that he was testing in-development hardware and software and was required to maintain vigilance at all times.

However, Tesla also promotes Autopilot as a refined system, and people being what they are, they become complacent when something works well in normal conditions. I do not agree that Tesla should concentrate on providing good lane-keeping ability while at the same time providing poor crash avoidance. Tesla needs to get its priorities straight and put safety first, before convenience features.

For this reason, I hope Tesla loses this suit, so that it will adjust its priorities.

The reason this person died in the crash is that a driver a week earlier crashed into the barrier and took out the safety barriers that Caltrans had put at this dangerous spot. Caltrans never replaced them after that accident.

The site has markings that fooled one driver into driving into the barrier, and then fooled another driver into driving into it a week later while using Autopilot.

Let’s not blame the people who are responsible for ensuring that their cars do not crash, and let’s not blame Caltrans for not replacing the barriers; let’s blame the entity that everyone agrees is neither capable of nor designed to take responsibility for the safety of the car and its occupants.

Why do people hate to place the blame where it belongs?

Agreed!!

(⌐■_■) Trollnonymous

“Why do people hate to place the blame where it belongs?”

Because they have different agendas and no common sense.

Just follow instructions and you won’t die.

Don’t agree.
The Model X performed as designed. Perfect. But the design assumptions have a serious flaw: they assume the US road system is number one and perfect, and that everyone follows the rules. Their design is bad. At the least, the car should have sensed the barricade and stopped.
I don’t switch on my Model X’s Autopilot 70% of the time; it scares me.
And Tesla cannot bully everyone. Admit it and improve. Period.

There is a video replicating the crash: once the car crosses the underpass, the solid white left line cuts across, and the original lane line becomes dotted and, worse, faded, barely visible. AP followed the left line across and blasted into the barrier.
The programming is not right. Tesla is at fault, plain and simple. AP blindly followed the left line across and caused the accident.

On top of that, it did not sense the barrier and stop.

AP should follow the right line when in the leftmost lane, and the left line when in the rightmost lane, unless the turn indicator is applied for a lane change. It’s a fairly basic system design issue.

AP1 in my Model X always orients to the right line unless there is a vehicle in the right lane. Sometimes it’s a little bit uncomfortable, so close to the vehicle on the right, especially with big rigs.

Someone reported that AP2 orients toward the left line.

@others who argue the driver was not attentive: why is Tesla selling a half-baked product for 8 grand? They sell it by saying it costs $1,000 extra if you buy it later, plus $1,000 extra if you buy the complete auto system later. It’s like coercing people into a half-baked product.

(⌐■_■) Trollnonymous

“There is a video replicating the crash…”

And what else did the video show????

It shows the driver being aware and taking action to not hit the wall and guess what? He lived!

To be clear, according to the news, the person who took out the barrier in the first place was an impaired drunk driver who hit it at night. I drive by the location almost every day, and the only odd thing about the lane markings is that there are quite a few pavement grooves (e.g. asphalt intersecting concrete) that might have confused the system.

You can’t blame the lack of a barrier. What if someone had compressed it 3 hours before this crash? Would you say the city wasn’t fast enough? What about an hour before? You get my point. Your argument is a slippery slope.

I expect temporary orange traffic barrels in place until the system is replaced.

I used to drive by this spot daily. There are 2 HOV lanes, and one becomes a left-hand off-ramp. I watched daily as a**hole drivers would blast up the left-hand HOV lane and cross over at the last second into the right-hand HOV lane so as not to fall further behind.

This happened endlessly and I witnessed countless close calls and drivers who were crossing in the dead zone narrowly missing the crush barriers. It got so bad that I would take myself out of the right HOV lane before the split to avoid being hit by one of these drivers. I knew some bad crash was inevitable and didn’t want to be part of it.

I did not see the first crash, but based on my experience I highly doubt the first driver was fooled; I suspect it was one of these a**holes.

This makes a lot of sense. Suppose that Autopilot was following a car whose driver suddenly decided he was in the wrong lane and switched from the left to the right HOV lane. AP, following the car, could have trailed it into the area in front of the barrier and then taken the markings there to be a proper lane. What I maintain is unforgivable is that the car did not detect the barrier and attempt to stop.

Well, I would hope AP doesn’t blindly follow the car in front of it. What if they are drunk and swerving… or making a last-second ‘oh sh**’ lane change that leaves the second car exposed to something… like a concrete wall?

What I bet happened is that the proximity of the two white lines confused AP, and it suddenly opted to pick the right-hand white line, which would lead it directly into the barrier. There is a very long run-in to this spot where the lines are very close and slowly diverge.

https://goo.gl/maps/1JRA77efzin

There is plenty of runway to make the change, runway that the drivers I refer to would use to cross over. But if AP suddenly said ‘uh oh, we lost our line, we need that right-hand one’… well, then you can see what would occur.

“Why do people hate to place the blame where it belongs?” Well said!

When I was growing up, no one ever talked about “blaming the victim”. Now you hear it all the time, as our culture drifts away from personal responsibility, towards everybody playing the victim. “Blaming the victim” is a very emotionally biased accusation, suggesting that the victim is never to blame for what happens to him; suggesting that it’s always someone else’s fault.

Adults are responsible for their own actions. If you don’t agree, then you don’t deserve to be treated like an adult; you deserve to be treated like a child. I don’t think it was a child who was driving this car. It was an adult, and if an accident happened as a result of his inattention, then he most definitely is responsible.

Is that “blaming the victim”? Well, if you insist on putting it in such biased terms, then — You betcha it is! I’m blaming — or rather, assigning responsibility to — the victim, because the victim is responsible!

Do semi-self-driving cars encourage inattentive driving? Think of it as evolution in action. However, this is all being used by Tesla bashers to distract…

People are funny that way. They rarely want to take responsibility for their OWN actions and, more importantly, like to go after others.
For example, how often do you see people blaming the UAW for the downfalls of GM, Ford, Chrysler, AMC, etc.? Yet the UAW does not make the choices; management does. BUT the far right continues to blame the UAW for the downfall.
How often do you see GM, Ford, etc. being investigated by the NTSB? Not until there are a NUMBER of deaths caused by the same issue. BUT Tesla is investigated on ALL deaths.

How many people are blaming Trump/Pence for being traitors? The GOP blames the FBI, the intelligence world (i.e. the NSA), Obama, Clinton, etc. It is apparently NOT Trump’s responsibility for his own actions. Yet the NSA, along with German and Dutch intelligence, caught Trump, Pence, and Trump’s kids talking to the Russians ahead of the election.
The list goes on and on of people blaming others for their own downfalls.
Sad.

Unfortunately, it will take government action for Tesla to make any significant changes to its current implementation of AP. Tesla sure as hell isn’t gonna do anything unless forced to. Very poor taste to continue blaming the deceased driver when the investigation isn’t close to being finished yet. No surprise though. We all know how Tesla attempts to control the narrative about itself as much as possible. Especially the IEVs staff. 😉

(⌐■_■) Trollnonymous

“Tesla sure as hell isn’t gonna do anything unless forced to.”

Taken from the playbook of GM on the ignition deaths, right?
Then Tesla should lie about it just like GM did, right?
Then they will take action when forced to by the investigations and lawsuits, just like GM did, right?

Exactly!
Suck on that Bro1999.

bro1999 said:

“Tesla sure as hell isn’t gonna do anything unless forced to.”

I think we should rename you Mr. Hypocrisy1999, dude! The level of hypocrisy in your posts, such as this one, is thick enough to cut with a knife!

Reality check: Tesla is the auto maker that is very proactive about safety issues, and has repeatedly issued OTA updates and recalls to deal with anything that might possibly be a safety issue, even when that issue has not caused any injuries or accidents.

GM is the auto maker that knew about a defective ignition switch which was responsible for the deaths of dozens of people, but covered that up as long as possible.

Here is just one example of how proactive Tesla is about issuing recalls on its own, as soon as it realized there was a potential problem, without waiting for someone to get hurt, without anybody calling for them to issue a recall. And certainly long before any government agency “forced” them to do so:

https://insideevs.com/tesla-recalls-90000-model-s-electric-cars-possible-seatbelt-issue/

In fact, Mr. Hypocrisy, I challenge you to name one single example of the government forcing Tesla to make any change to cars already sold and on the road, whether related to safety or not.

Agreed. If Tesla were smart, they would do all they can to settle this thing out of court. If it goes to trial they will lose. The question is, will Elon’s competitive spirit allow them to avoid a full trial?

I disagree; all the evidence I see points to the man being negligent: #1, for not fully understanding the use agreement, and #2, for repeatedly not heeding warnings from the vehicle. It’s a lapse of common sense and poor judgment. It is obvious that there are still severe limitations on Autopilot and self-driving cars in general, and all these companies are still gathering data to improve them.

Also, part of the fault for his death lies with Caltrans/the Federal Highway Administration for not restoring the crash barrier and not having clear lane markers. Frivolous cases like this will hinder the industry trying to bring these systems to market quickly, systems which will ultimately make auto travel much safer and save thousands of lives, since most accidents result from human error.

Lastly, saying that self-driving technology will ever be 100% safe is a pipe dream. Self-driving cars will always cause some traffic fatalities, but if it is 5,000 people dying because of self-driving cars vs. 40,000 people dying per year from human error, I’ll take the self-driving car any day.

Herein lies a problem. What is the cause of most accidents? If it is human error, then it comes down to the skill and attention of the driver. Rightly or wrongly, I believe I am a better driver than most, and am therefore less inclined to believe that a system 4 times better than average is necessarily better than me.

This may be so, but continuing the assumption that self-driving cars are much safer in general, wouldn’t you want everyone else to be in self-driving cars? It doesn’t matter how skilled and attentive you are; that won’t prevent some drunk a**hole from running a red light and t-boning you to death.

“I believe I am a better driver than most, and therefore am less inclined to believe that a system 4 times better than average is necessarily better than me.”

That should only be an issue if someone is trying to force you to use an autonomous driving system, rather than driving the car yourself. Currently at least, that’s not the issue. The issue at hand, the only one really worthy of discussion here, is whether or not imperfect self-driving cars, or even semi-self-driving cars such as a Tesla car controlled by Autopilot + AutoSteer, are safer than the average human driver.

AutoSteer never gets drunk, never takes mind-altering drugs, never falls asleep at the wheel, and is never texting on its cell phone when it’s supposed to be driving. That alone goes a long way toward making it far safer than the majority of human drivers who are the cause of the majority of fatal car crashes!

Most of the posts to InsideEVs on this subject (aside from the all too predictable, and ignorable, ones from serial Tesla bashers) merely reflect the very human trait of being afraid of giving up control. That is only going to disappear as…

(⌐■_■) Trollnonymous

“Huang’s lawyer, Mike Fong, insists that if Autopilot had not been activated, the crash would not have occurred.”

If the instructions had been followed, the crash would not have happened.
Aren’t the instructions to “keep your hands on the wheel and stay alert”????

Did he follow the instructions to “keep your hands on the wheel and stay alert”?

He had driven that stretch several times, had complained about it, and had even tested AP on it each time. On the fatal day, did he “keep your hands on the wheel and stay alert”??? Or did he tempt fate??

Nobody says it, but I blame the driver for that event.

Making a poor decision (trusting too much in a semi-autonomous driving feature that’s advertised as making the car 40% safer than a regular car) should not result in death.
Like the lady in AZ that decided to cross the street in the dark with her bike. Poor decision, but does that mean she deserved to get run over by the Uber AV and die?

(⌐■_■) Trollnonymous

Tesla doesn’t have an AV, so there’s no comparison, duh. The Uber was designed to not be driven by a human. The Tesla is designed to require a human driver……duh

Poor decisions are often the cause of a death.
Follow instructions and you won’t die.
That’s simple common sense.

It’s kind of like the cop telling the bad guy to drop his weapon and put his hands over his head, and the bad guy charges at the cop and gets shot and dies…………but it’s the cop’s fault.

“Like the lady in AZ that decided to cross the street in the dark with her bike. Poor decision, but does that mean she deserved to get run over by the Uber AV and die?”

WOW, talk about changing the subject!

Hey, Mr. Hypocrisy1999, I’m sure you can figure out a way to blame Tesla Autopilot for that Uber accident. You can just BLAME AUTOPILOT for everything! 🙄

Dear reader, did you get pregnant? Just BLAME AUTOPILOT!

http://blameautopilot.com/

Tesla’s autopilot has an implied responsibility to keep the driver safe. It did not keep the driver safe in this case.

If, and when, autopilot gets confused as to where it should go, the failsafe option, to protect the driver, is to automatically slow the vehicle down to a speed at which the driver would survive a collision with a brick wall. In other words: what’s the maximum speed at which a driver of this model of vehicle could survive a collision with a brick wall? That’s the speed it should slow down to. It cannot rely on the driver to intervene. This needs to happen even if it is on the freeway. In this process, along with the warning tones, the car should brake, and the hazard lights should start flashing.

If we were to believe Tesla’s claims, these types of events are rare, but the failsafe is necessary.
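To make the failsafe this commenter proposes concrete, here is a minimal sketch in Python. It is purely illustrative, not Tesla’s actual Autopilot logic: the function name, the confidence threshold, and the survivable-impact speed are all assumptions made up for this example.

```python
# Hypothetical sketch of the failsafe proposed above. This is NOT Tesla's
# actual Autopilot logic; all names, thresholds, and speeds are assumptions.

SURVIVABLE_IMPACT_MPH = 25.0   # assumed max speed for a survivable fixed-barrier impact
CONFIDENCE_THRESHOLD = 0.9     # assumed minimum lane-tracking confidence

def failsafe_step(lane_confidence: float, current_speed_mph: float) -> dict:
    """Return control actions for one decision step of the proposed failsafe."""
    if lane_confidence >= CONFIDENCE_THRESHOLD:
        # Confident about the lane: carry on at the current speed.
        return {"target_speed_mph": current_speed_mph,
                "hazards_on": False, "warning_tone": False}
    # Confused: warn the driver, flash the hazards, and brake toward a speed
    # at which hitting a fixed barrier would be survivable -- without
    # waiting for the driver to intervene.
    return {"target_speed_mph": min(current_speed_mph, SURVIVABLE_IMPACT_MPH),
            "hazards_on": True, "warning_tone": True}

# Example: lane confidence drops to 0.4 at 70 mph -> slow to 25 mph, warn, flash.
print(failsafe_step(0.4, 70.0))
```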

+1

100% agree!

(⌐■_■) Trollnonymous

“Tesla’s autopilot has an implied responsibility to keep the driver safe. It did not keep the driver safe in this case.”

The driver has the ultimate responsibility.
The driver did not respond to audible alerts.
The driver did not respond to visual cues.
The driver was speeding, for Pete’s sake. That stretch is 55 MPH.
The driver knew the fault/defect in AP on that SPECIFIC stretch and encountered it several times before and continued to tempt fate.

Follow instructions and you won’t die.

“If, and when, autopilot gets confused as to where it should go, the failsafe option, to protect the driver, is to automatically slow the vehicle down to a speed at which the driver would survive a collision with a brick wall.”

Slowing down to less than the minimum speed limit on the freeway would not be a “failsafe”… it would be a “fail dangerous”! No, the failsafe for Tesla AutoSteer is to alert the driver, with flashing lights and a chime. If that fails, it will eventually pull over and stop the car. That wouldn’t have helped in this situation, though.

It’s not so much that Autopilot was “confused”, as that it’s not designed to detect stationary obstacles. With the state of the art of the detectors in Tesla cars, there would be far too many false positives if it did. Low-resolution Doppler radar and video cameras plus optical object recognition software are wholly inadequate for safe autonomous driving.

Instead of the incremental improvements it’s making to driver assist features, Tesla needs to go back to the drawing board, and build a system that will produce a reliable SLAM, using either lidar or high-resolution radar, or both. A SLAM…

Sorry, but I think Autopilot was very confused by the two white lines running very close together. We will find out one day, but if you look at the Google Maps link I posted, you will see.

The two are very close for a very long stretch. This driver reported the car veering at this spot before, and I suspect it was trying to follow the rightmost line. When they start a few inches apart, it’s not a big deal. But if it decides to stay on the right line as the edge of the lane, that would lead directly to what happened.

Then add on top of it that it didn’t recognize a stopped object and you get this. I suspect it didn’t recognize the object because it was so out of the ordinary to be inside a lane. From the head on view it is short and quite narrow, very unusual for a highway or really any road.

But then what if a truck drops something in the road? Will it ignore that as chaff and run into it at full speed?

Of course they are getting sued. I doubt it will go to a jury, as that could expose a lot of trade secrets. I suspect a big undisclosed settlement out of court in their favor.

The way Tesla keeps slamming the deceased driver saying he was solely to blame for the crash makes it more likely the family will see this case to court no matter how much money Tesla eventually throws at them to settle.

Straight from the GM playbook, eh Bro1999?
I hope GM is getting a refund on your lame FUD.

The NTSB just kicked Tesla off the investigation, so their relations aren’t looking very “constructive” anymore.

Tesla withdrew, but I think it was a case of quitting before getting fired.

Hey Bro, have they replaced the defective battery in your Bolt yet, the one that leaves people stranded on the side of the road even though it still shows 40-plus miles of range?

What the deceased reportedly said is irrelevant in our system of jurisprudence unless it was recorded; this is known as hearsay evidence.
It will probably be found that the driver was at fault, due to the number and frequency of warnings just prior to the accident that were ignored.

This will be settled out of court… not because of whose fault it was, but to protect Tesla itself.

If this goes to court, there is a possibility of confidential information coming out into the public, information such as Autopilot data. With this data, experts can analyze how often users have had to intervene to override Autopilot. If this percentage of overrides is high, it casts doubt on the system itself. It could mean trouble.
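As a rough illustration of the analysis this commenter has in mind, here is a minimal sketch in Python. The log format, field names, and numbers are invented for illustration; real Autopilot data is proprietary and its actual structure is unknown.

```python
# Hypothetical sketch: computing a driver-override rate from trip logs.
# The log format and the numbers below are invented for illustration only.

trips = [
    {"miles_on_autopilot": 12.0, "driver_overrides": 1},
    {"miles_on_autopilot": 30.5, "driver_overrides": 0},
    {"miles_on_autopilot": 8.2,  "driver_overrides": 2},
]

total_miles = sum(t["miles_on_autopilot"] for t in trips)
total_overrides = sum(t["driver_overrides"] for t in trips)

# Overrides per 100 miles: a high value would suggest drivers frequently
# have to rescue the system, which is the doubt the commenter describes.
print(f"{100 * total_overrides / total_miles:.1f} overrides per 100 miles")
```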

It seems to me that people are arguing that one and only one of the following must be true:

1) AutoPilot sucks.
2) Driver was negligent.

Why can’t we say both are true?

The Autopilot may have been deficient in this particular incident, but the driver was clearly negligent. The warnings are there for a reason. A technology that results in 40% fewer crashes doesn’t meet my definition of sucks, but it certainly can be improved. Autopilot likely saves more lives than it takes.

The salient question becomes, does Level 1 through 4 automation weaken human resolve to intervene in semi-autonomous situations? Will the net gain in lives saved override the perception of flawed technology? Humans aren’t terribly rational creatures and I suspect this could be a problem going forward.

(⌐■_■) Trollnonymous

I totally agree that both sides had their faults.
AP had issues many times on the same stretch, as explained by the brother (in-law?) in a previous interview.
So AP sucks on that specific stretch. It just totally sucked, and there was no reason to even try that stretch with AP. Don’t bother.

The human aspect: he just ignored everything about it and everything else.

But others seem to want the finger pointed in just one direction, for their own agenda.

So what was this guy doing while his vehicle was leaving the lane? Especially since he knew the route so well.

That is the big mystery to me. It makes no sense. If, as his brother claims, he was aware of AP’s failings at that particular location, then you would expect him to be extra vigilant. Why would he ignore warnings at the known dangerous part of his route?

I would also like to know how long AP had been engaged, from before he entered the highway. Was this a normal daily occurrence? I think he was driving to work and maybe preoccupied with getting some work accomplished to get a head start on his day. Was he using a laptop computer, phone, or tablet? With the extent of the damage, we will probably never know.

(⌐■_■) Trollnonymous

I hope Tesla eventually offers the option to not have any AP hardware crap on their cars.

I understand that you are convinced that you are such a safe driver that you’d never, ever need anything like AutoSteer.

But what about other drivers on the road? Wouldn’t everyone on the road — including you — be safer if other drivers had something like AutoSteer which they could turn on when they are driving drunk, or sleepy, or decide they just “have to” text on their cell phone?

If Tesla offered you the option of buying one of their cars without AutoSteer installed, then that option would be available to other drivers, too. Are you sure that’s what you want?

There is no surprise that there is a lawsuit. There are real issues of fact and of liability that would need to be decided in court.

The brother says the owner had taken the car in for reported problems with AP, and Tesla combed their records and didn’t find any record of any complaint like that from the owner. Third parties unrelated to the case were able to duplicate the issue, but had plenty of time to respond and take control. The crash barrier had been publicly recorded on dash cams as being collapsed for weeks, creating mitigating circumstances, out of Tesla’s control, when it comes to the driver being able to survive the accident.

These are all issues for a jury to decide based upon evidence. Or for Tesla and the owner’s insurance company to negotiate in a settlement with the relatives. (The State of CA is immune from prosecution for its role with the crash barrier, but that doesn’t mean Tesla has to assume its liability financially; that is actually what the owner’s car insurance policy is for.)

Since CA is a “comparative negligence” state, the plaintiffs simply have to prove that Tesla contributed in some percent to the…

It is inevitable that the courts are going to have to sort out who is responsible, or what fraction of responsibility is assigned to whom, in cases of self-driving car accidents.

While I would never applaud someone (or his survivors) refusing to accept responsibility for his own actions and trying to shift it to someone else — which is the case here, since Tesla clearly and repeatedly warns drivers that it is their responsibility to remain aware and ready to take over from AutoSteer at any time — this issue has to be sorted out by the courts, and better that happens sooner than later. It will be good for auto makers and insurance companies to have a clear understanding of who is liable in case of a fatal accident involving a self-driving car, or, in this case, a semi-self-driving car.

GM should have had Tesla’s legal department, or Sergio Marchionne’s Fiat-Chrysler-Automobiles legal dept. Neither company would have paid out 1 cent for supposed ignition troubles on GM cars. They would have said:

“The purchaser of any GM product agrees only to use GM-authorized keys supplied to them on the key ring, and, should the driver have MORE than the ignition switch/door key and trunk/glove box key (2 keys maximum) on the key ring, this will overload the key cylinder and the purchaser must agree to hold GM HARMLESS for any of their actions of putting more than the 2 GM-supplied keys on the key ring. Also, oversized key rings generating large torques on the ignition switch cylinder are prohibited, and only the authorized GM-supplied key ring and 2 GM keys are allowed per vehicle…

“…To reiterate, no house keys, work keys, or any other keys not supplied by GM nor having anything to do with proper operation of the car shall ever be put on the same ring as the GM-SUPPLIED trunk key and ignition switch key (2 keys maximum). GM CANNOT be held responsible for any damaging action the purchaser does on…