Tesla Presents Two Theories On Why Fatal Autopilot Crash Occurred

AUG 3 2016 BY ERIC LOVEDAY

Following the fatal Tesla Model S crash, various U.S. government agencies called upon Tesla to submit information and to testify before the Senate Commerce Committee.

Some early information is now leaking out from Tesla’s testimony. According to the automaker, there are two likely scenarios that could’ve led to the fatal crash.

Tesla isn’t fully aware of what caused the wreck, but the automaker does believe some sort of “system failure” caused the crash. As Automotive News explains:

“Tesla is considering whether the radar and camera input for the vehicle’s automatic emergency braking system failed to detect the truck trailer or the automatic braking system’s radar may have detected the trailer but discounted this input as part of a design to “tune out” structures such as bridges to avoid triggering false braking…”

That’s according to information obtained from a source on the congressional panel. Tesla wouldn’t comment on the discussions other than to say it was clear that the Model S’ cameras and radar did not directly cause the crash.
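Neither Tesla nor Automotive News has published the actual detection logic, but the "tune out" design described in the quote above can be illustrated with a rough sketch: a radar return whose estimated lower edge sits well above the car is classified as an overhead structure (a bridge or sign gantry) and excluded from braking decisions. All names and threshold values below are hypothetical, not Tesla's implementation.

```python
from dataclasses import dataclass

# Illustrative sketch only: the threshold, field names, and classification
# rule are hypothetical, not Tesla's actual radar-processing code.

@dataclass
class RadarReturn:
    distance_m: float           # range to the detected object
    lower_edge_height_m: float  # estimated height of the object's lower edge above the road
    closing_speed_mps: float    # rate at which the gap is shrinking

OVERHEAD_CLEARANCE_M = 3.5      # assumed clearance of a typical bridge or sign gantry

def should_feed_braking(ret: RadarReturn) -> bool:
    """Decide whether this return should be passed to the emergency-braking logic."""
    # Returns that appear to hang well above the roadway are treated as
    # overhead structures and discounted to avoid false braking -- the
    # trade-off described in the Automotive News quote.
    if ret.lower_edge_height_m >= OVERHEAD_CLEARANCE_M:
        return False
    return ret.closing_speed_mps > 0 and ret.distance_m < 150.0

# A crossing trailer is a large, high target: if its underside is estimated
# above the clearance threshold, this heuristic would wrongly discard it.
print(should_feed_braking(RadarReturn(60.0, 1.2, 30.0)))  # True  -> brake
print(should_feed_braking(RadarReturn(60.0, 3.8, 30.0)))  # False -> "tuned out"
```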

Previously reported findings include the National Transportation Safety Board report showing that the Model S was traveling 74 mph in a 65-mph zone when it struck the semi-truck, and that Autopilot was engaged at the time of the crash.

Source: Automotive News

Categories: Crashed EVs, Tesla

38 Comments on "Tesla Presents Two Theories On Why Fatal Autopilot Crash Occurred"

I want self-driving cars for when I’m incapable of driving, but the transition will be ugly. I’ll be happy to continue using “regular” cruise control until then.

My money is on the bridge theory.

Well, in my opinion Autopilot is safer than a human driver, basically because you have 3 high-tech, long-range ladars that check everything 10 times per second, plus a human driver who can take control at any moment. What can be better than that?

It should know where bridges are – they don’t just magically appear on a two lane road – another reason to have accurate road information.

Especially when other Teslas have driven down that road with Autopilot and the Autopilot never encountered a bridge/overpass at that intersection before, since Autopilot is supposed to learn and improve based on data collected from previously driving down that road.

True, though it should consider whether the data is outdated, because things change. And in any case, it should be able to handle all situations, so the presence or absence of a bridge should have a fixed weight, but not be the only factor in the whole decision-making.

I think in any case, even if the accident is not their doing, Tesla should integrate a secondary way to detect obstacles higher than the bumper, like another radar or something. At least two accidents reported in the media could have been prevented by that.

sven said:

“Especially when other Teslas have driven down that road with Autopilot and the Autopilot never encountered a bridge/overpass at that intersection before, since Autopilot is supposed to learn and improve based on data collected from previously driving down that road.”

This is a mind-boggling overestimation of what Autopilot/Autosteer is engineered to do, or is capable of.

As a reminder, when driving on a limited access freeway, Autopilot/Autosteer still occasionally sends the car onto an exit ramp, confusing that for the main highway.

Perhaps someday, autonomous cars will have sufficient “situational awareness” that they’ll be able to make steering and braking decisions based on stored data about where the car is and what’s around it. But speaking as a programmer, I think that day is quite a few years away.

Actually, “bridges” do magically appear over roads, but we usually call them advertising banners. I could even imagine a temporary pedestrian walkway being quickly installed over a road for a major event, such as the Olympics. Or an art installation being put up over the road overnight. Or even a new bridge being installed (it will not be there one day, then be there the next time cars are allowed through, even if it’s just a framework). The bottom line is that cars can’t completely rely on data from the passage of previous cars.

From the Automotive News article
“The source said Tesla also told committee staffers it views braking failure as separate and distinct from its ‘Autopilot’ function, which manages steering, changing lanes, and adjusting travel speed.”
http://www.autonews.com/article/20160729/OEM06/160729814/tesla-mulling-2-theories-to-explain-autopilot-crash-report-says

Elon Musk’s tweet after Automatic Emergency Braking stops a Tesla, preventing it from hitting a pedestrian:
“Autopilot prevents serious injury or death of a pedestrian in NY (owner anecdote confirmed by vehicle logs).”
http://insideevs.com/vehicle-logs-confirm-tesla-autopilot-prevented-possible-pedestrian-death/

Talk about spin! If Automatic Emergency Braking fails and there is a death, then AEB is NOT part of Autopilot. But if Automatic Emergency Braking works and prevents a death, then AEB is part of Autopilot.

Tesla can’t have it both ways. Either Automatic Emergency Braking is part of Autopilot or it’s not part of Autopilot, regardless of whether it failed or worked.

Braking failure, as in the brakes were not working, or braking failure, as in the Autopilot system failed to apply the brakes?

They are saying “system failure,” which would mean something failed in the Autopilot system, or between the Autopilot system and the braking system, such that it could not apply the brakes if it wanted to.

I don’t get the impression they are saying the Autopilot system does not control the brakes, when it is obvious to everyone that it does.

What you’re implying is not what Tesla is implying.

Yes, it does look like Tesla is talking out of both sides of its mouth about whether or not the automatic emergency braking system is part of the Autopilot suite.

Whether or not it is, is obviously just a semantic difference; just a difference in labeling rather than in functionality. But Tesla should pick one or the other, and be consistent about it.

my theory: tesla knows that they are on the hook for major liability, so they want to spread out the liability risk. by introducing a legal theory that suggests the brakes are the source of the problem, the brake manufacturer gets pulled into any lawsuit. consequently, tesla would not be the only party getting sued, which increases the likelihood that at least part of the liability would be levied on the brake supplier.

so, of course, for legal defense purposes, tesla is going to want to separate the brake system from the rest of the autopilot system.

AEB and AutoPilot are two different systems on the Tesla. They both have the ability to engage the brakes independently of each other to do their jobs. AutoPilot comprises TACC and AutoSteer, while AEB is its own thing.
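For readers keeping the labels straight, the split this commenter describes can be sketched in a few lines of code. This is only an illustration of the commenter’s description, not Tesla’s actual software architecture; the class names are invented.

```python
# Sketch of the decomposition the commenter describes; class names are
# invented for illustration and do not reflect Tesla's real architecture.

class BrakeActuator:
    def apply(self, decel_mps2: float) -> None:
        print(f"braking at {decel_mps2:.1f} m/s^2")

class TrafficAwareCruiseControl:      # "TACC", bundled under Autopilot
    def __init__(self, brakes: BrakeActuator) -> None:
        self.brakes = brakes          # can slow the car to keep following distance

class AutoSteer:                      # also bundled under Autopilot
    pass

class AutomaticEmergencyBraking:      # separate feature, active with or without Autopilot
    def __init__(self, brakes: BrakeActuator) -> None:
        self.brakes = brakes          # can brake independently of TACC/AutoSteer

class Autopilot:
    """The commenter's definition: TACC + AutoSteer, with AEB kept outside it."""
    def __init__(self, brakes: BrakeActuator) -> None:
        self.tacc = TrafficAwareCruiseControl(brakes)
        self.autosteer = AutoSteer()

brakes = BrakeActuator()
autopilot = Autopilot(brakes)            # can brake via TACC
aeb = AutomaticEmergencyBraking(brakes)  # can brake on its own
```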

I go with the theory of the object being disregarded due to its height. It’s similar to the failure to detect a stationary platform loaded with pipes, where the Summon feature, when activated, slid the car forward into them. At the time of the Summon accident, I suggested they needed a more discerning addition, maybe another sensor for high obstructions. That is the basic evidentiary support for my belief.

I don’t think the two incidents result from the same cause, at all. The incident of a Tesla car parked closely behind a flatbed truck with a protruding load, which under Summon drove the car forward until it collided with the load, appears to be a case of the truck’s load being placed too high to be included in the sensors’ cone of detection in front of the car.

Contrariwise, if the Model S Autopilot confused the trailer of an 18-wheeler tractor-trailer rig for a stationary sign, that’s not because the trailer was at the wrong angle for the sensors to “see” it!

The first was an apparent failure of hardware; the second an apparent failure of software. These are almost entirely unrelated things.

The Germans are going to install “black boxes” in ACC-equipped cars. That will stop automobile manufacturers from perjuring themselves in court when they present “cooked” crash data.

Letting Tesla or any other auto-drive manufacturer furnish their own data is like asking the fox to guard the hen house.

Only a FUDster would suggest Tesla, or any other auto maker, is faking data or committing perjury, without actually having any evidence this is so.

This sort of FUD is despicable, not to mention libelous, and ought to result in the commenter being banned from posting at InsideEVs.

To Pushmi,

No one said Tesla committed perjury.

GM didn’t lie about the ignition switch and the tobacco companies didn’t lie about nicotine either.

I think the proposed German system, where a black box recorder is independently installed in every auto-pilot vehicle, makes a great deal of sense.

That would go a long way in preventing car companies from cooking results. Trust me this will happen. No one is going to take GM or Toyota or Volkswagen at their word.

The VW diesel scandal alone should convince you of that, especially now that the scandal has apparently expanded to other car makers from French investigations.

jmac said:

“No one said Tesla committed perjury.”

Dude, you posted “That will stop automobile manufacturers from perjuring themselves…”

And in a post below, you said:

“Why would Tesla voluntarily turn over information to the legal system that would incriminate the company.

“Everyone is assuming that the information supplied by Tesla is correct.

“The Tesla worship is deep and wide.”

Don’t pee on our legs and then tell us it’s raining. If you’re gonna accuse a company of faking evidence and of perjury, then at least have the moral courage to admit that’s what you’re doing. Don’t make repeated, strongly stated, obvious insinuations and then pretend innocence when you’re called on it.

I agree with jmac. When an airliner crashes, it is the NTSB, not the airlines, that extracts the data out of the plane’s black boxes. When liability is at stake with large dollar damage rewards for victims, it would be foolhardy to trust an airline, airplane manufacturer, or automobile manufacturer to extract and decode the data from the black box of a crashed vehicle.

Pu-Pu, you must have accused well over a dozen posters on InsideEVs of being FUDsters, paid shills, short sellers, Tesla haters, etc. Today, in the Tesla 2nd Qtr Earnings article, you even accused long-time poster pjwood1 of being a short seller. What the eff is wrong with you? Do you have mental/psychological problems? If you don’t, then you’re a little, little man. You must have gotten beaten up a lot in grade school if you had the same personality back then.

The commenter who should be banned from posting at InsideEVs is you, Pu-Pu, because you’re a jerk and a troll, not to mention InsideEVs’ resident cyberbully. 🙁

+1

I couldn’t have said it better.

Nobody should be banned from posting here. I don’t want InsideEVs to be an echo chamber of Musk worshippers! It’s the different opinions and points of view that make the comment section worth reading. Calling for people with alternative views to be banned is childish and shows that you are insecure about your own arguments and ability to speak for yourself.

The bottom line is the truck driver caused the crash, coupled with maybe an inattentive Model S driver. The AP/TACC and AEB systems all did not work in this situation, but it was an edge case, and those systems are all assistance systems with no guarantee of 100% correct operation. No doubt Tesla and the industry can learn from this though.

You are 100% correct. Your point is somehow being lost in all this talk of AutoPilot. The truck driver made a dangerous turn, lied about what he witnessed, and made impossible claims to justify his manslaughter. The Tesla driver was either asleep or playing with his phone (or somehow dead from a heart attack prior to the crash), because anyone actually paying attention would have seen a giant, slow-moving truck in front of him that could have been avoided.

But ultimately it’s the truck driver’s fault, since the Tesla had the right of way.

Anyone who uses AutoPilot as anything other than advanced cruise control is just asking for an accident. And that’s exactly what Tesla is saying.

If Autopilot is too good at normal driving, people will trust the system and do something different (read text messages more frequently, surf the internet, …). This is just how humans behave. You cannot build a 99.7% perfect system and tell the human to never trust this thing, to stay aware, and to check everything the system does. That gets boring pretty soon. So Autopilot needs to check whether the person is behaving as required by the manual, and needs to alert the user/driver if he is not behaving as needed!

You only need an eye-tracking camera and a check for more than 10-20 seconds of looking off the road. It’s not that hard…
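Tesla’s driver monitoring at the time relied on steering-wheel torque rather than a camera, so the following is only a sketch of what this commenter is asking for: a gaze-off-road timer that escalates to an alert. The threshold and names are hypothetical.

```python
# Sketch of the commenter's eye-tracking suggestion; not Tesla's actual
# driver monitoring. Threshold and names are hypothetical.

MAX_EYES_OFF_ROAD_S = 10.0   # commenter suggests alerting after 10-20 s

class AttentionMonitor:
    def __init__(self) -> None:
        self._eyes_off_since = None  # timestamp when gaze first left the road

    def update(self, eyes_on_road: bool, now_s: float) -> bool:
        """Feed one gaze estimate per camera frame; return True when an alert is due."""
        if eyes_on_road:
            self._eyes_off_since = None
            return False
        if self._eyes_off_since is None:
            self._eyes_off_since = now_s
        return (now_s - self._eyes_off_since) >= MAX_EYES_OFF_ROAD_S

monitor = AttentionMonitor()
print(monitor.update(eyes_on_road=False, now_s=0.0))   # False, timer just started
print(monitor.update(eyes_on_road=False, now_s=12.0))  # True, alert the driver
```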

+1000. Unfortunately, in our society it’s always somebody else’s fault; self-responsibility is irrelevant.

What?

It sounds like you’re trying to make some kind of political point or tie in. Just come out and say what you mean.

You don’t think the truck driver is at fault for turning in front of the Tesla?

Cosmacelf said:

“The bottom line is the truck driver caused the crash, coupled with maybe an inattentive Model S driver.”

Thank you!

It just boggles my mind that so many people keep talking about Autopilot being responsible for this accident. No, the truck driver is legally, ethically, and morally responsible for the accident, although there were contributing causes from the Model S driver speeding and, apparently, driving while distracted.

I’m glad the NHTSA is investigating the accident, but there’s no question of Autopilot ever being found responsible for the accident, on a logical basis, a common-sense basis, or a legal basis.

I still say timing of this accident was freaky. Had the Model S been 2 – 3 minutes ahead in traffic, it probably would have detected the front of the truck and stopped or avoided it.

Maybe if all moving vehicles (including motorcycles, buses, trucks and trains) had 50 yard radius beacons, collisions of this type could be greatly reduced.

NPNS! SBF!
Volt#671

Nelson said:

“I still say timing of this accident was freaky. Had the Model S been 2 – 3 minutes ahead in traffic, it probably would have detected the front of the truck and stopped or avoided it.”

Uh, no. Had the Model S been even 10 seconds ahead, it would already have passed the intersection where the tractor-trailer rig pulled out while making an illegal left turn, in which case whether or not Autopilot “saw” the trailer would have been irrelevant.

But you can say something similar about every accident involving two moving vehicles. They can collide only when their two locations in space converge at the exact moment in time they’re both there. Change any of those four variables, and the crash doesn’t happen.

Nothing “freaky” about that; it’s just happenstance.

Why would Tesla voluntarily turn over information to the legal system that would incriminate the company.

Everyone is assuming that the information supplied by Tesla is correct.

The Tesla worship is deep and wide.

Something is pretty deep here, all right. That something being the anti-Tesla FUD B.S. you’re shoveling out.

of course tesla isn’t going to hand anything over voluntarily. that’s because tesla has no obligation to represent the interests of any plaintiff – that is the responsibility of the attorney for the plaintiff. the plaintiff’s attorney would have to get a judge to order tesla to disclose its records.

Dude, self-driving cars don’t exist yet. Back to Fox News.

The point is, Pushmi, that no company wants to be forced to testify against itself; in fact, the 5th Amendment precludes that.

Relying on records kept by GM, Toyota or Volkswagen is like asking the Mafia to volunteer how many cases of bootleg liquor they smuggled in from Canada during Prohibition.

Companies that install auto-pilot will lie to protect themselves from liability just as Volkswagen lied about diesel emissions.

The extremes here are really weird — especially with people who do not actually own a Tesla.

Let’s start with this — it was an accident. Truck driver’s fault.

The car had cruise control on (Traffic-Aware Cruise Control, to be exact). The logs will show if the brakes were applied, which automatically disengages cruise control — like any car with cruise control. Because this does not seem to have happened, the driver was either distracted or unable to apply the brakes.

Automatic emergency braking (AEB) DOES NOT stop the vehicle. AEB slows the car down to about 25 miles per hour, according to the Owner’s Manual. So even if there was a system failure where it “saw” a structure going across the road — appearing to be a bridge — it was a failure nonetheless.

I can’t believe trailers in the US don’t have barriers installed to prevent vehicles passing under. The Model S radar would have probably picked this up and braked. It’s a legal requirement in Europe.

Tesla staff members told congressional aides at an hour-long briefing on Thursday that they were still trying to understand the “system failure” that led to the crash, the source said. Joshua Brown was killed when his vehicle drove under the tractor-trailer. It was the first known fatality involving a Model S operating on the Autopilot system that takes control of steering and braking in certain conditions.