UPDATE: Tesla Says Autopilot Was Activated During Fatal Tesla Model X Crash

MAR 31 2018 BY DOMENICK YONEY

What might they reveal?

***UPDATE: Late last night, Tesla issued a statement after retrieving logs from the wrecked Model X. You’ll find the entire statement posted at the bottom of this page. The main findings include:

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

The fatal accident involving a Tesla Model X last week continues to be investigated, and as time goes by, more seems to be on the line for the automaker. Tesla has seen its share price plunge this week — it hit a low of $249.22 on the 28th after opening at $311.25 on the 24th — not only because it seems clearer that it will miss its downward-revised Model 3 production target, but because the safety of its Autopilot driver assistance software is being called into question as a result of the tragic incident.

Tesla module being recovered from crashed Model X

It’s impossible to predict whether suitable answers will ever be found for the cause of the crash, but investigators have been able to recover a couple of modules that may provide some clues. The California Highway Patrol’s Multidisciplinary Accident Investigation Team (MAIT) and the National Transportation Safety Board (NTSB), working together, recovered both the restraint control module and the infotainment module from the severely damaged Model X.

Investigators will work with Tesla to download and interpret the data in hopes of finding some indication of what went wrong in the moments leading to the accident. The automaker addressed the situation publicly in a blog post, blaming the severity of the damage on a crash attenuator that hadn’t been replaced following an accident at that same spot eleven days earlier.

ABC News 7 reports that the victim’s brother, Walter Huang, told investigators his sibling had complained about the vehicle’s behavior, saying it had previously veered toward that same barrier on seven to ten different occasions while Autopilot was engaged, and was the main reason he had brought his vehicle in for service recently. At that visit, he says, the company could not replicate the anomaly.

The CHP is aware of that damning claim and “has been acting on it for some time now.” For its part, Tesla has said it only has a record of a complaint about navigation, which is a different system. It also noted in its blog post that Teslas have traveled past that location 80,000 times before without incident, and continue to do so at a rate of about 200 times a day.

No doubt this story is far from over, but until facts can be ascertained — hopefully with help from these recovered modules or other clues — the cause of the accident is speculative. We will, of course, be following the extended aftermath of this tragedy.

Tesla statement below:

An Update on Last Week’s Accident

Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.

The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.

Source: Teslarati, ABC7


219 Comments on "UPDATE: Tesla Says Autopilot Was Activated During Fatal Tesla Model X Crash"


Scary/Eerie stuff. Wait till some hacker sabotages these systems.

I want an EV that is not hackable, at least not wirelessly.

Why? All cars are crackable now. Plenty of situations on ICE cars where they get owned.
And Tesla is one of the most secure, if not the most secure, vehicles going.

No, all new cars are not able to be remotely/wirelessly controlled.

Correct. ANY computer can be “hacked” if you have physical access to it.

Yeah, no shiz, sherlock. That’s what I said: wireless, bud.

If someone hacks into a normal car what are they going to do, turn your engine off? It would be annoying, maybe a bit dangerous, but probably not fatal. These autonomously driven cars are like straight out of a sci-fi movie. In the future the Russians won’t have to resort to using nerve toxins, they just have your car “accidentally” drive into a wall.

Many newer non-EVs still have electronic systems including steering. Additionally, since all the systems are electronic, there are other ways to carry out mischief than just driving into a wall.

I think if I had nearly crashed into a concrete barrier several times on AP, I’d just start driving that portion myself.

Horses were better at autonomous mobility

Right?!

The guy decided to beta test software that controlled his velocity when traveling down the interstate at 80mph. You’re telling me that a few glitches will change his mind?

Yes it is odd. If the reports that he complained about AP not working on this stretch are true, it’s mystifying why he kept trying to use AP. I was thinking he might have been complaining about AP in that stretch in order to set up a suicide to look like an accident, but with AP engaged and without any hands on the steering wheel that thesis is untenable.

Wow. So many extreme assumptions going on with your absurd comment. I hope the family of this victim does not come in here and read that.

Yeah, seriously…

I think a more plausible explanation for why he would have been using AP after recent glitchy episodes is that he thought the issue had been resolved by Tesla with a recent OTA update to AP2.

Just a guess – but I think a more reasonable one.

Also, remember that Walter was an engineer. It is ‘in the DNA’ of most engineers to troubleshoot issues like these – especially regarding the capabilities of their own vehicle.

Yeah, I’d think if it had been such a problem, yet you’re still willing to beta test Autopilot, you’re going to be pretty switched on and ready to take back control. In my experience (which is limited), it’s pretty easy to wrestle control back.

Right. If he was an engineer and it was his intention to test the new system at a point of potential trouble, then he would have been even more alert and ready to take over control at that point.

This crazy idea that the cause of the crash is that he was “just testing” the Autopilot update… well, it amazes me that even Tesla bashers would pretend to believe that. I’m literally chuckling out loud right now because the scenario is so ridiculous! 🙄

“This crazy idea that the cause of the crash is that he was “just testing” the Autopilot update… I’m literally chuckling out loud right now because the scenario is so ridiculous! 🙄”

Not sure about your use of quotation marks around “just testing”. Who exactly are you quoting?

My counter hypothesis (to someone here suggesting driver suicide) was that an engineer will keep testing and troubleshooting a perceived problem with their own vehicle – hence his reported 7 to 10 observations of swerving to Tesla. My separate hypothesis for why he would later let his guard down and not continue to remain vigilant was that he was satisfied that a recent OTA update had addressed the issue – presumably after additional post-update testing.

If you want a really good laugh, go back and review how you initially dismissed the possibility that Autopilot was engaged – misapplying Occam’s Razor no less.

Maybe, being a new software engineer and new owner of a trusted car, after being reassured time after time that the AP system was doing fine… he probably underestimated the danger.
Besides, this is the first time a Tesla drives into a barrier and ends up in a fatal fire.

Factually incorrect.

According to previous reports, bystanders pulled the victim’s body from the wreckage before the car caught on fire.

Unlike gasmobile fires after an accident, a battery fire in a BEV takes several minutes to actually start spurting out flames.

Thanks James, for a sensible comment.

If Autopilot tried to steer me to my death ONCE then I’d never use the damn thing again.

WTF is with these Tesla owners who think their cars are already fully autonomous? Do they believe their cars have achieved self-awareness as well? Wasn’t the first death of a guy who fully trusted AP enough of a warning?

WTF is it with serial Tesla bashers coming up with the least likely reason for something, rather than the most likely, and posting about that as if it’s established fact?

It really amazes me that y’all seem to think readers here are stupid enough to believe such cabbage. O_o

What are you on about now, fanboi?

“I think if I had nearly crashed into a concrete barrier several times on AP, I’d just start driving that portion myself.”

Yeah, that jumped out as a glaring discrepancy between the brother’s story and what happened in the accident.

If the driver knew there was a problem with AutoSteer at that point on the road, then why did he not take over driving there? It’s reasonable to guess that the driver was either very distracted, or asleep.

If he drove that route every day and AP had “only” failed seven times then there may have been a recent AP software update after which the car didn’t try to steer him to his death. Maybe he wrongly concluded the problem had been fixed and it was time to take a nap.

It seems callous to attack the victim’s survivors as a way to absolve Tesla of any responsibility for the crash, but I guess fanbois will be fanbois.

If this claim is true why in the heck was he continuing to use AP??? Swerving once or twice would have been enough for me but 7 to 10 times!!

Way to go on blaming the dead victim!

It is his brother that is making this claim.

” Walter Huang, told investigators his sibling had complained about the vehicle’s behavior, saying it had previously veered toward that same barrier on seven to ten different occasions while Autopilot was engaged,”

And I agree with the statement that it makes no sense that if true, he wouldn’t be extra cautious at that time. So I think the brother’s claim is false.

It might make complete sense. People had been talking up the recent autopilot update a lot. Talking about how much better it now performed in extreme situations, poorly marked lanes, construction, etc.

He possibly could have received the update and wanted to give it a chance.

You can easily test it with your hands on the wheel, though. There’s no reason hands should be off the wheel, even if AP is on.

I agree, he shouldn’t have removed his hands. Autopilot isn’t ‘hands free’.

But it doesn’t take long to find people on the internet looking to cheat autopilot. Look all over TMC, reddit, Youtube…

And if a person is feeling reasonably comfortable about the update and they need to reach down to grab something or look at their phone for a few seconds, Autopilot doesn’t do much to stop them. If they ignore the text warnings, the audio warnings do not immediately sound.

I dunno, as I’ve said before, I’m just skeptical about these semi-autonomous systems from all manufacturers. It’s cool, but I don’t particularly want it until it is fully autonomous. 😉

Technically, cheating is going over 55 mph. I may not be the most well read on how AP works, but I understand the steering torque applied by the system is limited. IOW, AP never wrenches the wheel away from you, but instead veers.

Just like there is a specific torque spec for GM’s ignition switches not to turn by themselves (or from a heavy key ring), there is a max torque spec for AP Autosteer, which makes Tesla’s “5 seconds” comment relevant. Still, I don’t remember another accident where Tesla’s defense was how long someone had a sightline to a stationary object.

I doubt enough Tesla owners are aware their cars do a much better job adjusting for moving objects. It’s counter-intuitive.

The whole “hands on wheel” thing isn’t really an excuse. Tesla has pushed the AP thing way too far. Plenty of people put more faith in the system than is warranted. And even if you discount what happens to them, they also could be a danger to others on the road.

If this car just blatantly drove into a barricade with no other cause then that’s pretty damning.

Tesla states: “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.” Tesla is trying to mislead folks into concluding that the driver was not holding the wheel, and that there was an active alarm in the final moments telling the driver to do so. Here is my take on this misleading statement: (1) The statement regarding an audible hands-on warning was for “earlier in the drive” – not during the critical final few seconds. If there had in fact been an active audible warning in the final few critical seconds when it could have made a huge difference, Tesla would have more clearly indicated this in the statement. Instead, we get this “earlier in the drive” statement. (2) Autopilot detects the driver’s hands on the wheel by a means of feedback in which the driver must momentarily jerk the wheel to provide the necessary feedback to AP to reset the warning and warning timer. The driver was likely holding the wheel but had not provided this ‘jerk’ feedback for at least six seconds because he was letting… Read more »

Wow, just how far down that rabbit hole you went!

Reality check: If the driver had been holding the wheel, then he would have been in control of the steering and would be responsible for the accident.

Honestly, sometimes the screwball mental gymnastics of Tesla Hater cultists, in making “pretzel logic” arguments, wind up twisting things around so far that they bite themselves in the back!

Downright hilarious!
😀 😀 😀

“Reality check: If the driver had been holding the wheel, then he would have been in control of the steering and would be responsible for the accident.”

I respectfully disagree. Tesla advises its customers to hold the wheel when using Autopilot. If holding the wheel (so as to be in the best position to take over from AP) equates to being “in control”, then honestly – what is the point of Autopilot?

Yeah, that was my first thought: That the brother just made that up to support his planned lawsuit.

If AutoSteer had repeatedly tried to steer the car into a concrete barrier at the same place several times, any reasonable person wouldn’t merely complain about it to a relative; he’d stop using AutoSteer, at least during that portion of the drive.

I said that yesterday. Sorry, but after the 2nd swerve, I’m taking the car in.

Available evidence suggests the accident victim had irrational faith in his car’s AP. No rational human would put his life at the mercy of a system that had previously tried to kill him, not once or twice, but seven to ten times.

Now, people are going to think navigation and autopilot form a lethal link, when there is no customer beta going on for a level 5 prototype.

I personally stick to well proven level 1 tech like emergency braking and lane warnings where the human is required to be in control. When Level 5 is reliable, I’ll happily switch over. That likely won’t happen till well into the 2020s.

And, even in the future with Level 5 autonomy, you are going to need roads with all the necessary paint markings, etc, to be able to have enough redundancy in the tech, to absolutely guarantee your safety, in all adverse conditions and situations.

It would seem that an autonomous freeway drive from LAX to New York JFK, for example, will be happening about 10 years before a drive from Key West to Anchorage via the Alaska Highway!

Agreed, but with our Second World grade infrastructure here in the US, it’s going to be a challenge. You have to travel abroad to realize how bad our roads are, and how much better they could be.

I agree with this. Level 1 stuff can only make the car safer (they’ll kick in during a situation where maybe you missed something). Once you start relying on the car to drive itself, you stop paying attention and then if the system fails you’re in trouble.

I don’t care whether you’re technically supposed to keep your hands on the wheel. Even if you do, and even if your eyes are on the road, if you’re relying on the system you’re not totally engaged and you’ll be slower to react.

That’s absolutely correct.

Exactly.

Level 5 AV is much farther away than most people realize. It’s one thing for a car to drive itself in sunny Silicon Valley, but quite another to drive on typical US roads in the snow belt. It also sets up situations in which drivers will not only be inattentive but also out of practice driving. If a car drives itself all summer long but then on bad winter roads it suddenly demands the driver’s action to avoid danger, then how prepared will that driver be?

This is why I am not going to pay for Autopilot option on my Model 3 when it is time for me to confirm it.

I don’t want to pay $5K to be beta tester/crash dummy.

(⌐■_■) Trollnonymous

I prefer they sell versions without all the AP crap. That would be the version I buy.
Oh yeah, and don’t make it a necessary/forced upgrade to be able to get other packages either like AWD.

That was an initial complaint of mine; unless AP utilizes hardware that needs to be present anyway, one would have to assume it’s baked into the price.

That’s certainly an option, but you should keep in mind AP is owners’ favorite feature for a reason.

because wealthy people can never have enough toys?

No, because AutoSteer (not “AP”) makes driving both easier and safer.

One single accident, no matter how horrible it is, doesn’t change the fact that Tesla cars with AutoSteer installed have a ~40% lower accident rate than those without AutoSteer.

” and was the main reason he had brought his vehicle in for service recently.”

It should be easy enough to confirm whether he had recently been into Tesla for service, and whether that was the reason for having it serviced.

Very disappointing. Autopilot may not be able to stop every accident, but driving into a concrete barrier at full speed should not ever happen.

+1

Major fail on Tesla’s part.

For all those blaming Tesla, the driver was alerted repeatedly to drive and had five seconds to grab the wheel. This is sad, but I’ve heard Tesla drivers try to override the driver detection system by hanging a weight on the steering wheel or jamming something on it. My guess on the reason for the crash is that the barrier distance moved by someone hitting it and that messed up the data pattern for the area. If you have a section mapped at a certain distance and all of a sudden it moves on you, then the pattern recognition gets out of whack. This sort of “expert system” approach to AI is where the problem lies. The vehicle is data dependent. Rather than knowing what a barrier looks like, it relies on massive quantities of maps and scanning data. It is an irregular shape with a scrolling pattern on it, that probably confuses the sonics or radar. Hey Elon: start using the stereoscopic imaging to determine what is an immovable object. Do it over a time period. Then you could build a track on private land with all kinds of objects and shapes and it would always get it right…

That doesn’t explain why other Teslas aren’t having the same issue, according to Tesla.

It could be caused by anything: slightly different lighting conditions, slightly different traffic level, slightly different position in lane. All came together to cause a slight glitch and the car plowed right into the barricade. It could even have been a computer glitch for all we know. I’m sure I’m not the only person who’s experienced an occasional program crash on an otherwise stable computer. Fortunately when an office program crashes, it doesn’t lead to a physical crash.

Not even remotely an excuse. A system where people are lulled into a false sense of complacency is dangerous, regardless of whether there are ineffective warnings to hold the wheel.

There’s a reason other companies haven’t been as aggressive as Tesla in rolling out their self driving systems: it’s to make sure stuff like this doesn’t happen.

“For all those blaming Tesla, the driver was alerted repeatedly to drive and had five seconds to grab the wheel”

I would be more likely to agree with you if I believed that the car’s warning system was actively warning the driver with 5 seconds to spare to make a correction (not that this is a ton of time). Unfortunately, Tesla’s statement refers to warnings “earlier in the drive” and obfuscates the warning status in the final critical seconds with its misleading “six seconds” reference. I strongly suspect that there was no active warning for the driver in the fateful final seconds. Tesla needs to clarify their statement. It is intentionally misleading in my opinion.

“Hey Elon: start using the stereoscopic imaging to determine what is an immovable object.”

Software-based stereoscopic optical object recognition simply isn’t reliable enough for Level 4/5 autonomy.

Fully autonomous cars need real-time 360° active scanning, day or night, using lidar or high-resolution radar. That is what is needed to reliably avoid colliding with obstacles, whether those obstacles are fixed concrete barriers or other cars on the road or pedestrians.

I guess what you’re suggesting then is that the driver should have been paying attention and holding the steering wheel – as Tesla so often states…

That canned line isn’t an excuse because the nature of these self driving systems is to lull the driver into disengagement. Even if you keep your hands on the wheel and eyes on the road, you won’t pay as much attention since you’re not in complete control. And the longer you use them and so far haven’t had problems the more complacent you become, so when a glitch finally occurs you’re looking down at the radio or distracted by something else.

Frankly these systems shouldn’t even be rolled out until they are absolutely 100% trustworthy and capable of acting on their own. Having an “autonomous” system where the driver needs to remain fully engaged is worse than not having one at all.

“Frankly these systems shouldn’t even be rolled out until they are absolutely 100% trustworthy and capable of acting on their own.”

No, a thousand times no!

Air bag systems are not “absolutely 100% trustworthy”. Should we all turn ours off? No! The proper question is this: Are you safer with the system, or without it?

According to the NHTSA, in a Tesla car you’re considerably safer using AutoSteer than not using it. One single accident doesn’t change that reality, any more than 20 fatalities from exploding air bags changes the reality that you’re safer using air bags than not.

Human beings are not very good at judging relative risk, and comments like the one above are reflection of that.

You are confident that he was not holding the steering wheel based on Tesla’s statement?

Dude, you don’t know when to stop with the Tesla hater FUD, do you?

If the driver was holding the wheel, then he — and not AutoSteer — was responsible for the crash.

Duh!

I also note the failure on your part to acknowledge you were wrong about your insinuation that if AutoSteer was in control of the car, Tesla would never admit it.

People need to understand that autonomous or autopilot crashes will be different from human ones. This is why they need to be so much more reliable. Even with fewer crashes, the ones that happen may be “dumb” for humans. Favorable statistics won’t compensate for this, unless they are extremely favorable.

1. That same barrier had previously been crashed into, almost certainly by a human-driven car. That’s why the barrier was collapsed. Suggesting that this kind of accident would be unusual for a human-controlled car simply doesn’t pass a reality check.

2. I dunno about you, but I think the NHTSA’s finding of a ~40% lower accident rate for Tesla cars with AutoSteer installed, vs. not installed, is pretty significant, and I don’t see how any reasonable person could avoid concluding that by now, AutoSteer must have saved more lives than it has cost.

Avoidance of accidents doesn’t get reported very often.

R’ught R’ohh

That’s not going to be a good thing for them.

But again if it does it over and over and over and over (7 to 10) times why would you have it on there yet again!?!?


I can see how it could happen. There is a lot of hype over Tesla’s OTA updates and ability of their system to learn. So if you kept having to correct it, based on the hype, you would expect it to learn. And possibly the driver got the new update?

Horses,
I speculated something similar in a different thread yesterday:
“IMO, If he let his guard down and trusted AP at that point, it would have only been because he had concluded that Tesla had successfully resolved the issue – perhaps through a recent OTA update.”

Sure, this system nearly steered me into a concrete wall at 75 mph, and it did so over and over again, but this software update will finally fix it! I’ll watch YouTube while the new AP takes care of me…BOOM! BODY RIPS IN TWO HALVES.

I’m continually astonished at how people who can afford Teslas can be so mind-numbingly stupid. It’s yet more proof of the weak correlation between money and brains.

…or maybe it’s a rather strong indication that the entire ridiculous argument is based on something that’s not true.

Duh.

What’s not true, fanboi? Let me guess, you think the brother is lying? Because he’s a “serial Tesla basher,” I suppose?

Sure, that makes sense. The brother is going to lie in order to portray his just deceased brother in the worst light possible. Your fanboism has clouded your mind.

“We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.”

Nice round number. Sounds really good. Plays really bad, IMO. Tesla has no eye following hardware in cars. They can make the beeps “10 times” more frequent, and annoying, though. I hope that’s enough, if it comes to it.

Lots of press, this evening.

pjwood1,
Using ‘eye following hardware’ is actually a really good idea. The combination of an attentive driver and the current AP (level 2 and beta that it is) is still statistically safer than no AP.

Would eye-following hardware have saved the day here? It’s hard to tell. But at least it would put an end to the idiot Tesla owners being able to engage AP while sleeping, sitting in the passenger seat, sitting in the rear seat, playing cards, smoking dope, etc. Its required use on the Uber test vehicle in Arizona would have very likely prevented that tragedy as well.

Perhaps it is time for regulators to start requiring some form of ‘driver attention monitoring’ for all semiautonomous vehicles on public roads. When we get to full autonomy (level 5), the requirement could be eliminated.

I think Supercruise works using eye tracking to make sure you are completely focused on the road.

But still, I don’t see the point of either system. I am already stuck behind the wheel. If I can’t trust it enough to look away for 5 seconds, then I should probably remain completely focused and do the driving myself. I’d be less bored driving in traffic than watching intently as my car drives in traffic.

According to Tesla it reduces your chance of death and physical harm significantly, which is the point of having all this automation. Isn’t that worth doing?

That’s what they claim. I’d say it’s not clear that it’s actually safer than a manually driven car with driver assistance safety features. Most of the things that improve safety don’t require giving up full control of the car.

How about disengaging AP when hands are not on the wheel, like ProPilot does?

> “We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.”

That has to be validated by an external impartial lab, and it has not.

As far as I’m concerned, just treat that as a marketing blurb, with no weight on it.

Yes. Excepting statistics.

Fair enough, but I think the Gov’t should mandate a series of tests for the 5 levels of assisted/autonomous driving.

Going forward, it will help the manufacturers, and build confidence by the public.

Right now, it’s a bit of a wild west, with all kinds of unvalidated claims.

As far as I’m concerned we should treat brand new usernames who just suddenly showed up on InsideEvs to bash Tesla on an important Tesla thread as what they almost certainly are:

Existing trolls who re-registered a new username to lamely try to appear to be multiple people spouting the same anti-Tesla FUD or “Concerns” as the tactic of concern trolling.

Take your Tesla fanboy glasses off.

So many industries are tested by outside labs for validation, and what I suggested is good for the industry and for Tesla.

Because Tesla can hold up the validation to the whole world, and say, we passed this test.

Oh noooooo, someone actually brought up a good valid criticism of my precious Tesla! Now I’m going to call him a shorter/hater etc., because I lack maturity. Grow up baby.

My response to your MULTIPLE Tesla bashing threads was that as a brand new username you are probably a troll.

I have no problem with the Govt or some independent lab (not one owned, controlled, or influenced by the legacy auto OEMs) testing the various Levels 2-5 systems coming out.

Yeah, and based on your knee-jerk responses to my post, my guess is that you’re a Tesla fanboy, for whom Tesla can do no wrong.

How dare anyone criticize my precious Tesla. Boo hoo!

Maybe if you quit the name calling, we can have a civil discussions around here, as you can see, name calling, just invites name calling.

> MULTIPLE Tesla bashing threads

Also you need glasses. There are no “MULTIPLE Tesla bashing threads” by me.

Show me them?

pjwood1 said:

“[quote] We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars. [unquote]

“Nice round number. Sounds really good. Plays really bad, IMO.”

I’m not sure whether this comment was posted before or after the update including Tesla’s lengthy press release about the incident, but the latter included this:

“If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.”

Tesla is not claiming that Tesla cars equipped with Autopilot + Autosteer are “autonomous”, any more than equipping an airplane with an autopilot makes the plane autonomous.

According to those who rate such things, Tesla’s system is only Level 2 autonomy. A fully autonomous car would be Level 4 or Level 5.

https://www.caranddriver.com/features/path-to-autonomy-self-driving-car-levels-0-to-5-explained-feature

(⌐■_■) Trollnonymous

“Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken. ”

And…

“the victim’s brother, Walter Huang, told investigators his sibling had complained about the vehicle’s behavior, saying it had previously veered toward that same barrier on seven to ten different occasions while Autopilot was engaged,”

IMHO, This sounds like he was testing AP to the bleeding edge/threshold.

I wonder if the logs indicate that AP was going to slowly disengage? I think that’s what it is supposed to do when it senses the driver does not have their hands on the wheel for an x amount of seconds.

I don’t buy that he was “testing AP to the bleeding edge” for a second. Tesla’s carefully crafted statement seemingly links the warnings “earlier in the drive” with what happened in the final seconds. I think that linkage is misleading – especially if the noted alarms occurred several minutes earlier. We are left with the distinct possibility that Autosteer steered into the gore area – mistaking it for a travel lane due to its unusual width and parallel nature. The driver likely had no alarms during the final seconds – otherwise Tesla would have more explicitly and clearly stated so instead of mentioning warnings that occurred “earlier in the drive”. To me, it looks like AP made a lane recognition mistake that put the car into the gore area several seconds before impact. Once the faulty lane change occurred into the gore area, the “driver had about five seconds and 150 meters of unobstructed view of the concrete divider.” These words seem to confirm the car was in the gore area – not a legitimate travel lane behind other vehicles. Was there an alarm at that time to warn the driver of the urgent need to act? I would say…

My feelings exactly. How long does the AP stay engaged after warning the driver to grab the steering wheel?

Not an owner yet, but have test driven the Model S and X. Owners might know more about the current implementation

From what I recall there is a significant amount of time you can let go of the wheel before the audible warning goes off. Initially the car only gives a ‘put hands on wheel’ icon on the dash. It’s tiny – if you have looked away, there is no chance of seeing it.

If an audible warning was going off 5 seconds prior to the crash, Tesla would have said so in the blog. I don’t recall how long it takes for the audible warning.

https://forums.tesla.com/forum/forums/how-much-hands-wheel-time-needed

From Tesla Driver last year: “I think it’s about every 2 minutes. I just shake the wheel or slightly turn it to one side, but not enough to disengage it. It takes me about 1/10 of a second and then I do it again when the screen starts flashing again. I’ve only had it give me an audio warning once because I pay enough attention to notice when it starts flashing. One should always be paying attention, autopilot isn’t perfect yet.”

“I don’t buy that he was ‘testing AP to the bleeding edge’ for a second.”

No reasonable person would “buy” that argument. Doubly so if the driver was an engineer, as some comments here claim!

If, as the brother claims, AutoSteer had previously tried to steer the car into the barrier at that point repeatedly on prior occasions, and the driver was testing the AP upgrade to see if it performed better, then he would have been alert and ready to grab the wheel if it happened again.

Tesla says that the driver didn’t have his hands on the wheel at the time of the accident — or at least that the car didn’t detect any attempt by the driver to steer.

That’s not “waiting until the last second” to test the system. It’s just not paying attention to where the car is steering itself.

And insinuating that Tesla would be lying about that makes even less sense. If Tesla was going to lie about the accident, then it would simply claim AP/AutoSteer wasn’t in control at the time of the accident.

(⌐■_■) Trollnonymous

Maybe they should follow GM’s lead with the 124 deaths from the ignition scandal?
What was it? $120 million paid out for 124 deaths? So GM values its murdered customers at less than $1 million each?
https://www.usatoday.com/story/money/cars/2017/10/20/gm-settles-deadly-ignition-switch-cases-120-million/777831001/

Oh wait, it’s actually worse/less than that. The dead have to share it with the injured and maimed.

At the rate Tesla is going, it’ll catch up to GM’s death count rather quick!

Hey Bro Troll here is a little clue. GM knew about the problem, continued to manufacture them anyways knowing they have killed and would kill more, when confronted by consumers they got lawyers to try and sue them, they lied to the feds and it wasn’t until the government got involved and started going through all of their emails and backlogs that they finally confessed/were outed. Then came the criminal charges.

Ya that sounds a whole lot like what Tesla is doing.

BTW didn’t the same autonomous system that is going on the bolt just run over a pedestrian?

Yes, one pedestrian is no longer with us. The Chevy Bolt, with Uber autonomy, didn’t have a chance to stop, due to many unforeseen issues. Unfortunately for many unsuspecting pedestrians and drivers, autonomous vehicles will be part of making us safer, that is, until we actually are.

There wasn’t a Chevy Bolt with Uber autonomy in the recent accident. The Uber pedestrian incident was in a Volvo XC90, with Uber-tech (Volvo’s tech was also turned off). GM talked of potentially partnering with Uber after its rideshare partner, Lyft, talked of competing with GM. However, GM is also considering its own network, and the tech on self-driving Chevy Bolts is Cruise Automation tech, not that of Uber. I may be wrong or have misread, but my understanding was that if GM partnered with Uber with its self-driving Bolts, those cars would be GM/Cruise-outfitted vehicles for use by Uber, not Uber-tech cars.

I thought they both were Mobileye. Call it, name it whatever, but if it’s the same system it’s the same system.

GM was in some process of teaming with Mobileye for HD mapping data, but the continuance of such plans hasn’t been confirmed. Cruise Automation’s system is completely independent of any system used by Uber. GM acquired Cruise and they are building self-driving Bolts in-house, with no partnership situation with Uber or Mobileye related to the actual autonomous tech. There’s no connection between these systems, their engineers, etc. At CES in 2018, Uber revealed a new partnership with NVIDIA to produce its own autonomous vehicles.

Uber AV that killed the pedestrian has nothing to do with GM. Nice try though.

And 2 fatal accidents by Teslas running with AP (this one and the Florida one where the stoned truck driver made an illegal turn in front of the Tesla) do not equate to GM’s 124 confirmed deaths from their ignition switch scandal.

But nice try at setting up a false equivalency there mental MadBro!

@bro1999 @Get Real

GM vehicles recalled worldwide due to ignition switch: ~29,000,000
Deaths: 124
Fatalities as a percentage of affected vehicles: ~0.0004%

Tesla vehicles with Autopilot through April 2018: ~120,000
Deaths: 2
Fatalities as a percentage of affected vehicles: ~0.0017%

So who wins this argument?

Neither! Because arguing “My car company killed less people than your car company” is a losing argument for both parties. 😉
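Purely as an arithmetic check on the figures in the comment above, here is a minimal sketch of the per-vehicle rates. The vehicle and fatality counts are the commenter's rough numbers, not official totals:

```python
# Fatalities per affected vehicle, recomputed from the rough counts in the comment above.
gm_deaths, gm_vehicles = 124, 29_000_000         # ignition-switch recall population (commenter's figures)
tesla_deaths, tesla_vehicles = 2, 120_000        # Autopilot-equipped vehicles (commenter's figures)

gm_pct = gm_deaths / gm_vehicles * 100           # ~0.0004 %
tesla_pct = tesla_deaths / tesla_vehicles * 100  # ~0.0017 %

print(f"GM:    {gm_pct:.4f}% of affected vehicles")
print(f"Tesla: {tesla_pct:.4f}% of affected vehicles")
```

As the commenter notes, neither rate says much on its own; the denominators and time periods are very different.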

Thank you.

Wade, my points are that:

#1 Obviously 124 deaths>2
#2 GM actively covered up those 124 deaths whereas Tesla has actively worked with the investigatory agencies to recover the data and have NOT covered up like the OLD GM did with the much broader ignition switch crime.

The Bolt is completely unrelated. It uses a Cruise Automation system, which has nothing to do with Uber.

I thought they both used Mobileye. I know Uber has it and so does GM.

Good to know, I was mistaken, obviously! Thanks for the correction, and the accurate information about who is partnering with whom on all the autonomous ride share platforms.

“At the rate Tesla is going, it’ll catch up to GM’s death count rather quick!”

Hmmm, based on two people killed in cars driven by AutoSteer in something like 2-1/2 years… versus probably dozens of lives saved, at least?

Well, that’s about par for the course for your Tesla bashing claims. Not merely contrary to the facts, but completely Ludicrous™!

OK – ‘Real Quick’ – is a Stretch, but they could actually catch up – if they do kill one person each 2 years, in about 60 Years!

How is the GM ignition switch debacle even remotely analogous to this Tesla crash?

You sound like a fanboi trying to excuse Tesla for doing something wrong, but by making that analogy you actually insinuate that Tesla is criminally responsible for this latest crash. That’s quite a stretch, don’t you think?

I agree, but such fallacious reasoning doesn’t hold a candle to the howling-at-the-moon ridiculous arguments being made by several Tesla Hater cultists here!

Seriously, someone intentionally “testing AP to the bleeding edge” didn’t even grab the wheel at the last second?

Who would be stupid enough to believe such obvious cabbage?

How surprising. Another fanboi can’t have a civil discussion without lapsing into Ludicrous mode.

Key point is ** moments **. Drivers make choices in how they use driver assistant features.

“In the moments [a few seconds?] before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. [1 out of 7] …”

Which resulted because of Tesla advertising AP as a form of self driving when all it really is is glorified cruise control. Keep spinning, dbag Scott.

LMAO, serial anti-Tesla troll and d-bag, GM dealership employee mental MadBro tells others to not do what he does all the time in carpet-bombing fashion

I’m glad Madbro has set the record straight, his carpet bomb approach, along with his barrel bomb back up, is a bit overwhelming.

(⌐■_■) Trollnonymous

I’ll beat my same ol drum here.
I prefer they sell versions without all the AP crap. That would be the version I buy.
Oh yeah, and don’t make it a necessary/forced upgrade to be able to get other packages either.

There’s a new ABC news video where they drive in the HOV lane where the accident occurred. At 2:04 into the video, it shows that the HOV lane and the two lanes to the right are asphalt, then suddenly the two lanes to the right of the HOV lane turn into concrete while the HOV lane stays asphalt. The break between the concrete and asphalt then starts to move over into the HOV lane, then back out of the lane, right into the barrier!!!

See video in link below:
http://abc7news.com/automotive/exclusive-autopilot-part-of-tesla-crash-investigation-i-team-rides-in-model-x-to-site/3284757/

Yeah that looks really bad. I can see how a computer might have been confused by that.

Also, hands on the wheel really doesn’t matter. It’s eyes on the road. If you looked down for just a couple seconds to change the radio station, or grab a drink or whatever, that would be enough time for the car to veer out of the lane and into the barrier there.

I wonder what the previous crash there was? Hopefully it wasn’t another Tesla.

Hey, thanks for this link AnonyMouse! That was pretty good news coverage, with some very thoughtful comments. “At 2:04 into the video, it shows that the HOV lane and the two lanes to the right are asphalt, then suddenly the two lanes to the right of the HOV lane turn into concrete while the HOV lane stays asphalt.” But it’s not “suddenly”. The one lane changes from asphalt to concrete quite gradually, on a very shallow angle across the lane, taking several seconds at highway speed to run the distance over which the lane fully transitions from asphalt to concrete. I can easily see why Autopilot might confuse that sharp contrast between asphalt and concrete with the edge of a lane. So if that was the cause, then why did ~200 Tesla cars pass the same point within the two weeks since the barrier was collapsed by a previous (non-Tesla) accident, all without crashing into the barrier? It looks like the gradual replacement of asphalt with concrete in the one lane was a contributing cause to the accident, but likely not the only cause. If the report is right, that this engineer had complained several times about his car “trying”…

I have to add that I am happily surprised that it took this long for something like this to happen. With the videos we have seen of no driver in the front seat and everyone in the back, etc., while it’s clear that Tesla wants it to be driver assist, people think it’s more.

You have to wonder, if there were LIDAR in addition to the rest, whether it would have realized that the concrete barrier was solid and not where the car wanted to go, and either driven in the proper lane or slammed on the brakes.

1. Yes, I do find it amazing that all those “Look Ma, nobody in the driver’s seat!” videos posted, when AutoSteer was new, didn’t result in any fatality. If any of those idiots had been killed, they would have deserved a Darwin Award!

2. It took me some time to realize, and I think most readers here still don’t realize, just how little data a Tesla car gets from its low-resolution Doppler radar detector, and how vague the “picture” it gets even with the forward view, let alone the 360° “picture” that a car would need for reliably safe fully autonomous driving. Tesla’s cars are very, very far away from having the reliable SLAM* they will need for Level 4 or Level 5 autonomy. A reliable SLAM needs real-time active scanning using lidar and/or high-resolution radar.

*SLAM stands for Simultaneous Localization And Mapping technology, a process whereby a robot or a device can create a map of its surroundings, and orient itself properly within this map in real time.

A picture is worth 1000 words: It was only after seeing the picture linked below that I understood how very “fuzzy” the picture is, which a car gets from an anti-collision…

“driver had about five seconds and 150 meters of unobstructed view…”

This translates to 30 meters / second, which is roughly 67 MPH. He was probably going no more than 70 MPH.
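For anyone who wants to reproduce that back-of-the-envelope conversion, here is a minimal sketch; the 150 m and 5 s figures come from Tesla's statement, and the rest is just unit math:

```python
# Rough speed estimate from Tesla's "five seconds and 150 meters" figures.
distance_m = 150.0   # unobstructed view of the barrier, per Tesla's statement
time_s = 5.0         # time available before impact, per Tesla's statement

speed_ms = distance_m / time_s           # meters per second
speed_mph = speed_ms * 3600 / 1609.344   # convert m/s to miles per hour

print(f"{speed_ms:.0f} m/s ≈ {speed_mph:.0f} mph")  # -> 30 m/s ≈ 67 mph
```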

Just guessing here, speed limit on that part of the freeway is 55 mph?

I can’t recall a california freeway with a speed limit below 70 for anything other than big rigs.

No, California rarely allows 70 mph. Since this is near Mountain View it almost can’t be over 65 mph, and may well be 55 mph.

Not that I think it makes any difference. The speed limits were changed not because they were dangerous, but because the Federal government was trying to save gas.

A crash at 80 mph has a lot more force than one at even 65 mph. People cavalierly speed, and for the most part it’s not going to kill you. But if you do have the misfortune of getting in an accident you’d be much happier to have been driving 65 rather than 80.

I’m 34, born and raised in southern California with a driver’s license since 16. I reiterate: I cannot recall a freeway with a speed limit below 70 MPH for anything other than a big rig. Do I need to take photos as evidence? Because you’re clearly a DFC, here you go: http://www.dot.ca.gov/hq/roadinfo/70mph.htm (read the text to see that ’55’ is a ridiculous notion).

As a SF Bay Area resident there are very few highways where you can drive at the posted limit. General traffic speeds are 70-80 MPH routinely.

You can always drive at the posted speed limit. Just stay in the right lane and people will have to go around you.

That being said, during my very limited time in the Bay Area there was so much traffic that it was impossible to even go at the posted limit, let alone over it.

Tesla’s statement:
“The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe.”

Is this mere arrogance – or panic ?
However, the consequences for Tesla of a ban of this “autopilot” would be extremely severe.

1. Let’s be accurate: The question is about using AutoSteer, not Autopilot. The driver can switch AutoSteer off, but Autopilot is always on.

2. Tesla is 100% correct here. There’s no logical argument against it. The NHTSA reports that Tesla cars with AutoSteer merely installed — not necessarily operating — have a ~40% lower accident rate. Logically, that means that when under control of AutoSteer, the accident rate is reduced by even more than 40%!

Arguing that everybody should turn off AutoSteer because of one (or two) fatal accidents would be every bit as foolish and stupid as arguing that everybody should turn off their air bags because there have been ~20 fatalities associated with exploding air bags.

In the real world, there will always be fatal car crashes. Autonomous driving, or semi-autonomous driving such as with AutoSteer, isn’t going to lower the accident rate to zero. Asking for that is wishful thinking; it’s pretty strong wishful thinking.

In the real world, where things such as inertia and momentum exist, moving down the road at highway speed is always, always going to have some danger associated with it.

arrogance? tesla? naaaaaaaaaaah

of course ” the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view” might sound a little like panic to some.

Autopilot was on

The barrier was not detected by Autopilot.

The brakes would have been applied if Autopilot had detected the barrier.

Autopilot ignores static objects in the road.

That’s crazy

There is no automatic braking system on the market today that will detect static objects at highway speeds.

Daimler trucks have automatic braking with crash avoidance up to 80 kph / 50 mph into static objects, which is the legal truck speed limit in Europe.
The same package can be bought in the US if you buy the Detroit Assurance package.

Automatic braking into stationary objects is possible. European trucks can only be sold if they brake off at least 20 kph before crashing into stationary targets; this is a European legal requirement for trucks. Every new truck needs to have an automatic braking system that can do at least that much, and most can do better.

“Daimler Trucks have Automatic braking with Crash avoidance Up to 80 kph/ 50mph in static objects.”

But stationary obstacles (static objects) of what minimum size?

I recently saw an older Tweet from Elon Musk where he was musing about such things. He speculated that if the object was the size of a moose, then Autopilot should detect it; but if smaller, it wouldn’t. (Obviously that doesn’t hold true in every case, or we wouldn’t have a report of a Tesla car under AutoSteer running into a stationary fire truck. Maybe Elon was only talking about moving objects in that Tweet.)

The end of the collapsed safety barrier, in front of that narrow concrete wall, was a lot smaller than a moose!

Can Daimler’s Crash Avoidance system do better? (Not a rhetorical question. Maybe it can; maybe it can’t.)

Yes, there was an earlier story about this, when a Model S hit a fire truck:
https://insideevs.com/tesla-autopilot-emergency-braking-systems-blind-parked-fire-trucks/

Why didn’t the car stay in the same lane?

Was the car going to change lanes?

Insufficient data to answer that question.

But it has been speculated, and I think reasonably, that Autopilot may have been confused by the gradual replacement of asphalt to concrete in that lane. Autopilot might have interpreted the color contrast between those two surfaces as the edge of a lane, as the line drifted right.

See video in this article:

http://abc7news.com/automotive/exclusive-autopilot-part-of-tesla-crash-investigation-i-team-rides-in-model-x-to-site/3284757/

It’s time to put an end to using your own customers as beta testers for potentially lethal software. This “autopilot” software must be recalled and only used in a controlled environment by trained people until it can be shown to conform to at least a minimal safety standard.

I see a huge lawsuit coming with this. Tesla is screwed, if it wasn’t already.

Or require ‘eye following hardware’ in conjunction with semiautonomous systems on public roads to ensure driver attentiveness. It might have prevented a fatality here, and would almost certainly have prevented the recent Uber tragedy in Arizona.

Serial Tesla basher “Someone out there” said:

“This ‘autopilot’ software must be recalled and only used in a controlled environment by trained people until it can be shown to conform to at least a minimal safety standard.”

Only two fatal accidents in ~2-1/2 years, and almost certainly dozens of accidents avoided, probably some of them potentially fatal.

I’d say that Autopilot + AutoSteer already has been shown to conform to more than a minimal safety standard!

But hey, if you demand 100% safety in a system before you use it, then be sure to shut off the air bags in your car. Those things have killed lots more people than Tesla Autopilot has!

It is at least 4 fatal accidents and that from only ~230k vehicles, making it one of the worst performing “safety” systems on the market.

Nice of Tesla to blame the driver. If their system can’t keep from crashing into a stationary object, nor detect if the driver is paying attention, then it should not be allowed. All of the AP vs Supercruise reviews say that Supercruise is better at these functions. It has a camera on the steering column to verify the driver is paying attention. And it uses detailed pre-mapped routes.

Tesla released AP2 before it was ready for prime time. But since the split from Mobileye, they haven’t been able to keep up, never mind lead the field. Since they’re always on the edge of failure, they’re willing to take risks like beta-releasing AP. They knew full well people would think it’s a true autonomous driving system, despite legal disclaimers. But they needed to take the risk to keep people buying their cars. Their actions have cost at least one life.

What caught my attention and suspicion was the initial blog post from Tesla, where they were quick to blame Caltrans for the death of the driver because the median crash attenuator had not been reset.

In addition, Tesla went out of their way to speak how great AutoPilot was.

Instead of saying they are working with local authorities and NHTSA to determine the root causes of the accident and stop there.

The first order of business is to deflect attention away from the problem as it applies to the current situation. The second is to absolve said business of any wrongdoing, and of any financial culpability, in the event that litigation results from the problem.

This is the Tesla legal department and customer relations in full-blown CYA mode.

Herein lies the danger of “autonomous” systems. The driver gets bored and attention drifts to non-driving distractions. So, for the attention-span-challenged, these cars can become more dangerous. Aside from adaptive cruise control, I really don’t want any of that junk on my next vehicle.

Driving safely still demands your full time attention!

Luckily so far Tesla autopilot has only killed 4 of its drivers. Big problem will occur when it kills other innocent people. It should be outlawed immediately.

Wrong…

Using your logic, all cars should be banned until they can guarantee no fatal accidents. Or, alternatively, by your logic the cars that should be banned are the ones that statistically cause fewer fatal accidents. Either way, your logic is flawed if the objective is to allow cars while lowering accident fatalities.

Their statistics are suspect.
“In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles ***equipped with Autopilot hardware***. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.”
They compare miles traveled in Teslas to ALL vehicles on the road. At the very least, the comparison should be narrowed to vehicles of the same vintage as Autopilot Teslas (2015+); older cars tend to be less safe in a collision and should be excluded. This statistic is further clouded by the fact that Teslas are among the safest cars when a collision does happen, with or without Autopilot.
I’d like to see Tesla release fatalities per mile on Teslas with and without Autopilot engaged. The existence of “Autopilot hardware” has no bearing on the safety of Autopilot; we need to see statistics with Autopilot actually engaged.
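To be clear, the arithmetic inside Tesla’s quoted claim does check out; the problem is what is being compared. A minimal sanity check, assuming only the two mileage figures quoted above (this is illustration, not an endorsement of the comparison):

```python
# Check only the ratio inside Tesla's quoted statistic; it says nothing about
# whether comparing "all vehicles" to "Teslas with Autopilot hardware" is fair.
us_miles_per_fatality = 86_000_000       # all vehicles, all manufacturers
tesla_miles_per_fatality = 320_000_000   # Teslas equipped with Autopilot hardware

ratio = tesla_miles_per_fatality / us_miles_per_fatality
print(f"{ratio:.1f}x")  # ~3.7x, matching the figure Tesla quotes
```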

@Gary said: “I’d like to see Tesla release fatalities per mile on Teslas with and without Autopilot engaged.”
———

“…The agency [NHTSA] found that Teslas without Autosteer crashed an average of 1.3 times per million miles driven, compared with 0.8 per million miles for vehicles equipped with Autosteer. That shows that just having Autosteer available on a car — not necessarily enabled at the time of crash — reduced crash rates by about 40 percent. That’s a truly staggering improvement…” source:

Report finds Tesla’s Autopilot makes driving much safer:
http://bgr.com/2017/01/19/tesla-autopilot-crash-safety-statistics-report-nhtsa/amp/
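For what it’s worth, the “about 40 percent” figure does follow from the two rates quoted in that report; here is a minimal check, assuming only the numbers in the quote above:

```python
# Crash rates per million miles, as quoted from the NHTSA report above.
rate_without_autosteer = 1.3   # crashes per million miles, no Autosteer
rate_with_autosteer = 0.8      # crashes per million miles, Autosteer available

reduction = 1 - rate_with_autosteer / rate_without_autosteer
print(f"{reduction:.0%}")  # ~38%, which the report rounds to "about 40 percent"
```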

Without knowing more about the data set it doesn’t tell us much of anything at all. How many Teslas have autosteer vs don’t have it? What are the profiles of the drivers of those cars? Are the ones with autosteer more affluent, and possibly better drivers in general? Maybe they aren’t in as much of a hurry and drive slower? Maybe the areas where people tend to buy the autosteer option just have safer roads in general? We don’t know.

This is the ultimate example of how correlation does not equal causation.

It would be interesting to see if that ratio has changed over time. In the early days of AP, customers were likely very vigilant, unsure of the new technology. Now these same people are probably more complacent, and likely less vigilant than they were when the technology was very new.

The apparent success of AP as revealed in the study is very dependent on drivers remaining alert and maintaining a healthy vigilance. With a level 2 autonomous system, the driver is a critical component of the success of the system.

So you’re saying Autopilot passed the same driver’s tests (plural) that drivers did to ensure they were safe behind the wheel? Oh, I’m sorry that you presumed to have a well-thought-out argument when in fact you did not.

“Luckily so far Tesla autopilot has only killed 4 of its drivers.”

No, just two, over about 2-1/2 years, and of course with many more lives than that saved. Claims with absolutely no evidence to back them up won’t be counted by reasonable people; only by serial Tesla bashers like you.

“Big problem will occur when it kills other innocent people. It should be outlawed immediately.”

Hey, let’s outlaw every safety-related system in a car that isn’t 100% perfect. Starting with air bags; exploding air bags are now associated with at least 20 deaths!
/sarcasm

“The thing to keep in mind is that self-driving cars don’t have to be perfect to change the world. They just have to be better than human beings.” — Deepak Ahuja, CFO of Tesla Inc.

Another Euro point of view

What a mess this is becoming, and I mean all of it, not just AP. It is not new, but so far the little girl saying “the emperor has no clothes” was inaudible; now her voice is starting to become deafening, so people are forced to take a good look.

Analogy FAIL.

We know that use of Tesla Autopilot + AutoSteer does provide a substantially lower accident rate.

No reasonable person would expect that accident rate to be zero; every reasonable person should understand that it’s impossible to have cars moving down the highway at ~70 MPH in perfect safety, no matter who or what is controlling them.

The question should be this: Are you safer with Tesla AutoSteer, or without it? One single fatal accident, or even two, doesn’t at all mean you’re safer with it shut off!

I don’t even care about ACC in my car. I have tested it out on a Tesla and a Gen 2 Volt. It’s cool, but I wouldn’t use it.

I have gone my entire life with no accidents. I trust me more than the computers at the moment. 😉

I know autonomy will get to a point where I can trust it! But it will be a fully functional self-driving system like the ones Cruise or Waymo are working on, not one of these transitional systems like Autopilot, Pro Pilot, Supercruise, etc. If it isn’t competent enough to handle large stationary objects, then I’d rather keep my eyes focused on the road and stay completely engaged.

We have a rare case here where it appears:

1) The civil lawsuit will go to trial

2) NHTSA will perform a parallel investigation

3) The estate of the driver has indicated the deceased went to Tesla service centers about the issue on multiple occasions (easy to verify), and this will receive a thorough investigation in light of recent events.

4) The deceased was relatively young, affluent, and technically skilled (a former Apple employee with a technical background), and had the forethought to report to Tesla the scenarios in which the product did not perform consistently well, despite Tesla’s assurances. It will be interesting to see how the civil lawsuit proceeds.

5) I suspect Tesla may attempt a private settlement with the estate under an NDA, but NHTSA will push back on any attempt by Tesla to seal the details of any failures in design or process, based on ongoing concerns about Tesla Autopilot.

Tesla will most probably settle out of court. There will be no financial disclosure as to the amount paid to settle the claims in the pending litigation. A $10 million+ settlement will start to get the undivided attention of the new Tesla CFO and the auxiliary bean counters. Tesla’s institutional shareholders are going to have to start asking some tough questions, and demanding some accountability on this pressing issue.

As I have stated before, there are flaws with this argument. If it were true that the driver had experienced 11 prior incidents where he had to take evasive action at that same barrier, then why would he not be alert this time? Makes no sense to me. OTOH, I do not believe he had previous problems or reported them to Tesla service, as has been claimed. He clearly was not paying attention and was confident that AP was working properly.

The road looks very much like an area where autopilot could have problems. See the news report linked above.

How about this explanation: he knew AP had problems. He normally kept his eyes on the road and could adjust for those problems. He just had the misfortune of looking away/getting distracted just before arriving at the problem area.

This blaming the victim thing really isn’t good.

I agree, and since he was an Apple engineer, and therefore should have understood better than the average person that computer software can’t be made 100% reliable, he should absolutely have made a point of being alert and ready to take over at the place where the accident occurred, if he really had had such a big problem that he reported it to his relatives and to his local Tesla service center. Either that, or there was something else going on that hasn’t been reported, such as him being asleep at the wheel.

There are still a lot of possibilities. One is that the driver had indeed reported a problem with the car abruptly changing lanes, or wandering out of a lane, for no good reason under AutoSteer, but had not identified that one spot on the road as the problem. If he had identified that spot, then he had been repeatedly warned about the danger there, and if that is the case, then he was certainly responsible for the accident.

Tesla bashers whining about “blaming the victim” is astoundingly hypocritical and tone-deaf here. Tesla has specifically engineered AutoSteer to periodically…

Silly question, why isn’t there a camera in or near the dash, aimed at the driver’s hands and face to show where the driver’s attention is focused?

Hello, Elon?

It would be a definite improvement, and it’s surprising that Tesla hasn’t included it. Actually “hands on the wheel” is pretty irrelevant. You can have your hands on the wheel and not be watching the road. In fact, it would be better to be watching the road with your hands in your lap than holding the wheel but looking away.

That being said, this isn’t a complete panacea for autonomous driving distraction because it’s still possible to be looking at the road but not actually paying attention.

“…it would be better to be watching the road with your hands in your lap than holding the wheel but looking away.”

And in fact, an official Tesla video from more than a year ago (Nov 2016; see link below), an AutoSteer demo, shows the driver with his hands in his lap the entire time.

It seems reasonably clear to me that Tesla’s current message is not “You need to keep your hands on the wheel when using AutoSteer”. Tesla does seem to be sending mixed messages, but overall I think the most reasonable interpretation of the various messages Tesla is sending is “You need to remain aware of the road, and be ready to grab the wheel at any moment, when using AutoSteer”.

https://insideevs.com/tesla-releases-self-driving-demonstration-with-recognition-feed-video/

Tesla will have to go back to the drawing board with AP, if not voluntarily then by government order. Personally, I never thought it was safe. The entire idea that the car will drive itself but you have to be ready at any moment to take control seems untenable. Better to simply provide driver aids and force the driver to stay in control.

The full-throated defense of AP as “safer” is not only unrealistic but also tone-deaf. At this point, given the circumstances and how closely this follows the Uber accident, it just won’t work with the public or regulators.

+1, best comment of the thread IMHO

This public statement by Tesla is deafer than a concrete barrier.

Ouch! That’s a very blunt “concrete barrier”!

Let’s hope it’s not falling on deaf ears over in Fremont, CA and Sparks, Nevada.

I doubt if anything will come of it.
You are driving the car, you are responsible.
A person drove off a cliff in a loaded SUV in CA recently, who was at fault? The driver.

It wasn’t an autonomously driven SUV, was it? If not, then clearly the driver was responsible.

If the driver of this Tesla car had been warned 10 or 11 times previously that AutoSteer malfunctioned at this point in the road, then how can you possibly blame Tesla for the accident?

I dunno about you, but if I had been driving this car and AutoSteer had malfunctioned twice at the same point on the road, then I would have made a point of having my hands on the wheel when driving thru that area!

What is obnoxious, tone deaf, and hypocritical, is claiming that even with 10 or 11 very clear warnings to the driver about a problem with AutoSteer at that spot, that somehow Tesla is still responsible for the accident!

Sounds like you need a fixed-object alarm. If it doesn’t brake for fixed objects, it must see them and just not react to them.

Even airplanes let you know by alarm before you hit the buildings.

These systems had too many false reads. Not good to slam on the brakes on the highway for an obstacle that isn’t really there.

That only points up the need to fix that problem before you lull people into believing it is infallible.

Yup. Just look at the literally hundreds of trees to the side of the road which Tesla Autopilot mistakenly detected as “in-path” objects in the side views in the second, real-time video in the article linked below.

If Autopilot sounded a warning every time it detected an object “in the path” of the car, then the warning would quite literally be going off constantly. That’s of no use, as the driver would quickly shut off the warning, or shut off the system giving the warning.

With too many false positives, a warning becomes useless.

https://insideevs.com/tesla-releases-self-driving-demonstration-with-recognition-feed-video/
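As a rough, purely illustrative calculation (the object count and false-positive rate below are made-up numbers, not measurements from any system), even a small per-object error rate would produce a constant stream of alarms:

```python
# Toy illustration of why false positives make an "in-path object" warning useless.
# All numbers here are hypothetical, chosen only to show the scale of the problem.
objects_passed_per_mile = 200    # hypothetical: trees, signs, parked cars, barriers
false_positive_rate = 0.01       # hypothetical: 1% of those flagged as "in path"
speed_mph = 65

false_alarms_per_hour = objects_passed_per_mile * false_positive_rate * speed_mph
print(f"~{false_alarms_per_hour:.0f} false alarms per hour")  # ~130/hour at these numbers
```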

Here’s an idea. After 3 warnings from the car for the driver to pay attention, the alarm system should activate. Flashing lights and beeping horn. The thought of such embarrassment should be an incentive to pay closer attention. This would also alert other drivers close by to keep a distance.

This could also be a life saver if a driver blacks out from a medical condition.

And the car just keeps driving down the highway ( horn now blaring away) – even if the driver has blacked out?

Yeah, I don’t know how much that would help anything. It would just distract other people and no one could get the car to stop anyway.

Well, I forgot to add that if they took a page out of the Nissan Pro Pilot book, the vehicle would gradually slow down and come to a stop. Very dangerous on a freeway to be sure, but no more dangerous than veering out of control, bouncing off the barrier and into surrounding cars.

With all due respect to the deceased, sometimes people just make very unfortunate lethal mistakes. As for the pedestrian struck by the autonomous Volvo, who in their right mind would attempt to cross the street at night with a vehicle coming down the road?

Paul K said:

“Well I forgot to add that if they took a page out of the Nissan Pro Pilot book, the vehicle gradually slows down and comes to a stop.”

I guess you are not aware that if you ignore repeated requests to take the wheel, Tesla AutoSteer pulls the car over to the shoulder and brings it to a stop.

That is certainly far safer to everyone on the road than continuing to blindly drive down the highway with the horn blaring!

https://jalopnik.com/watch-how-teslas-updated-autopilot-freaks-out-when-you-1787112548

If I take my hands off the wheel of any other car for 5 seconds at highest speed, what are the chances of a crash?

Probably pretty low unless you’re going around a turn. A car will generally maintain a straight heading at least for a few seconds even with your hands off the wheel.

But more to the point, people don’t take their hands off the wheel in a car that they are driving, because of the obvious consequences of doing so. The issue in these not-so-autonomous cars is that people are lulled into thinking they *can* take their hands off the wheel. Mr. Huang wouldn’t have taken his hands off the wheel and driven into that barrier in a normal car.

You’ve obviously never driven where I live. It’s not unusual to see people getting ready for work while driving: putting on makeup, tying ties, all sorts of things except paying attention to the road. It’s surprising there aren’t more accidents.

“Mr. Huang wouldn’t have taken his hands off the wheel and driven into that barrier in a normal car.”

Reality check: In a “normal” car, people crashing into fixed obstacles is such a regular occurrence that it’s not reported as “news”, whether they had their hands on the wheel or not. If crashing into the end of that concrete wall wasn’t a regular occurrence, then there wouldn’t have been a collapsible barrier installed there.

For example, the barrier in question had been crashed into and collapsed approximately two weeks prior to the Tesla accident in question. Can we find any news report of the prior crash? Of course not! Because it wasn’t an accident involving a Tesla car, and therefore wasn’t considered “news” by any news agency.

You would not try that, of course. But the implementation of AP, its name, Tesla selling “Full Self-Driving” packages, Musk talking about how good their system is and how it will soon be able to cross the USA from west to east, its style of marketing: all of that makes people trust a system that must not be trusted.

Words Have Repercussions

It is time to separate the “genius” from the daily operations of Tesla. Elon speaks more like a startup entrepreneur than a senior executive, whose communication with the public has major consequences. Elon blurs hopes and deliverables in the same sentence as if it has little consequence. He signals future products as if they were a given when he has zero funding or resources behind these announcements. Some think this is bold and refreshing, but there are consequences.

The legacy auto manufacturers are very fuzzy in their EV language and EV deliverables because their leadership understands that the markets react to what they announce and that valuations of the company are impacted accordingly. Tesla does this too, but for different reasons. Elon (who is no dummy) systematically makes statements because the valuation of the company desperately needs this constant PR hustle. If Tesla were judged solely on its current state, the company would be in greater trouble financially. Elon is selling the future with prototypes and fanfare, hoping investors will assume the risk and fund these efforts. It is unclear whether Tesla can consistently fund the existing state with sales revenue.
M3 owned, Spark leased; Niro TBD

Make good choices. Automation can only take one so far, regardless of what the packaging says.

I LOVE AP2 and what it does well. It doesn’t take away the watchful eye.

It’s STILL a teenage driver and should be treated accordingly.

Why did the car steer to the right towards the concrete barrier?

Why didn’t the car just stay in its lane?

If it had to change lanes (go to the right), then it should have done that much sooner. Changing lanes at that point was simply too late, because the car had already come too close to the concrete barrier.

If the car had not steered to the right, it would have just followed its lane, and there would not have been an accident with this Tesla Model X.

The driver did not steer to the right.

This has now become a very difficult situation for Tesla.

The people who want to see Tesla go down, they are going to sharpen their pencils.

I have been reading about Tesla for many years, and that felt really good.

But right now, I feel very sad.

The lane surface there changes from asphalt to concrete in a way that looks like it could confuse AP. See the news report linked above in the comments. When they drive down that stretch of highway, you can see how a computer might be confused by the transition between asphalt and concrete in that area, as well as by the diverging white lines.

Autopilot must not be allowed to get confused.

As long as there are situations in which Autopilot can get confused, it cannot be trusted.

Why did the driver not take back control of the steering wheel during the last 6 seconds?

“Why did the driver not take back control of the steering wheel during the last 6 seconds?”

This is what the Tesla bashers, who are in full cry in this discussion thread, want to distract you from asking.

We can believe that the car might have unexpectedly swerved into the barrier after being confused into thinking the moving dividing line between dark asphalt and light concrete in the same lane was a lane marking.

Or we could believe that the car had previously tried to steer into that same barrier no fewer than 10 or 11 times in the days or weeks leading up to the accident, as Tesla bashers are also claiming.

But only a crazy Tesla Hater cultist could believe that after the same thing happened 10 or 11 times at that very spot on the road, the car doing that would be “unexpected” for the driver!

What this boils down to is another case of…. BLAME AUTOPILOT!

(Did you get pregnant? Just blame Autopilot!)

http://blameautopilot.com/

On my i3, I have experimented by coming up on stopped traffic and then turning on the ACC. The car will most often detect the stopped car and slow my vehicle to a stop. On the other hand, I have hugged the curb coming up on a parked car, and I am quite certain my car would not have stopped. Either way, there is just no way any sane person would be totally reliant on these autonomous systems at the present time.

Accidents are unfortunate and sad, but the biggest advantage with Autopilot is that Tesla can learn so much from an accident; there are so many data points that can be analyzed and learned from. With a non-autonomous car accident, it’s hard to learn anything. Here is what Tesla should do right now with Autopilot: currently, when you are driving a Tesla, Autopilot is simply either available or not. How about showing Autopilot availability with a confidence level? Say, when Autopilot’s confidence is 99.9%, show an amber icon on the dashboard; when its confidence is essentially 100%, show a green icon. That way, instead of a five-second warning before Autopilot loses confidence, the dashboard icon would already be amber in known trouble areas, based on fleet data.
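A minimal sketch of the idea being suggested here, with hypothetical thresholds and a hypothetical function name (nothing like this exists in Tesla’s software, as far as anyone outside Tesla knows):

```python
# Hypothetical sketch of a confidence-based dashboard icon, as proposed above.
# The thresholds and the "fleet trouble spot" flag are illustrative assumptions.
def autopilot_icon(confidence: float, fleet_trouble_spot: bool) -> str:
    """Map Autopilot's current confidence to a dashboard icon color."""
    if confidence >= 0.9999 and not fleet_trouble_spot:
        return "green"   # very high confidence, no known problems on this stretch
    return "amber"       # available, but lower confidence or a known trouble area

# Example: approaching a spot where fleet data shows prior disengagements.
print(autopilot_icon(confidence=0.9999, fleet_trouble_spot=True))  # "amber"
```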

Sorry to say this, Tesla: Autopilot is a disaster; it kills people.

No, crashing edge-on into a narrow concrete wall at highway speed, after the safety barrier there had been collapsed and not repaired, kills people.

Tesla Autopilot and AutoSteer are intended to reduce the frequency at which things like that occur. But unfortunately, since humans are not rational animals, far too many people react to tragedies like this emotionally, rather than reasonably. Far too many people can’t seem to accept the reality that such systems, while increasing safety, will never provide 100% safety.

In other words, let’s BLAME AUTOPILOT!

http://blameautopilot.com/

Sorry to say this, but new usernames suddenly appearing, like this Mor(on)ic, are indicative of existing trolls rolling out new usernames to fake more feigned anger at this tragedy in order to further their own agenda of derailing Tesla. Tesla and EV forums like InsideEVs are getting heavily trolled, and some of it is certainly coordinated. Don’t forget that the Koch Roaches, with their billions$, have already announced their war on EVs, and furthermore Tesla is an imminent threat to multiple other powerful industries and individuals.

Obviously we don’t know, and probably never will know, exactly what caused this accident, but it is a tragedy for the deceased and his family and a setback for Tesla. We need not be swayed by the trolls whose feigned sympathy and outrage only serve to mask their true motives, which are profit, or ideology, or defense against the disruptions that Tesla is bringing to long-established industries. In the route to better technology there are always setbacks and sometimes tragedies. As disappointing as it is, it shouldn’t prevent the advancement of tech that has already proven itself to be statistically safer than what used to be the major…

Tesla will be fine as long as they can ramp up M3 production.

I don’t like the needlessly overcomplex computer controls on the M3, such as for the glove box, hood, and trunk. However, I would still love to have one.

The faster Tesla can grow, the quicker they force the rest of the automakers to switch from ICE to electric. Growth requires a great deal of investment, especially when starting an industry from scratch.

I think the worst-case scenario is Tesla getting bought by Apple or Google, which might speed up mass EV adoption. It would be nice to pair Waymo with Tesla.

Can you elaborate more on the hood?

Get FUD must be suffering from a tin foil overdose.
Open the windows in your parents’ basement, man!

Mental MadBro doesn’t like the fact that the single version of the Tesla Model 3 now being sold already outproduces and outsells the Chevy Bolt, even though it is very early in its ramp-up.

Next, the new Hyundai Kona/Kia Niro, with much nicer interiors, will start eating into the Bolt’s sales from below, since they will be less expensive and will have plenty of tax credits long after GM’s run out.

If your dealership lays you off, mental MadBro, you will have to move back into your mommy’s basement.

How many things went humanly wrong?

Exceeded posted speed limit (user set?)
Disregarded instructions to keep hands on wheel
Obviously not paying attention to the road
Knew there were issues with AP at that segment of freeway
Continued to use AP despite knowledge of problems there
Failed to react to visual cues
Failed to react to audible cues
Maybe failed to understand that AP is not Autonomous

Minus one or two items, this almost sounds like a medical event occurred, which has happened many times in other cars.

Thank you! This is one of the few sensible comments in this discussion, in which far too many Tesla Hater trolls are using a human tragedy for their purposes of tearing down the public image of Tesla Inc., a company trying hard to make the world a better place!

Indeed, the #1 question we should be asking is: If the driver had been strongly warned by 10 or 11 previous near-accidents at the same spot on the same highway, then why the heck didn’t he at least have his hands on the wheel when he drove by this place?

If all that is true — a rather big “if” since the only basis for that claim is the word of his brother, the potential beneficiary of a very large sum of money in a lawsuit — if all that is true, then it does suggest the driver may have been asleep at the wheel, or physically or mentally incapacitated for some reason.

I am copying and posting a comment (no. 1098) from the TeslaMotorsClub website because I believe it is the BEST ANALYSIS of the misleading nature of Tesla’s statement. Here it is, with credit to ‘kernel’ at TMC. Also found here (posting 1098): https://teslamotorsclub.com/tmc/threads/model-x-crash-on-us-101-mountain-view-ca.111505/page-55

——————

Tesla’s statement really, really bothers me. Let’s examine its pieces:

> In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged

Ok, so far so good.

> with the adaptive cruise control follow-distance set to minimum.

Irrelevant. The X didn’t rear-end a car. It certainly wasn’t following the concrete barrier. Based on a later sentence in this paragraph, the driver had an “unobstructed” view, so he wasn’t following too closely.

> The driver had received several visual and one audible hands-on warning earlier in the drive

Irrelevant and intentionally misleading. I drive on AP2 often, and it is very difficult to complete a drive without some sort of hands-on warning going off, even when my hand is constantly on the wheel. Also, “earlier in the drive” is irrelevant to the accident and designed to mislead, as it was almost certainly long before the accident.

> and the driver’s…

Unfortunately, AP skeptics are probably on the exact same page, whereas Tesla fanatics won’t be swayed regardless.

“…the fact that the driver’s hands were on the wheel six seconds prior to the collision strongly suggests that the warnings were well before this point”

WOW! Logic isn’t your strong suit, is it?

No evidence has been provided that the driver’s hands were on the wheel 6 seconds prior to the collision. Tesla says they have data to show that his hands were not on the wheel for the 6 seconds preceding the accident. Perhaps that’s all the data the “black box” in the car saves for that particular variable. At least, that’s a reasonable conclusion, contrary to all the very bad reasoning and biased viewpoints in the comment you copied.

But hey, “congratulations” on finding a Tesla-bashing comment from TMC to cherry-pick and re-post. I guess you’ll get points for that in your Tesla Hater cult ranking? 🙄

BLAME AUTOPILOT!

http://blameautopilot.com/

“I guess you’ll get points for that in your Tesla Hater cult ranking?”

Yes! Exactly! As a matter of fact, I earned enough points this week to earn a free blender.

We’ll probably never really know what happened.
How many times do you see someone suddenly swerve from one lane to another because they missed their turn-off? Even though Tesla says there were no hands on the wheel, could it be that the last seconds were not logged due to the accident, and this driver realised AP had put him on the wrong path, so he swerved to correct?
AP doesn’t monitor the driver (a real oversight in the design); maybe the driver had a medical emergency and so couldn’t respond to AP alerts.
It does seem strange that the owner supposedly had concerns about this stretch of road and obviously wasn’t paying attention. It certainly looks like a bit of road you would want to be monitoring, IMO.

If the data from the last 6 seconds prior to the accident were not logged, due to physical damage to the car, then Tesla would not have stated as fact that the driver’s hands were not on the steering wheel; that is, that no pressure or torque was being applied to the steering wheel by the driver. That indicates the data for those 6 seconds was available, not that it wasn’t.

And I don’t think we actually know that much about exactly what happened, but I get the impression that it wasn’t that the car suddenly swerved across multiple lanes to an exit; it was that the car wandered within a widening lane, or between two lanes which had just split, into the barrier between those two lanes. The moving line between dark-colored asphalt and light-colored concrete, in the lane leading up to the split, which may have confused Autopilot, does not appear to be any reason for Autopilot to have sent the car careening across multiple lanes.

The driver would have turned the wheel to the left during the last few seconds in order to avoid a collision with the concrete barrier, IF HE WAS PAYING ATTENTION AND IF HE WAS ABLE TO DO SO.

He probably had too much faith in Autopilot.