UPDATES: Tesla Model X Crashes While Reportedly In Autopilot Mode

JUL 6 2016 BY ERIC LOVEDAY


With more and more miles being racked up in Autopilot mode in both the Tesla Model S and Model X, it’s inevitable that more incidents will be reported. The question now is: was Autopilot really to blame, was it human error, or are humans simply blaming the technology after the fact?

It seems Tesla may be delving into driving logs more and more as we move forward.

A fatal crash that occurred back in May that just started making headlines last week is the highest profile Autopilot-related crash, but it’s certainly not the only one.

Albert Scaglione was in his 2016 Tesla Model X, reportedly in Autopilot mode, when it crashed and rolled over on July 1st on the Pennsylvania Turnpike.

UPDATE: Tesla tells us the following:

“We have no data to suggest that Autopilot was engaged at the time of the incident. Anytime there is a significant accident, Tesla receives a crash detection alert. As is our practice with all collisions, we immediately reached out to the customer to make sure he was safe. Until the customer responds, we are unable to further investigate.”

UPDATE 2: Tesla has restated the information it received on the crash:

“We received an automated alert from this vehicle on July 1 indicating airbag deployment, but logs containing detailed information on the state of the vehicle controls at the time of the collision were never received. This is consistent with damage of the severity reported in the press, which can cause the antenna to fail. As we do with all crash events, we immediately reached out to the customer to confirm they were ok and offer support but were unable to reach him. We have since attempted to contact the customer three times by phone without success. Based on the information we have now, we have no reason to believe that Autopilot had anything to do with this accident.” – via Electrek

As the Detroit Free Press reports:

“A Southfield art gallery owner told police his 2016 Tesla Model X was in Autopilot mode when it crashed and rolled over on the Pennsylvania Turnpike last week.”

“Albert Scaglione and his artist son-in-law, Tim Yanke, both survived Friday’s crash near the Bedford exit, about 107 miles east of Pittsburgh.”

The Detroit Free Press was able to get in touch with officer-on-the-scene Dale Vukovich of the Pennsylvania State Police. The officer described the incident as follows:

“Vukovich stated that Scaglione’s car was traveling east near mile marker 160, about 5 p.m. when it hit a guard rail “off the right side of the roadway. It then crossed over the eastbound lanes and hit the concrete median.”

“After that, the Tesla Model X rolled onto its roof and came to rest in the middle eastbound lane. A 2013 Infiniti G37 driven in the westbound lane by Thomas Hess of West Chester, Pa., was struck by debris from the Scaglione car, but neither he nor his passenger was hurt.”

Scaglione was injured, but survived the crash.

Police say the Model X driver will likely be cited for a violation, though it’s not yet known for what. Autopilot must be deliberately activated by the driver, and when the feature is engaged, drivers are reminded to keep their hands on the wheel.

The Free Press also notes the road conditions in that area:

“Anyone who has driven on the Pennsylvania Turnpike knows that its narrow shoulders and concrete medians leave little margin for driver error. There’s not enough evidence to indicate that Tesla’s Autopilot malfunctioned.”

We will update as new information becomes available.

Source: Freep, hat tip to Jamie H!

Categories: Crashed EVs, Tesla



108 Comments on "UPDATES: Tesla Model X Crashes While Reportedly In Autopilot Mode"


Hit a non-moving guardrail while autopilot was engaged? Seems clear cut.

Don’t blame Tesla’s Autopilot! It’s not AutoPilot’s fault! Not only was the guardrail stationary, it was painted white!!! And there was a bright sky!

Bright sky in Pennsylvania?

“Don’t blame Tesla’s Autopilot! It’s not AutoPilot’s fault!” – sven

Of course it isn’t; how would that even be possible in this case?

“We have no data to suggest that Autopilot was engaged at the time of the incident. Anytime there is a significant accident, Tesla receives a crash detection alert.”

This week, yes.

agree – cite the guardrail for failing to yield to autopilot

For each accident involving excessive speed, don’t blame the drivers; blame the carmakers for building such fast cars! Or the authorities for letting them build them.
With systems like Autopilot, not only Tesla but other carmakers as well will be confronted with bad drivers who now have an excuse to deny their own responsibility and poor driving behaviour…

I WONDER WHY THE DRIVER DID NOT TAKE MANUAL CONTROL OF THE CAR WHEN HE SAW IT HEADING TOWARDS THE GUARD RAIL…???? OR EVEN AFTER THE CAR CAME IN CONTACT WITH THE GUARD RAIL, THERE WAS STILL TIME TO MINIMIZE THE DAMAGE AND THE CONSEQUENCES…DUHHHHH

“reportedly in Autopilot mode”

Tesla has now reviewed the logs and says that there’s “no data to suggest that Autopilot was engaged at the time of the incident.”

How much you wanna bet the owner’s son was driving?

Just like the other Model X that accelerated into a building, it is so easy to blame AutoPilot.

Wreck your father’s/husband’s brand new $150K car? Of course it’s not your fault.

Heck, even my wife insists it’s not her fault when there is damage on her car.

Tesla did NOT review the logs.

Tesla updated its statement to now say that “logs containing detailed information on the state of the vehicle controls at the time of the collision were never received.”

That puts Tesla’s original statement into a new light:
“We have no data to suggest that Autopilot was engaged at the time of the incident.”

Tesla has “no data to suggest” because it NEVER received the logs from the Model X.

It is ill-advised for Tesla and its public relations team to be disingenuous and twist the truth in its public statements in response to this crash. Face palm.

http://electrek.co/2016/07/06/tesla-model-x-rollover-accident-tesla-says-autopilot-not-engaged/

http://electrek.co/2016/07/06/nhtsa-probing-tesla-model-x-rollover-accident-pa-autopilot-involve-tesla-updates-statement/

I see sven is continuing his crusade to make it safe for Tesla bashers to post FUD on InsideEVs, and use these comment threads to promote his stock shorting position!

sven is the anti-Don Quixote. The “Man of La Mancha” undertook a futile and delusional quest, just like sven here. But Don Quixote’s motive for his quest was noble, chivalrous, pure, and true. sven? Not so much…

sven wrote:

“It is ill-advised for Tesla and its public relations team to be disingenuous and twist the truth…”

But apparently you think it’s okay for you to do exactly what you’re falsely accusing Tesla of doing, hmmm?

If hypocrisy was an Olympic sport…

Bite me.

Delete your account

Autopilot engaged! Now cover the wheel and brakes just in case!!!

It’s dangerous technology that needs years more testing and better sensors.

Ya, so are guns. Let’s get rid of auto pilot because it is dangerous and keep guns because they are safe.

Shots fired!

“A well regulated Autonomous Driving Vehicle being necessary to the security of a free state, the right of the people to keep and bear Teslas Autopilot shall not be infringed.”

sven channeling Charlton Heston

I’ll give you my Autopilot when you pry it from my cold, dead hands. . . D’oh! AutoPilot is handsfree. Nevermind.

Tesla autopilot is not handsfree. Cold-dead should be the way it works (for now). Unfortunately, too many are so quick to let go.

InsideEVs should update the story. Other news websites are reporting that the Model X driver in the Pennsylvania crash was watching the Disney movie “Frozen” on the Tesla’s center screen while using AutoPilot. Apparently, when the song “Let It Go” came on, the Model X owner took it to heart. Those are some very moving and prophetic lyrics! 😉

“It’s funny how some distance
Makes everything seem small
And the fears that once controlled me
Can’t get to me at all”

“It’s time to see what I can do
To test the limits and break through
No right, no wrong, no rules for me,
I’m free!”

“Let it go, let it go
I am one with the wind and sky
Let it go, let it go”

http://www.metrolyrics.com/let-it-go-lyrics-idina-menzel.html

https://www.youtube.com/watch?v=moSFlvxnbgk

It’s the automaker’s fault, sure (sarcasm)… so authorities will demand limits on Autopilot (or more probably a rename to something like AutoAssist) because drivers are not seriously paying attention or, more likely, are abusing the system. One day, someone will remember that automakers make cars that can go more than twice the speed limit, so if there are accidents it’s the fault of the automakers and not of the driver who decides to go 150 mph with his weapon named “car”.
In this particular case, I wonder if the policeman’s statement that the driver would probably be charged isn’t simply about excessive speed…

I never implied getting rid of it. It should be used in its current state with the driver covering the wheel and brake. The technology is not ready otherwise.

I think it is important to have for the data being gathered.

Of course get rid of guns, you nutter!

I love the gun talk in these comments. It seems most commenters agree that Tesla and Autopilot are not to blame for these several accidents, human error or misuse is. After all, we can’t blame the car when the driver does something stupid. Then you turn around and say “of course ban guns!” So when someone does something stupid with a gun, you will blame the inanimate object instead of the “nutter” who misused it?

The logic seems very inconsistent. This message brought to you by your friendly, neighborhood EV driving (and gun owning) conservative. 🙂

Humans in cars are the source of more than 90% of fatal accidents so “It’s dangerous technology that needs years more testing and better sensors”.

IT IS HARD TO BELIEVE THAT THE CAR WAS IN AUTOPILOT…IT DOESN’T ADD UP…PEOPLE MAKE ERRORS…THEY DON’T REALIZE THAT THIS IS ALL RECORDED, SO THEY TRY AND SHIFT THE BLAME…THIS GUY IS LYING…

Since when did Tesla start paying Chinese people to post weighted pro-Tesla comments?

How do you know what happened? How do we know the hardware didn’t have a malfunction and the driver is right? How do you know Tesla isn’t twisting the truth a little to “stay clean”? Maybe Autopilot was not activated during the crash, but it was 10 seconds before, and the driver did not notice Autopilot going offline.

There are a bunch of possibilities, and even when you are the manufacturer and “see everything,” you cannot 100% trust the data that you get.

Glad you noticed that many posting here are rushing to judgement, posting either Tesla bashing or Tesla apologist posts, without anyone having real evidence on what actually occurred.

This is precisely the sort of behavior we’ve come to expect from Tesla bashers. Just as with the smear campaign about Model S suspension systems, the FUDsters rush to spread the FUD as fast as possible, before the actual truth could come out.

And the perpetual Tesla bashers will continue to spread the FUD even when it is absolutely and completely debunked. For example, note that sven continues to repeat the FUD about Tesla’s suspension, even though he knows perfectly well there’s not a speck of truth behind that FUD.

Now, some of the Tesla fans are equally rushing to judgement in this comment thread. But at least they honestly believe what they’re posting… unlike the Tesla bashers.

It is so easy to blame Autopilot. How about basic cruise control, which has existed in cars for well over ten years now?

That technology won’t even alarm you if something comes dangerously close. It just keeps the speed if you don’t do anything about it.

It is still the driver’s responsibility to watch the road as well as the behaviour of the car, even with Autopilot activated.

electric-car-insider.com

It’s unlikely we’ll ever know exactly what was happening inside the car before the accident. But it’s another warning to Tesla AP drivers everywhere to maintain situational awareness and be prepared to resume manual driving at a moment’s notice.

That level of engagement might be difficult to maintain under autopilot.

It’s well-documented that people do not maintain that level of engagement when the car drives itself. It’s a really bad idea to have the car drive itself unless it can handle any potential situation better than a human. Tesla’s autopilot cannot, by a long shot, and really shouldn’t be used at all.

True. Why is it useful to have autopilot sort of drive the car if you still have to completely pay attention for any split-second mistake? You’re forced to still drive the car. How is this remotely helpful?

Either autopilot is in complete control and needs to be able to handle emergency situations and notify the driver, or it only handles emergency situations and the driver is always responsible for driving.

The current combination encourages someone to let autopilot have more control than it should have. If something goes wrong and autopilot gives up, there’s no time for the driver to start paying attention and recover.

The only scenario where autopilot seems useful is in slow, stop and go traffic where reaction time can be a little longer. High speed, clear highway is a recipe for disaster if or when anything out of the ordinary happens.

Not only do you have to be as alert to dangers with Autopilot engaged, you have the added reaction time of deciding if Autopilot is going to react (and react correctly).

“There’s not enough evidence to indicate that Tesla’s Autopilot malfunctioned.”

Oh no! Panic and sell your TSLA stock now, at fire sale prices! 🙄
[/snark]

Nothing to see here, but of course that doesn’t stop TSLA short-sellers from using this nothing as yet another excuse to bash Tesla.

And yes, I am looking at you, AlphaEdge.

Pushmi-Pullyu,

You should make a YouTube video imploring everyone to Leave Elon Alone in the same vein as Chris Crocker’s infamous Leave Britney Alone YouTube video. 😀

Censored clean version, safe for work:
https://www.youtube.com/watch?v=pHL9LrPXu64

Hey, if calling it like it is in terms of the good Elon and Tesla have done is offensive to you, then too bad.

Leave Sven Alone!

Sigh. Are there mods here?

Why do we constantly have to put up with this guy’s absurd accusations? It’s constant slander based on zero evidence.

I have never shorted, or said anything about Tesla in regards to some kind of financial gain. I wish Tesla all the success in the world.

I never even said anything negative in the above post. Warning people to take care with Autopilot on is what Tesla itself recommends. So I guess Tesla is shorting its own stock?

Pushmi-Pullyu, you’re a mental insecure fanboy.

Funny how every time you post that you’re really a fan of Tesla, you follow it immediately with some Tesla bashing.

Your protests of innocence have worn rather thin, AlphaEdge. In the story of “The Little Boy Who Cried Wolf”, the villagers no longer believed the little boy after he lied three times. You’ve had more chances than that to leaven your Tesla bashing by at least occasionally posting something an actual fan would say. Can you link to any place you’ve actually posted something positive about Tesla? I doubt it.

“Pushmi-Pullyu, you’re a mental insecure fanboy.”

Gosh yes, that must be why I object to you using a pro-EVs website like InsideEVs as the megaphone for your serial Tesla bashing. 🙄

Show me where I have bashed Tesla????? I have asked you this before and you never responded. I resent these lies and slander about me.

You’re an idiot.

Don’t say anything against Pushy here, or he will label you a Tesla basher and shorter.

sorry to say, but this is getting boring.

LEAVE AUTOPILOT ALONE!!! 🙁

I’m a bit disappointed to see how easily the Model X can be rolled over, despite Tesla’s assurances. I look at the pano roof and FWDs in a whole new light now, and if the car should roll over into the water, I’m not sure how the second- and third-row passengers could get out in time, as the FWD override is difficult to find, and the third row is blocked by electrically actuated seats with no manual override.

Turning hard left into a concrete barrier would make any car turn over and play dead.

Shhh. Don’t discourage Four Electrics from his delusion that the Model X is a rolling Death Trap whose only purpose is complete and utter destruction of its occupants.

Then we won’t have to read his FUD when he leaves…

And yet, Four Electrics claims to have purchased a Tesla car… and changed his screen name from “Three Electrics” to “Four Electrics” as part of that claim.

Odd that he claims to have bought one after being a perpetual Tesla basher for so long. If we didn’t know better, we might almost think he doesn’t believe his own Tesla bashing FUD… 🙄

You beat me to it Bob. Plus, I am seriously impressed that the car hit a guardrail at highway speed, then impacted a concrete median, and still the occupants were able to walk away alive. Must be a pretty well built vehicle.

I see Four Electrics has had another attack of TES*.

Take the cure, Four Electrics! You too can experience a sudden end to your compulsion to post anti-Tesla FUD. All you need to do is sell off your Tesla stock shorting position!

“Tesla envy happens when other people have, ahem, long positions and yours is too short.” –Jim Whitehead

*Tesla Envy Syndrome

I think now I get your name. “If you push me, I will pull you.”

So what proof do you have that this guy is shorting?

Where are the mods? This guy can just slander anyone?

Sounds like Tesla needs to invent auto-straightening roads. Auto-dimming skies.

Seriously, though, people are too quickly getting used to ignoring the road with Autopilot engaged, and the sad part is this will affect other automakers who are doing more testing before putting the technology out in the real world on their cars. Kneejerk lawmakers won’t think twice about killing self-driving cars if bad news keeps rolling in.

How is it sad that other automakers will now wait until their own autonomous systems are more complete and safer?

Do you actually want them all beta testing their driving software on the open roads?

Sweet Jesus, yes.

It would still cut down on the statistical probability of drivers and passengers dying needlessly. The fact that they won’t implement anything less than Level 4 autonomy shows their fear of litigation is greater than their concern for their customers’ lives and welfare. That’s the bottom line.

Musk has already shared data that shows Autopilot, even in beta, has halved the likelihood of automotive accidents since release. Everyone else is letting their customers die out of litigious fear:

http://electrek.co/2016/04/24/tesla-autopilot-probability-accident/

The data collection of each Tesla Vehicle improves the entire AutoPilot system. OTA updates, like the pending 8.0 firmware update, improve each vehicle’s capabilities and features without inconveniencing the customer.

What’s to bitch about? Rome was not built in a day. Nor was Autopilot. Let it evolve from real world data / use, and serve a higher purpose than merely sheltering profits from fearful automakers.

Don’t like AutoPilot? Buy someone else’s vehicles and STFU.

How much of AutoPilot’s touted ability to learn and improve through data collection is actually hype as opposed to reality? For instance, Tesla cars must have previously traveled on that Florida highway where the fatal Autopilot crash occurred, yet Tesla didn’t learn from the collected data that this particular highway has NO overpasses, and specifically that there is NO overpass at the intersection where the fatal crash occurred. If AutoPilot actually learned from the data collected, it would have realized that the radar was NOT giving a false positive for an overpass, since other Tesla cars had recorded that NO overpass exists at this intersection, and that there was an actual, real obstruction directly in the path of Mr. Brown’s Model S.

sven said:

“How much of AutoPilot’s touted ability to learn and improve through data collection is actually hype as opposed to reality?”

I don’t know, but I’m sure you’ll be happy to copy and spread any anti-Tesla smear campaign you see on that subject, too. Just like you keep trying to spread the completely untrue smear campaign about the Model S suspension system.

Untrue? That’s premature. NHTSA is still examining the potential suspension issue on the Tesla Model S.

NHTSA is in the “screening” phase of its standard protocol of examining safety related concerns. Despite what Musk tweeted, Tesla is still not out of the woods yet. NHTSA may escalate things to a full recall, or it may choose another action, or simply drop it after it has finished going through info supplied by Tesla, consumers, and other sources.

http://www.hybridcars.com/contrary-to-musks-suggestion-nhtsa-did-not-call-tesla-suspension-complaints-fraudulent/

Jacked Beanstalk said:

“Do you actually want them all beta testing their driving software on the open roads?”

If that software saves more lives than it costs? Yes, absolutely. Every time. All day and all night long.

Why wouldn’t you want the same?

I believe this is actually part of Tesla’s beta testing. One part is how the system of hardware and software functions on the road. The second part is how the driver interacts with the system. The third part is to get a response from lawmakers in reaction to one and two.

Idiots!

I guess they would just turn on cruise control and read a book, or maybe watch a movie. Well…..yeah they would….and did!

AutoPilot is just a more advanced cruise control, that REQUIRES the driver to be alert and drive the car.

Your post needs more caps lock.

Caps Lock is like Cow Bell; you always need more!

CAPS we need CAPS, CAPS and HATS. CAPS, CATS and HATS. lol

Capitalization lock is a substitute for bold and italic, which aren’t available here. He didn’t overuse it.

I feel like now every Tesla driver who crashes because they weren’t paying attention will immediately blame the Autopilot feature. I hope Tesla’s logs are extremely detailed.

I’m not sure how they would know if the driver’s hands were on the wheel.

Sensors on the steering wheel maybe?

The car senses torque on the steering wheel from the driver’s hands. No torque, no hands.

Tesla has put itself in a difficult position by telling drivers to keep their hands on the wheel, but not reinforcing that with an alarm or deactivation for several minutes. Obviously they know drivers aren’t keeping their hands on the wheel, and that’s a major attraction of Autopilot to customers, which helps sell more cars.

Anyone know what speed this incident happened at? Maybe there was a pothole? I drove a short stretch of this road in May, eastbound to Hwy 99, and did not have any issues with it! Maybe the driver’s hands WERE on the wheel, and he sneezed and twitched the wheel to the right, hit the guard rail, reacted by yanking too hard left, crossed over the lanes to hit the concrete divider on the left, then yanked the wheel hard to the right, causing an inertia roll? Having rolled a car myself once, I can say it can happen in under a couple of seconds. So without multiple cameras recording the driver’s feet on the pedals, hands on the wheel, the road ahead, both sides of the car, and the rear and traffic behind, we will be using cruder means to analyze the cause of these types of claims! Remember, if the Tesla is in Autopilot mode and you apply any large control force (brake, steer, accelerate, etc.), then Autopilot disengages, similar to what happens when you apply an independent flight control input in an aircraft: the autopilot disengages! Once again, I say more owner training is required to use these…

Heck, how many read the owner’s manual(s)?

How many people have actually taken a driver’s test in the last 20 years? I take a written test every 5 years and they assume I’m safe behind the wheel. It’s scary.

Fotomoto and John – exactly!

This idea that once I press this button, skip the legal yada yada, and hit “Go,” I can just become a passenger is tantamount to a case of potential “murder/suicide by ignorance and arrogance!”

These cars are more complex than a good many light aircraft, and a private pilot’s license requires 65 hours of flight training, at least as much time in classroom study and instruction, a few practice exams, a final exam, a pre-flight test to make sure you are ready for the flight test, and the final flight test with an independent examiner!!

Maybe, for the price of these cars, a classroom course could be instituted for buyers and pending owners, to get them up to speed on Tesla’s systems, the driving regs, and emergency handling of their vehicles!

Thanks for being the first to comment rationally on the article instead of all the Tesla boo and Tesla yay comments out there…

Yes, I think an autonomous system that does not take responsibility requires high-level training. It is very hard to be relaxed while at the same time staying continuously alert to whether everything is correct.

This is the Toyota unintended acceleration freak out all over again.

Except this time with logs. 🙂

I’m beginning to understand the reasoning of some manufacturers saying they want to skip Level 3 autonomy and wait till Level 4 is fully baked before widespread release on the, ahem, dimwitted public.

If anyone is interested in what that gentle soul Ed Niedermeyer thinks about AutoPilot following the Florida and Pennsylvania AutoPilot crashes, he had this to say on Bloomberg West:

“By branding it as autopilot and sort of surfing this wave of hype about autonomous vehicle technology that Google unleashed, Tesla made it seem like this was real autonomous technology. And I think that, what we’re finding out is that drivers believed it was, when in fact it was really something quite a bit less than full autonomous driving technology.”

Gentle soul Ed went on to talk about Elon’s eroding credibility and a shift in the online discussion around Tesla:

“I’ve certainly noticed a shift in the online discussion around Tesla… and what you’ve seen in the last month or two is that it’s really accelerated. Tesla’s key asset – which is not really its technology – is Elon Musk’s credibility – and you’ve had a decade really of promises by Musk and Tesla, many of which have not been fulfilled, and then you have this disconnect between perceptions of what autopilot is capable of and the reality of it; what Tesla’s position on safety is, and then the non-disclosure agreements;…”

If late production is a broken promise, then who cares?

What promises, other than delayed production schedules, did Musk make that did not happen?

Seriously, you’re actually posting a long quote from long-time professional Tesla FUDster Niedermeyer? The guy who recently spread an outright, deliberate smear campaign about Tesla’s suspension system?

I must say, sven, it’s amazing — nay, astonishing — how in the space of just a year or so, you’ve gone from being one of the most constructive and informative contributors to InsideEVs’ comment threads, to being the #1 anti-Tesla FUDster regularly posting to InsideEVs, with rarely anything positive to say on any subject.

Shame. Shame. Shame.

And no shame on you for your lies and slander against people?

Sven — you’ve hit a new low.

If you are going to be crouched over, sweaty palms tightly wrapped around the steering wheel, eyes focused straight ahead on the road, and one foot hovering over the brake pedal, then you might as well just drive yourself and forget about Autopilot. The car companies are never willingly going to accept liability. In other words, there is never going to be an accident caused by faulty Autopilot, because it says right in the car’s manual that you were supposed to be driving. Under Autopilot the driver must remain hyper-alert because, no matter what, it will always be the driver, not the Autopilot or the company that made it, that will be blamed for the accident. In that case you are better off driving yourself and not being under the additional stress and fear of never knowing when the Autopilot is going to crap out and drive you head-on into oncoming traffic, a parked car, or a wall. Keep your hands on the steering wheel. Keep your foot right next to the brake. Keep your eyes on the road. That kind of advice sounds a whole lot like regular driving to me. If…

People have been using cruise control for years to save foot effort. And people have recognized when cruise control is appropriate and when it isn’t.

With ACC, LKA and other driver assistance systems, people simply need to treat it the same way.

Yes, but cruise control is easy to understand. I press “hold speed” and it does so, nothing more, nothing less.

But Tesla’s Autopilot you cannot understand. Like, how white must the car in front of me be for the system to still see it, and where does the sun need to be? What colors are not perceived during summer: brown leaves, white snow on the road…? There are far too many possibilities to know unless you are the developer of the system or a highly trained test pilot for the system.

Apparently regular old cruise control isn’t as easy or safe as you might believe either.

Apparently people aren’t using it right either:

“The Foundation’s Bernadette Moreau notes that the study isn’t meant to do away with cruise control altogether, but to make motorists aware of when it should — and shouldn’t — be employed: “The idea is not to simply advise drivers to refrain from using these driving aids, which provide real benefits in terms of speed limit compliance and comfort…. However, these aids should not be used systematically, but rather advisedly, and a number of precautions should be taken.””

http://www.csmonitor.com/Business/In-Gear/2013/0812/Is-cruise-control-dangerous

There are plenty of other links out there too warning about the dangers of using regular old cruise control wrong, like on icy roads or on roads with standing water.

But you will probably never hear about any accident where cruise control was involved, because even though experts know cruise control can cause accidents if used incorrectly, those accidents just aren’t newsworthy.

On the contrary, it seems quite easy to understand. Treat autopilot as a form of advanced cruise control. You don’t need to concern yourself about whether the car can detect the white truck in front of you, YOU should see it. And when you see it turning in front of you YOU should apply brakes and take other appropriate action to avoid an accident. Just like with regular old cruise control.

This is why autonomous Level 2 is bulls***. Introduce autonomous Level 4.

Level 2 – the driver is responsible.
Level 4 – the car is responsible, and if it cannot handle the situation anymore, it will park at a safe place and wait for the owner to take over again.

I don’t think this has anything to do with Autopilot at all; I think they just want to blame someone for the wreck they caused and commit insurance fraud.

This happens so often. People end up on their roof and, when asked how, they just say “I don’t know.” And when asked how fast they were going, they’ll reply “the speed limit.” Because otherwise insurance doesn’t pay out.

Also, you can roll cars at as low as 20 miles per hour. Or park cars on top of others for that matter. It’s trivial.

It’s like people discovering how much road they *really* need when they make an emergency stop.

Being sorry for the 500,000 deaths per year worldwide and trying to improve safety further (beyond conventional ABS, airbags and automatic braking) is laudable, but it is not something that can be changed overnight, for several reasons. First, all the cars on the road can’t be replaced at once; there is simply no capital and manufacturing capacity to do that. Second, the human brain may not be failure-proof, but it is still an exceptional combination of hardware and software, and driving a car in the middle of traffic just happens to be one task where it performs exceptionally well, contrary to appearances. Recognizing objects, crash scenarios and dangers, anticipating a response, taking into account even the mood on another driver’s face to determine his next most likely action: that’s all super sophisticated, and I think it happens to be one of the most difficult tasks that a sentient AI will have to do. Sure, the AI will pilot to the millisecond, but will it be able to forecast that a ball is often followed by a running child, the way a human driver can? Third, the autopilot up to the selfdriving, if we ever get there without at… Read more »

Agreed.

@Priusmaniac I agree with all your points: 1. The human brain is still amazing, and so far there is no known technology to replace it. The human brain is a better driver than a computer chip and code. The problem is that people don’t pay attention while driving, like texting on cell phones, etc. 2. It would be a Herculean task to retrofit over one billion vehicles with the technology. It will (if perfected) take many years to implement. 3. To put 35,000 automobile deaths in the U.S. into perspective, there were about 100,000 deaths due to doctor-prescribed and over-the-counter drugs last year. 4. Forced auto-pilot compliance smacks of fascism, where the large corporations and the government technocrats are allied against the wishes of the people. 5. So-called auto-pilot needs significant work before it will ever be able to completely take over the driving role from humans. _________________________________________ Cameras for the driver to see behind the vehicle are a huge improvement. Sensors that can eliminate driving blind spots are real improvements. If cameras had 360-degree vision, virtually all blind spots would be eliminated for drivers. That would be huge. No more little kids accidentally run over when the station wagon… Read more »

Thank you, Priusmaniac, for your most thoughtful and insightful post on this subject.

The transition from fully human-driven to fully computer-driven cars is going to be fraught with difficulties and uncertainties. We certainly do need a public discussion and debate on the implications of that.

What we do not need is a lot of knee-jerk, premature comments made before any confirmed or well-supported evidence is presented; comments based on little more than rumor; comments written without any real thought behind them. Unfortunately we’re seeing far, far too many of the latter and far too few comments as thoughtful as yours.

“Fifth, yes there is one, on a moral ground I would not take for granted, even if it saves lives, that taking drivers freedom to drive away by force feeding autopilot would be a good thing since it takes peoples freedom away”.

This may come as a shock to many Americans but driving is NOT a constitutional right nor a freedom; it’s a privilege granted by the government.

We do have the right to safety from reasonable harm (the whole life, liberty, and the pursuit of happiness thing) which autonomous driving can eventually deliver when or if the gov’t gets fully behind it. In the USA, that means whenever the $$$ start flowing.

“Until the customer responds, we are unable to further investigate.”

Does that mean that Tesla cannot tell whether Autopilot was active, because it cannot remotely fetch the logs, and therefore requires the customer to perform steps to retrieve log data from the car itself? That’s what “we have no data” might indicate, but the wording is vague. Usually Tesla twists words to maximum advantage, so it’s hard to give them the benefit of the doubt.

Indeed, they can’t tell whether Autopilot was on:

“We received an automated alert from this vehicle on July 1 indicating airbag deployment, but logs containing detailed information on the state of the vehicle controls at the time of the collision were never received. This is consistent with damage of the severity reported in the press, which can cause the antenna to fail.”

And then the spin: “Based on the information we have now, we have no reason to believe that Autopilot had anything to do with this accident.”

The plot thickens. . .

There is no plot, only serial anti-Tesla haters like yourself running around like Chicken Little, posting your repetitive FUD and nonsense, like the well-known Tesla-shorter Niedermeyer that you previously linked to.

You so obviously have an agenda here, sven (and no life, as your constant whining, carpet-bombing comments show).

Get Real said:
“known Tesla-shorter Niedermeyer”

On June 13, 2016, Ed Niedermeyer signed a declaration under penalty of perjury that stated “I never did short, nor do I currently short, nor do I plan to ever short Tesla Motors stock.”

Do you have any proof that Ed Niedermeyer is a Tesla short seller?

http://dailykanban.com/2016/06/declarations-penalty-perjury-re-tesla-motors/

There is no rational reason to believe anything Niedermeyer posts on the subject of Tesla Motors or its cars, any more than there is any rational reason to believe anything you post on the subject, sven.

Niedermeyer publishing a “sworn statement under penalty of perjury” in an online and highly biased “news” source, like Daily Kanban, is not legally binding… and he knows it.

And I’m pretty sure you’re intelligent enough to know that, sven. So this is yet another case of you posting B.S. in defense of a fellow Tesla basher, innit?

Seems to be a habit of yours. Why is that, hmmmm?

I’m very sure sven has an anti-Tesla agenda for some reason, or he wouldn’t consistently post often double-digit numbers of highly negative posts on these Tesla threads.

Question is, what is driving sven’s carpet-bombing ways?

Concurrent with the thickening of some people’s heads.

The auto companies are going to try to shift all liability for auto-pilot failures onto the driver. In other words, the driver is responsible for all accidents, even those caused by faulty auto-pilot decisions. In other words, the driver will be made liable because: “You were supposed to be watching the road, monitoring the auto-pilot in case it screws up.” What? This just puts a bigger burden on the driver than before, because now he still has to “drive,” but he also has the additional worry of constantly monitoring the auto-pilot software in case of malfunctions and being ready to step in at a moment’s notice to override the auto-pilot. Even in the best circumstances, it’s going to take a few moments to realize the auto-pilot software is not working before you can brake to avoid hitting the car in front of you. By the time you override the auto-pilot and jump on the brake, the accident has already happened. The clear expectation was that the auto-pilot would stop you in this situation, because it has done so hundreds of times in the past. This causes you to hesitate, and by the time you figure out that the… Read more »

Your hyperbolic assertions aside, the thrust here is the same as with all accidents that Teslas are involved in:
namely, whether the data jibes with what the person involved reports. Since the role of Autopilot in this matter hasn’t been definitively determined yet, perhaps JUMPING TO CONCLUSIONS shouldn’t be the only form of exercise here?

The nice thing about this feature is that, like many things in the car, it can be disabled. Don’t like the way it’s implemented, or its operational parameters or constraints? Turn it off. That’s the nice thing about choice. You can wait until these systems operate in a manner to your liking, then purchase THAT car.

I’m going to reserve judgment on Autopilot in this instance, until I get more info. I recommend the same rational course to others (except for the usual bashers, that is).

All responsibility for auto-assist failures that cause accidents, injury or death will be strenuously denied by the auto companies: Mercedes, Nissan, Volvo, Tesla, Ford, and whoever else has it installed on their vehicles.

Since the technology is not exclusive to EVs but includes ICE vehicles and manufacturers as well, there would seem to be room for discussion of the technology apart from Tesla’s particular version of it.

At some point in time somebody’s auto-pilot will fail. Maybe that will happen to someone driving in a Mercedes using the Daimler version of the technology.

Finally a decent and non-sensational article on the first Tesla crash while on auto-pilot:

https://www.washingtonpost.com/opinions/the-tesla-didnt-really-crash-itself/2016/07/04/88756584-3fc3-11e6-84e8-1580c7db5275_story.html

7,150 miles on my Tesla MS 90D… 95% of those miles driven with neither AP nor cruise control… frankly, I can see no benefit to either in almost all my driving, whether daily or on a trip… there is just too much traffic on the roads to be playing games with shaky technology… I love reading comments from a bunch of A-holes that don’t even own or lease a TESLA… you guys and gals are pathetic.

AutoPilot is not fully autonomous because the system is not reliable. Why people are treating it as reliable (fully autonomous) is beyond me.

Perhaps AutoPilot should be renamed Co-Pilot while it’s still in beta. When failures like this happen, it’s because both the driver and AutoPilot (as a team) failed.

Then it would require different training and should not be made available to untrained users.