Tesla Model X Pennsylvania Crash – Autopilot Not On, Musk Tweets Autopilot Would’ve Prevented Accident (Update – log info)

JUL 15 2016 BY ERIC LOVEDAY

Albert Scaglione was in his 2016 Tesla Model X when it crashed and rolled over on July 1st on the Pennsylvania Turnpike.

The Detroit Free Press was able to get in touch with officer-on-the-scene Dale Vukovich of the Pennsylvania State Police. The officer described the incident as follows:

“Vukovich stated that Scaglione’s car was traveling east near mile marker 160, about 5 p.m. when it hit a guard rail “off the right side of the roadway. It then crossed over the eastbound lanes and hit the concrete median.”

“After that, the Tesla Model X rolled onto its roof and came to rest in the middle eastbound lane. A 2013 Infiniti G37 driven in the westbound lane by Thomas Hess of West Chester, Pa., was struck by debris from the Scaglione car, but neither he nor his passenger was hurt.”

Scaglione believed that his Model X was in Autopilot mode at the time of the crash.

Tesla almost immediately issued a statement on the matter and then issued this updated statement, which disputes the Autopilot claim:

“We received an automated alert from this vehicle on July 1 indicating airbag deployment, but logs containing detailed information on the state of the vehicle controls at the time of the collision were never received. This is consistent with damage of the severity reported in the press, which can cause the antenna to fail. As we do with all crash events, we immediately reached out to the customer to confirm they were ok and offer support but were unable to reach him. We have since attempted to contact the customer three times by phone without success. Based on the information we have now, we have no reason to believe that Autopilot had anything to do with this accident.”

Well, now that Tesla has gained physical access to the car involved in the wreck and has been able to go through the vehicle’s logs in detail, Elon Musk is confident that Autopilot wasn’t in use at the time of the crash:

Elon Musk Tweets About PA Model X Crash

And for reasons that aren’t clear to us, given the lack of detail in Musk’s tweet, the Tesla CEO seems extremely confident that if Autopilot had been activated, this crash wouldn’t have occurred.

How he came to that conclusion is a mystery to us, but we don’t have the data to pore over, nor would we have the knowledge to extract any usable info from it, so we suppose we have to take Musk at his word on this one.

Update (July 15th): …and now we do have that data, as a Tesla spokesperson has given Jalopnik a full statement:

We got access to the logs. Data from the vehicle shows that Autosteer was not engaged at the time of this collision. Prior to the collision, Autosteer was in use periodically throughout the approximately 50-minute trip.

The most recent such use ended when, approximately 40 seconds prior to the collision, the vehicle did not detect the driver’s hands on the wheel and began a rapidly escalating set of visual and audible alerts to ensure the driver took proper control.

When the driver failed to respond to 15 seconds of visual warnings and audible tones, Autosteer began a graceful abort procedure in which the music is muted, the vehicle begins to slow and the driver is instructed both visually and audibly to place their hands on the wheel.

Approximately 11 seconds prior to the collision, the driver responded and regained control by holding the steering wheel, applying leftward torque to turn it, and pressing the accelerator pedal to 42%. Over 10 seconds and approximately 300m later and while under manual steering control, the driver drifted out of the lane, collided with a barrier, overcorrected, crossed both lanes of the highway, struck a median barrier, and rolled the vehicle.
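The sequence Tesla describes amounts to a timed escalation. As an illustration only, here is a sketch of that timeline in Python; the timings come straight from the statement above, but the function and phase names are invented for this example and are not Tesla code:

```python
# Illustrative reconstruction of the timeline in Tesla's statement.
# Only the timings (40s, 15s of alerts, 11s) come from the statement;
# every name here is invented for illustration.

def autopilot_phase(seconds_before_collision):
    """Return which phase the system was in at a given time before impact."""
    if seconds_before_collision > 40:
        return "autosteer engaged"
    if seconds_before_collision > 25:   # 40s to 25s: 15s of escalating alerts
        return "visual and audible alerts (hands not detected)"
    if seconds_before_collision > 11:   # 25s to 11s: graceful abort procedure
        return "graceful abort: music muted, vehicle slowing"
    # final ~11s: driver holding wheel, accelerator at 42%
    return "manual control"

# Walking back through the statement's timeline:
for t in (45, 30, 15, 5):
    print(t, autopilot_phase(t))
```

Read this way, Autosteer had handed the car back to the driver roughly 11 seconds, and about 300 m of manual driving, before the first impact.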

Categories: Crashed EVs, Tesla

80 Comments on "Tesla Model X Pennsylvania Crash – Autopilot Not On, Musk Tweets Autopilot Would’ve Prevented Accident (Update – log info)"

this sounds like a self-serving interpretation from elon musk, who is increasingly losing credibility in his pronouncements because he is a full-time salesperson. there is nothing in the article that even states that there was data from which tesla could draw any definitive conclusions with regard to whether or not the autopilot feature was engaged. all that musk can say is that “he is confident” that it wasn’t.

They had the data — from above: “now that Tesla has gained actual physical access to the car involved in the wreck and has been able to go through the vehicle’s logs in detail”

read the rest of the article.

no comment said “…that even states that there was data from which tesla could draw any definitive conclusions…”
scottf200 points out the article text: “Tesla… has been able to go through the vehicle’s logs in detail”

They had the vehicle’s logs to draw conclusions from.

“no comment” commented:

“…read the rest of the article.”

He did. You didn’t.

Is your anti-Tesla bias so strong that you’ve lost the ability to read an entire article before attacking?

Maybe you should learn to read before you type.

ok, so insideevs amends the original article with an update that includes additional information that wasn’t present at the time i posted my original comment and the elon musk fanboys are out to declare that i have some kind of “anti-tesla bias”. the fact of the matter is that i don’t operate on the basis of “in elon we trust” as you fanboys do. the original article merely stated a willingness to “take musk on his word on this one”, which is something that i’m not willing to do.

No, dude, you don’t get to wiggle your way out of your extremely anti-social behavior that easily. The addition to the article above is clearly marked “Update”, and the section you failed to read earlier is part of the original article, before the “Update” section.

Not only did you fail to read the relevant part of the article, you actually had the gall to wrongly castigate someone for not doing that, when the fault was yours!

You made your bed, now lie in it.

no comment — If that is the case, it should be very easy for you to simply say:

“my bad, based on the rest of the article, I was wrong”.

It is easy. Try it.

LOL.

Tesla cars are gathering logs. Always.

Tesla’s Autopilot feature is gathering logs. Always. Even when turned off. The logs also include what Autopilot would have done in given situations.

Autopilot logs (from on or off periods) are sent to Tesla regularly, and used to better teach the Autopilot software driving skills.

So your claim is actually weak. A tweet is only 140 characters; a lack of detail is to be expected. The level of detail in Tesla’s car logs is also well known.

The update proved you wrong.

It seems pretty obvious at this point that Autopilot is always on in the background. Whether its control is active or not is what the user toggles.

This allows Tesla to get a vehicle ‘video’ upload (a log covering a set amount of time before/after) any time the user does something contrary to what Autopilot had calculated it would do. This can be reviewed to see why, and therefore they can continue to improve the Autopilot system.

Musk’s confidence that the accident would not have occurred is likely due to the fact that, on reviewing the logs, the Autopilot system flagged a difference and stored the event.

What would be interesting to know is how long the autopilot had been inactive before the event occurred. 1 second? 1 minute?

Did the user mistakenly de-activate it, never actually activate it, or is he trying to blame Autopilot?

You have NO idea WTF you are talking about and it is clear that you have never been behind the wheel to even GUESS WTF is happening here. For your information, and for the information of everyone else who has become an ‘expert’ on autopilot, the following is what happens in front of you:

In front of you sits your car icon. If the car detects no traffic lines (on which autopilot bases its handling), your car icon is in a blank field. When the car detects a line to your left and/or your right, it displays them next to your car icon. ONLY AFTER the car establishes road paint does an icon appear that permits engagement of autopilot (the icon looks like a steering wheel). When you see that gray icon, you can then pull back TWICE on the stick that turns on AP. Autopilot then dings to tell you audibly that it is engaged and the gray icon turns blue.

To state that AP is ‘always on’ shows that you are totally clueless about how the system operates and should not be stating to others that what you think is ‘obvious’.

Stop being a twit. I took the original message to mean that the Autopilot software is always running and logging data whether or not the user has activated AP. Obviously, unless the user has activated AP, the car won’t try to drive itself, just log data. This has long been suspected, but never confirmed by Tesla.

the autopilot system doesn’t need to be active for the car’s logging to work. That data is collected regardless.

Re: ‘ … Autopilot is always on in the background …’
It appears so because it is always analyzing whether it can display the lane markings and the gray steering wheel icon… which indicates whether auto-pilot/steer is currently *able* to be turned on (double pull of the lever). If the steering wheel icon is not present, then you need to wait for it to show up (i.e. until the system finds it able).

Poppy — It sounds like you are confusing what AutoPilot is doing with what normal and typical Event Data Recorders (AKA automotive “black boxes”) do in every vehicle they are in.

Since Sept. 1, 2014, EVERY SINGLE CAR sold in the United States has been mandated to have an event recorder like what Tesla is reporting.

It is mandated by law for these event recorders to record throttle, steering, speed, etc.

I’m sorry you are not familiar with automotive black boxes. The only question is how you will respond now that you have been informed.

“…the autopilot system flagged a difference and stored the event.”

Hmmm, well I could be wrong, but I don’t think that’s how it works. As I understand it, Tesla’s cars use their cameras to “look” at the road, and try to spot the lane markings painted on the pavement. If Autopilot/AutoSteer is activated, then it will attempt to keep the car centered in the lane, unless the driver overrides this by using the steering wheel.

I’m not sure just how Tesla stores the data, but I think according to what at least one person said, the cars store a certain number of minutes of the latest camera video, recording it internally. Tesla would then, upon gaining access to the car, be able to download this video; video which would show what the car’s cameras saw for the few minutes leading up to the accident.

Running the data thru another copy of the Autopilot software, Tesla engineers/programmers should be able to tell if — in theory — AutoSteer would have been able to see the edge of the lane, and — again in theory — would have been able to keep the car from impacting the rail, thus avoiding the accident.… Read more »

Elsewhere it appears that autopilot went off ten seconds before the crash. That’s not exactly unrelated, though it looks like the driver fell asleep at the wheel which is obviously a good way to crash.

Question is: would he have fallen asleep (very likely what happened) at the wheel if Autopilot wasn’t an option to use (or forced you to keep your hands on the wheel at all times)?

Autopilot wasn’t on when the crash happened….because it turned itself off 15 seconds prior! So Elon saying Autopilot would have prevented the crash is just a complete warping of the truth.

http://jalopnik.com/musk-autopilot-was-off-in-pa-tesla-model-x-crash-acco-1783695454

A Tesla spokesperson has told Jalopnik:

We got access to the logs. Data from the vehicle shows that Autosteer was not engaged at the time of this collision. Prior to the collision, Autosteer was in use periodically throughout the approximately 50-minute trip.

The most recent such use ended when, approximately 40 seconds prior to the collision, the vehicle did not detect the driver’s hands on the wheel and began a rapidly escalating set of visual and audible alerts to ensure the driver took proper control.

When the driver failed to respond to 15 seconds of visual warnings and audible tones, Autosteer began a graceful abort procedure in which the music is muted, the vehicle begins to slow and the driver is instructed both visually and audibly to place their hands on the wheel.

Approximately 11 seconds prior to the collision, the driver responded and regained control by holding the steering wheel, applying leftward torque to turn it, and pressing the accelerator pedal to 42%. Over 10 seconds and approximately 300m later and while under manual steering control, the driver drifted out of the lane, collided with a barrier, overcorrected, crossed both lanes of the highway, struck a median barrier,… Read more »

Thanks for the info! That should put this to rest. Clearly driver fell asleep and the car awakened the driver with audible alerts. Driver took back control of the vehicle with autopilot now off. Driver then under manual power went back to sleep and subsequently crashed.

What’s a more preferable scenario?

Admit you fell asleep, caused a crash, damaged public property and also caused damage to another vehicle and possible medical injuries? That’s a lot of pain to swallow and lots of bills and most likely a lawsuit.

Of course, it would be better to say the car was on Autopilot. When the facts/logs disprove that, then go on to say “I thought it was on Autopilot”. Then the focus is on Tesla and not on the IDIOT driver.

Even if only the Tesla was damaged, these idiotic drivers will still blame the car. After all, who wants to admit to trashing their $150K car? Their insurance may not pay and their rates will skyrocket.

This is a very helpful discussion and provides insight into something I have thought about since I was a child: news reports of accidents always state ‘lost control’ and we never learn what that vague phrase means. As a child, I used to think it meant that the brakes had failed or something of that nature. I used to picture a driver, perfectly sober and in full control of his faculties. Of course, many of these reports are never able to tell us anything more because the driver is dead.

These event recordings are the most vivid description I have ever read and help to paint a terrifying picture of a driver falling asleep at the wheel, the vehicle responding, the guy waking up in panic and then totally incapable of controlling his vehicle. Of course, in the past, the guy would not have woken up at all until he was in the hospital, with zero memory of what happened.

I think Musk’s comments are bold, to say the least, but I can definitely see how AP would certainly be a mitigating factor for a driver who falls asleep at the wheel. On my 1400 mile trip in my X,… Read more »

I love the analogy that horse-drawn carts were essentially “autopilot-enabled” and would avoid many accidents that cars suffer from…

I dunno man.
I’ve seen a lot of movies with runaway horses (unintended acceleration)……lol

Blinders are a head worn device meant to stop horses from being too easily “spooked” by visual stimulus and acting erratically. Loud noises, also could drive horses into uncontrollable behavior.

Every AI has a weakness…

Caffeine will never do it as well as a 15-minute nap.
Ask me how I know!

“…we need to go back to horses!”

🙂

flmark, thank you for your well-written and thoughtful comments.

I seem to recall reading that back in the days when the motorcar was a New Thing, that one argument from the “Get a horse!” crowd was that a horse pulling a buggy/wagon would likely know the way home, so if you fell asleep (or got too drunk to drive), you’d be taken home anyway.

Every mode of transportation has its advantages… and disadvantages. That certainly includes the Tesla Model X, with or without Autopilot/AutoSteer activated.

It makes no difference either way. If a driver falls asleep because they’re relying on autopilot and they’re unable to wake up when autopilot prompts them to take control, the driver is at fault. How could it be otherwise? It’s not as if the car doesn’t bug you enough to always be holding the wheel and alert, is it?

“would he have fallen asleep (very likely what happened) at the wheel if Autopilot wasn’t an option to use”

Are you seriously arguing that people without AutoPilot don’t fall asleep and wreck their cars?

Of course he certainly could have fallen asleep regardless of whether he had AutoPilot or not.

Apparently he thinks Tesla and Autopilot are to blame because, I dunno, maybe because the car didn’t blast the driver’s ears with loud heavy metal music, or jolt the driver with electricity, to make sure he stayed awake after falling asleep at the wheel?

People like that, people who don’t believe in people taking responsibility for their own actions; some of them get onto juries, and that’s one of the reasons why we have so many frivolous lawsuits in the USA.

There appear to have been several crashes where the driver was confused about whether or not Autopilot was active (and not just attempting an excuse afterwards). Seems like some more idiot-proofing is in order there.

Yeah, the entire dashboard should be lit up differently when Autopilot is enabled, so when it becomes disabled, it is immediately obvious.

+100
The entire center console should be in a different color when AP is running and turn back to a black background when it’s disabled. This is the 3rd? crash that’s happened when the driver thought AP was engaged and it wasn’t.

center console = “instrument cluster”

Foo said:

…the entire dashboard should be lit up differently when autopilot is enabled, so when it becomes disabled, it is immediately obvious. [emphasis added]

Foo, that is the first positive suggestion for improvement that I’ve seen in all the discussions of Tesla Autopilot/AutoSteer.

Did you think that up yourself? If so, you deserve a huge heap of congratulations. And if you read it somewhere else, thanks muchly for repeating it here.

In any case, I fervently hope that Tesla gets this idea passed along to them!

I agree With Musk.. One has to be driving like a TOTAL Disregarding Careless Idiot with a Death wish to Hit a guard rail and flip that car 2 times !…That vehicle is almost impossible to flip . The car saved his Careless life….I hope he gets charged, Suspended from driving & fined for endangering other people’s lives as well!

We had an “Art Gallery” here in town, long ago. Turns out, it was actually a front for selling drugs, and was shut down by the police.

Art Sale Margins must be pretty good for Scaglione to afford a Model X…

So Autopilot was off for a whole 10 seconds before the crash. Tesla in their statement to Jalopnik fails to mention whether the driver had his hands on the steering wheel and was applying leftward/rightward torque to steer during the 10 seconds after Autopilot disengaged and the car drifted out of the lane and collided with a barrier, or whether the driver took his hands off the wheel thinking Autopilot was still piloting the car.

If I’m understanding this correctly, two things can happen after Autopilot nags you to hold the steering wheel and you respond by grabbing a hold of the steering wheel:

1) Grab the steering wheel and apply a slight turning pressure. This tells Autopilot you are awake, and it reverts back to Autopilot driving the car.

2) Grab the steering wheel and apply more force than slight turning pressure. This tells Autopilot that the driver will be piloting the car and disengages Autosteer and Autopilot.

What happens during the “gentle abort procedure” when the driver grabs a hold of the steering wheel? Will Autopilot always disengage and return to manual steering/driving, or will it resume normal Autopilot operation if the driver applies only a slight turning… Read more »

The driver is a Darwin Award winner.
The car did not detect his hands on the wheel.
Note the instructions tell you they should always be on the wheel and be alert.
It started visual and audible alerts and escalated them.

Dude was obviously in REM sleep. I bet if there was audio recording in the cabin his ass was also snoring.

Not quite sure what you’re trying so hard to dig for.

I doubt he was in REM sleep. He had a passenger in the front seat and was probably talking to the passenger during the trip.

The statement by Tesla’s spokesperson to Jalopnik was vague as to what the driver did after he grabbed a hold of the wheel for those 10 seconds until impact. Did the driver think Autopilot continued to control the car and took his hands off the wheel again. Or did the driver think Autopilot shut off and kept his hands on the wheel to steer the car.

If the driver thought that when he grabbed the wheel and it shut off the alerts that Autopilot stayed on and continued to control the car, that might explained his statement to police that Autopilot was on.

If the driver wasn’t holding the wheel and making steering inputs during the last 10 seconds before impact, that would imply that the driver thought Autopilot was still on. Tesla statement to Jalopnik make no mention of whether the driver was holding the wheel or making steering input (applying right/left torque).

WHAT. DOES. IT. MATTER.

When will you people understand that the DRIVER IS RESPONSIBLE at ALL TIMES? It does. not. matter. what. mode. the. car. is. in. The DRIVER is accountable for losing control. That’s HOW IT IS. In most countries, at least. America? Maybe not so much. That’s the land of “blame anyone but myself”

Poorly designed Autopilot software can be a defective product, making Tesla PARTLY RESPONSIBLE. Read the story in the link below about how Tesla could be sued and PLEASE. STOP. YELLING.

http://www.bloomberg.com/news/articles/2016-07-15/tesla-won-t-be-able-to-put-crash-defense-on-autopilot

Star Trek actor Anton Yelchin recently died when he put his Chrysler SUV’s poorly designed gear shift into Neutral instead of Park and was struck and killed by his runaway SUV. The SUV had been recalled because its confusing gear shift had already caused many injuries from runaway SUVs inadvertently left in neutral. Chrysler will be sued by Yelchin’s family in a wrongful death suit, even though he was responsible for putting his SUV in neutral.

https://www.yahoo.com/movies/star-trek-actor-anton-yelchin-killed-car-hits-070231328.html

Well, sven, I see your TES* reality distortion goggles are working at full power today.

If they weren’t, then perhaps you’d realize that there is a fundamental difference here: Using Autopilot/AutoSteer is an OPTION for which Tesla car drivers HAVE TO OPT IN by pushing buttons on the car’s interface, and when they do so, they have to acknowledge that YES they understand they are responsible for the car’s actions when under Autopilot and AutoSteer.

Contrariwise, in the car Anton Yelchin was driving, using the gear shift control wasn’t optional. It’s required for operating the car, period.

BTW — Sometimes YELLING IS APPROPRIATE when refuting the kind of Tesla-bashing FUD you keep shoveling out, almost every single day on InsideEVs… all of which is total B.S., and you know that it’s total B.S.

*Tesla Envy Syndrome

Putting TES* (Tesla Envy Syndrome) reality distortion in every other comment you make is LAME. Perhaps you suffer from ICE Driving Guilt Syndrome. You feel so guilty for driving an ICE instead of an EV, that you overcompensate by going on EV forums to troll actual EV owners and act like an obnoxious prick.

FYI, it doesn’t make a difference in a product liability lawsuit whether the feature was optional or mandatory when determining if a defect exists and determining liability.

You’re not just an angry little troll, you’re also the biggest and most obnoxious troll on IEVs. Instead of name calling and accusing everyone of spreading FUD, you should seek psychological help to deal with your need to insult people and pick fights with everyone.

Ok sven, is it your contention that the software in these level 2 systems is faulty? Or is it your contention that only TESLA’S software is faulty? If it’s the latter, then it’s my contention, looking at your posting history here, that Pushmi-Pullyu’s assertions, though bombastic, are correct.

Trollnonymous , You Crack me Up ! l m a o…His ass was also snoring ? I believe it !..lol….FUN NY!! …He probably blew huge fart & knocked off the autopilot…l o l….

AutoPilot was off 40 seconds prior:

The most recent such use ended when, approximately 40 seconds prior to the collision, the vehicle did not detect the driver’s hands on the wheel and began a rapidly escalating set of visual and audible alerts to ensure the driver took proper control.

When the driver failed to respond to 15 seconds of visual warnings and audible tones, Autosteer began a graceful abort procedure in which the music is muted, the vehicle begins to slow and the driver is instructed both visually and audibly to place their hands on the wheel.

Approximately 11 seconds prior to the collision, the driver responded and regained control by holding the steering wheel, applying leftward torque to turn it, and pressing the accelerator pedal to 42%. Over 10 seconds and approximately 300m later and while under manual steering control, the driver drifted out of the lane, collided with a barrier, overcorrected, crossed both lanes of the highway, struck a median barrier, and rolled the vehicle.

From 40 to 11 seconds Autopilot was still in control of the car while giving visual warnings and audible tones, then “Autosteer (still in control) began a graceful abort procedure” and slowed the car down while giving more audible and visual warnings.

Approximately 11 seconds prior to the collision Autopilot gave control to the driver when the driver responded and regained control by holding the steering wheel.

“Approximately 11 seconds prior to the collision Autopilot gave control to the driver when the driver responded and regained control by holding the steering wheel.”

No.

1) The driver is ALWAYS ultimately in control. It is a driver assist system, not HAL.
2) Autopilot doesn’t “give control”. The driver TOOK the controls and began operating them. The Autopilot has no choice over whether it “gives control” or not, so cannot “give control”. Again, not HAL.

“The driver TOOK the controls and began operating them”

When the Autopilot starts nagging and the driver touches the steering wheel, the Autopilot can either stay on or disengage, depending on the amount of turning force applied to the steering wheel. Tesla did not provide any info on whether the driver was making any steering inputs in those last ten seconds after the driver “took the controls” until the first impact with the right guardrail. That info would be useful in determining whether the driver mistakenly thought Autopilot was still active and took his hands off the wheel during the last 10 seconds before impact. In other words, did the driver not realize that Autopilot had handed off driving duties to him?

With regards to taking the controls, does stepping on the accelerator disengage Autopilot? Stepping on the accelerator while in cruise control does not disengage cruise control.

This Reddit user describes (perhaps more clearly) what I’m trying to say:

ebob5030 Model S 85 • 1d, 11h

“There is an interesting nuance in the way AP works that could help explain what happened here. When you respond to the nags, you need to hold the steering wheel (by applying a slight turning pressure) to tell AP that you are awake. It then stops nagging and resumes normal AP operation. But if you turn the wheel with just a bit more force, it disengages the auto-steer. When you hear the beeping/nagging, it is easy to overreact and turn the wheel enough to disengage without realizing what you just did. It has happened to me and it took a moment to realize that auto-steer is now off. Fortunately I realized it before the car drifted out of the lane.”

https://m.reddit.com/r/teslamotors/comments/4swlvw/detailed_tesla_statement_from_logs_in/
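The nuance ebob5030 describes, where a light grip satisfies the nag but a slightly firmer turn disengages AutoSteer, amounts to two torque thresholds on the same input. A minimal sketch of that idea, with threshold values and names invented purely for illustration (this is not Tesla’s actual logic):

```python
# Hypothetical sketch of the two-threshold behavior described above.
# The threshold values and names are invented for illustration only.

HOLD_TORQUE = 0.5      # enough turning pressure to register hands on the wheel
OVERRIDE_TORQUE = 2.0  # enough turning pressure to take over steering

def respond_to_nag(applied_torque):
    """Classify the driver's wheel input during a hands-on-wheel nag."""
    if applied_torque >= OVERRIDE_TORQUE:
        # A firmer turn is treated as the driver taking over.
        return "autosteer disengaged (driver steering)"
    if applied_torque >= HOLD_TORQUE:
        # A light grip just clears the nag; AutoSteer keeps driving.
        return "nag cleared (autosteer stays engaged)"
    return "nag continues"
```

The narrow band between the two thresholds is exactly where the confusion ebob5030 mentions can arise: overreact to the beeping and you land above the override threshold without realizing AutoSteer just shut off.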

What part of this are you not understanding?! The gentleman had TAKEN CONTROL OF THE VEHICLE! For 21 seconds after his initial reaction, HE had control! Then ran the car into the barrier (likely because he had dozed off) of his own accord. NOT because he overreacted to auto-pilot, but because he overreacted to striking the barrier WHILE DRIVING HIMSELF!

Uh, dude, sven is on to something, here. In his apparently groggy state, he could have easily thought that he just satisfied AP’s inputs, and dozed off again. There is nothing in the information reported above that there was any more user input after his initial response to the AP warnings.

Stop asking questions that you really don’t want answers to! If you’re having trouble understanding the information given regarding this incident, go to the horse’s mouth, Tesla, and request answers to specific questions. Asking rhetorical questions, then replying to yourself with affirmations of an invariably negative content, isn’t really answering your questions!

Seems quite clear from the Tesla account that the driver fell asleep, woke up for a few seconds by the warnings and then went back to sleep. I think it’s very possible he didn’t remember that episode at all – people usually don’t remember being woken up in the middle of the night. Seems quite clearly his fault, but does using autopilot increase the chance of falling asleep at the wheel?

The chance is bigger. Many years ago I quit using cruise control during late night driving. Kept me more awake.

(And yes, I do think that testing beta software in the US market is kind of suicidal for a car manufacturer. Too many lawyers, too many poorly trained drivers. Baaaaaaaad combination.)

I do exactly the same thing!

I do the same thing. When you are feeling a bit sleepy, turning off cruise control until I can find a good place to pull off the road and close my eyes is the right choice. I think that is pretty common.

Yet clearly the world didn’t blame cruise control and try to demand it be shut down when some people used it wrong, and fell asleep with the cruise control on.

It’s not the operation of the system itself that is ‘beta’. I believe this is one of the most misunderstood things about this software/hardware suite!
Tesla’s is one of, if not THE best of the semi-autonomous level 2 systems out right now. My understanding is the “beta” refers to the amount of situational (human vs machine) information Tesla is gathering before THEY feel comfortable enough to remove the term.

> Based on the information we have now, we have no reason to believe that Autopilot had anything to do with this accident.

That is hard to take seriously. If the driver BELIEVED autopilot was on, that in itself means autopilot had everything to do with the accident.

Clearly the driver isn’t absolved of responsibility either way, but IMO it just appears desperate to respond the way Tesla is now doing. Especially since on EVERY occasion where there’s been a crash and discussion/speculation as to the role of AP the suspicion is that it may have contributed to the driver failing to pay attention.

Saying people shouldn’t behave this way isn’t really relevant if the system IN FACT causes people to behave this way.

I’d like Tesla to show us the data they claim proves that driving with AP is safer than without, so that data analysts and statisticians can critique it. So far, it’s not possible for laymen to conclude either way.

“If the driver BELIEVED autopilot was on, that in itself means autopilot had everything to do with the accident.”

aw well s*** then man, I ~Believe~ I got all the numbers in the Mega Lotto. Where’s my phatass Check????
[̲̅$̲̅(̲̅ιοο̲̅)̲̅$̲̅]

That bill is priceless!!

Well, interactions and assumptions by the driver when dealing with additional functionality (in this case, Autopilot) can lead to errant decisions without training and/or experience. That makes the feature a distraction from the driver’s real job: making sure that the destination is reached in one piece.

Just because the driver ‘believed’ autopilot would be on, that doesn’t magically abdicate his responsibility to drive the vehicle. He’s still responsible. What does it matter what he believed? That it was some misunderstanding? “I thought the car was driving for me but I was mistaken!” is just a careless and lazy excuse.

Wait – let me get this straight – Autosteer – a feature that enables the vehicle to stay in the lane by itself – requires you to have your hands on the steering wheel? WHY?!?!?!?!?

The whole point is to relax your arms and avoid fatigue by letting them rest and let the car steer itself where conditions permit!!! Stare out the window and enjoy the clear weather…watch the car maintain the lane…marvel at modern technology…have a conversation with a passenger or over a hands-free device…anything other than steer.

IF the damn car is going to threaten disengagement every few minutes because it does not detect your hands on the steering wheel, then WHY BOTHER?!?!? That completely defeats the purpose of the ability for the car to steer itself!

Am I the only one having trouble with the logic of Autosteer operation?

I fully appreciate the need for some sort of dead-man’s switch in order to ensure the consciousness of the driver, but that can be handled in other effective, but less fatiguing ways.

Autopilot is intended to reduce mental fatigue, not physical fatigue, and to allow the driver to be more focused on the road and potential hazards. It does this by offloading some of the brainpower used to keep the car straight in its lane, allowing more brainpower to be diverted to scanning the road for potential hazards, etc.

The reason hands must be on the wheel is obviously that we’re not at fully autonomous cars yet. They’d be legally stupid to say “go ahead, take your hands off the wheel,” wouldn’t they? The name ‘Autopilot’ is just marketing; it isn’t literally intended to do the driver’s job for them.

Putting one’s hands on a wheel is hardly exhausting, so I fail to see how having your hands in your lap is going to save your arms any energy. All it’s going to do is give you less time to react and grab the wheel in an emergency to which Autopilot is incapable of responding. Also, putting your hands on the wheel doesn’t stop the car from steering. You have to really tug the wheel a good inch or two to the side…
TomArt said: “Am I the only one having trouble with the logic of Autosteer operation?”

No, quite obviously you are not the only person who doesn’t understand that AutoSteer isn’t intended to let you take your attention off the road and talk to your passenger, or (as we’ve seen in videos posted to YouTube) talk to your video camera, use your cellphone or GameBoy, put nail polish on your nails, read the newspaper, or whatever else comes into your mind to do when you should be paying attention to driving.

Here is a transcription from an actual screenshot of what Tesla officially says about AutoSteer:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Autosteer (Beta)
Autosteer keeps the car in the current lane and engages Traffic-Aware Cruise Control to maintain the car’s speed. Using a variety of measures including steering angle, steering rate and speed to determine the appropriate operation AutoSteer assists the driver on the road, making the driving experience easier.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Please note the phrase “AutoSteer assists the driver”. It does not say “AutoSteer will drive the car for you”. AutoSteer is intended to perform lane-keeping on a divided highway. That is all it’s supposed to do, and it doesn’t even always…
I didn’t mean watch a movie – I meant the need to keep in physical contact with the steering wheel. That does not mean that you are clueless as to what’s going on; it simply means that you are not in physical contact with the steering wheel. The whole point is to give your arms a rest when conditions are favorable on long highway trips. You can talk to somebody and still have your eyes facing forward (or, as I said, via a hands-free device, if there are no passengers). If it keeps bugging you to put your hands on the wheel, then I cannot fathom a practical use for Autosteer.

Adaptive cruise control makes it less fatiguing to drive in heavy traffic with erratic speeds. However, none of the systems of which I am aware requires you to keep pressure from your foot on the accelerator. Nor should they, for the same reason. Of course, your foot is nearby, but not necessarily nervously hovering over the pedal; just nearby, at the ready. It is much more tiring to have your foot hover over the pedal than to actually operate it. Requiring that would completely negate the reason for cruise control.

Homer’s autopilot works a little better it appears. Even has auto-park. 🙂

Elon Musk tweeted:

“…Autopilot was turned off… crash would not have occurred if it was on.”

Shame on Elon for making that assertion. As a trained engineer, he surely knows better than to make an unqualified statement like that regarding an incident with so many unknown variables.

It would have been appropriate to say “The crash might have been prevented if Autopilot had been engaged”, or possibly even “probably would have been prevented”.

But only the Supreme Being* can be sure if the accident would not have occurred if Autopilot had been engaged.

Hmph. First we get a report of Consumer Reports making a statement about Tesla Autopilot unsupported by facts in its possession, and now Elon Musk does the same. Is the tendency to talk out of your arse about Tesla Autopilot contagious? 😉

*I refer, of course, to the Great Flying Spaghetti Monster (bless His noodly appendages!)

With the detailed log/sensor data and likely camera video (x seconds), I would guess they run that in their AutoPilot/AutoSteer simulator to see how it would have worked.

Meh. The Autopilot was in the process of safely stopping the vehicle, until the driver put quite a bit of foot into the accelerator pedal, and steered the car off the road.

Half-throttle for 10 seconds, in a car that could easily accelerate from 0-60 in less than half that time, likely accelerated the vehicle quite a bit.

I’m still interested in an accounting of the speed of the vehicle at each point on the timeline. But if the vehicle was slowing and stopping before the driver took control, it certainly would be easy enough to predict what the vehicle would have done on its own over the next 10 seconds.

Nix said:

“…it certainly would be easy enough to predict what the vehicle would have done on its own over the next 10 seconds.”

No. It would be easy to make a statement about what the car would probably have done over the next 10 seconds, but any properly trained scientist or engineer would take care not to overstate the facts in this manner. Heck, even if it were an experiment in a controlled environment, a scientist exhibiting proper healthy skepticism would never absolutely rule out an anomaly or “corner case” happening.

Out in the real world, away from the laboratory, all sorts of things could interfere with Autopilot bringing the car safely to a halt: a pothole, an animal on the road, another car veering into its lane, an oil slick on the road… there are an almost infinite number of unusual conditions or events that could interfere with Autopilot doing what it would most likely do.

Absolute certainty is the province of the Divine, not us mere mortals… and that goes double for us software programmers! That is especially true when dealing with complex software like the type under discussion.

The software that senses what the car is doing, and the black box that records what the car has done, are always running.

Just like in an aircraft.

This isn’t in dispute. According to these logs, Autopilot was disengaged when the vehicle went off the road, and it was attempting to stop the car and wake the driver before the driver gave it commands.

The logs would also indicate the car’s sensor data and autopilot output. Collision avoidance is always detecting, even if it can be countermanded by the driver.

Don’t mess with Mikola…

Weird… even if Autopilot was off, the Collision Avoidance System, or at least Lane Assist, should have kicked in to avoid any potential accident. I’m assuming this is standard on any car with Mobileye’s system installed. Correct me if I’m wrong.

If I may speculate, I believe we are going to see more evidence, backed up by research, that the reason many people find Autopilot relaxing is that the brain partially disengages, much more so than on cruise control. This disengagement would be involuntary and subconscious, occurring to a degree even when the driver wishes to maintain maximum alertness.

Further, that disengagement makes it more difficult for drivers to properly supervise Autopilot and to take over from AP quickly, leading to more accidents when Autopilot disengages.

Statistically, AP could still be a net win, but it would mean that “AP turned off n seconds before this crash” does not automatically disqualify AP from having a contributory role in the crash.

I’d rather drive drunk than trust a computer.

Conservative logic again?! I’d rather you didn’t drive drunk or use any more ‘conservative logic’.