Tesla Won’t Disable Autopilot – Will Focus On Further Educating Owners

JUL 16 2016 BY STEVEN LOVEDAY

CEO Elon Musk has confirmed that Tesla will not disable the cars’ Autopilot option in response to the recent fatal crash. Instead, the company plans to further educate owners about the system via a blog post.

Tesla Model S On Autopilot

Tesla’s autonomous system is more involved and active than other comparable systems on the market. Musk advocated early on for the timely release of the Autopilot technology, knowing that, above all, it would save lives. He believes that people just don’t understand it completely.

The National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB) are both investigating the recent fatal Tesla Model S Autopilot crash.

The NHTSA sent a letter to Tesla requesting all details on the incident. Tesla publicly confirmed receipt of the letter and compliance. Also included in the correspondence with Tesla was a questionnaire regarding the Autopilot system itself and all other incident reports related to it.

Tesla answered initial questions and provided information, with more to come at a later date. The company noted that the fatal crash was the first in 130 million logged Autopilot miles. Tesla said the automatic braking system did not engage because the sensors failed to distinguish the white truck against the bright sky.

At least two other reports have surfaced since the fatal accident. Both drivers insist that Autopilot was on. Fortunately, the incidents were minor.

Tesla Autopilot – Autosteer

One involved a Michigan man who hit a guardrail on the Pennsylvania Turnpike and was hospitalized for a few days. The man claimed the vehicle was in Autopilot; however, Tesla either doesn’t have or hasn’t released data confirming that. The driver has decided not to discuss the crash until Tesla or the NHTSA provides more information. He has been cited for careless driving.

The other crash report involved a Model X hitting a railing in Montana. Tesla noted that Autosteer was on and that the driver did not place his hands on the wheel, despite warnings to do so. Tesla does not support using Autosteer at high speeds or on undivided roads, and considers it a violation of the feature’s terms if hands are not placed on the wheel regularly, especially after repeated warnings.

Tesla has made repeated claims that the system is a “beta” system and that such information is made blatantly clear to drivers. The system is not on unless a user chooses to turn it on. Once the system is engaged, multiple disclaimers are provided. Musk said:

“It says beta specifically so people do not become complacent.” (Disclaimers are) “written in super plain language.”

Source: WSJ


40 Comments on "Tesla Won’t Disable Autopilot – Will Focus On Further Educating Owners"


I don’t believe any amount of education will prevent drivers from paying less attention while on AP, because the brain relaxes involuntarily in response to a reduced workload, and that cannot be completely controlled consciously. Drivers can read a blog post ten thousand times; it’s not going to overrule a million years of evolved energy optimisation in the human brain.

This also happens to be a big reason why Autopilot prevents accidents: many humans involuntarily pay less attention than they should while driving. But if humans couldn’t be trained to pay enough attention (at least not all of the time) during one hundred years without Autopilot, there’s no way they’re going to improve while on Autopilot, which tempts the human brain to disengage infinitely more than fully manual driving does.

Furthermore, I think researchers will eventually conduct a PET scan of “this is your brain on Autopilot”, which will confirm this effect. Hopefully sooner rather than later.

How many people do you think actually read the owner’s manual when they buy a car, or when they drive a car they don’t own? Not very many, I bet. The auto manufacturers may have to start operator certification programs for autonomous features and make sure the people with those certifications are actually driving the cars before the autonomous features can be engaged.

FYI, they went over its proper usage during my long in-vehicle orientation when I bought the Model X. In fact, I also had to read and state I understood the on-screen dialog text when I enabled the AutoSteer/Pilot feature (they could not touch the button themselves).

Yup. Tesla makes the owner or driver opt in for AutoPilot features such as AutoSteer (Beta). By default, they’re turned off.

And the warning screens that you have to read — or at least indicate you have read, whether or not you actually did — should leave the driver with no illusions that Autopilot/AutoSteer (Beta) makes the car fully self-sufficient:

Did you have to take a test showing you knew how to use Autopilot? Was there a chance that you would not be allowed to use Autopilot if you failed the test? Without the test and the chance of losing Autopilot privileges the Autopilot instruction could have been easily circumvented and appears to have been circumvented on a regular basis.

Was there a test for cruise control when it came out? After all, it could run into the back of a car, since it didn’t adapt to following distance like today’s systems. Is there a test before you can use your phone’s NAV (or other apps) while driving? What about using some of the clumsy NAV systems currently in cars (the Gen 2 Volt comes to mind) or other UI features? Absurd, right? Phone-using drivers cause major accidents every single hour and no one is up in arms. Absurd, right?

Those other systems you mention were never intended to reduce accidents or save lives. When a system such as Autopilot that is specifically intended to reduce accidents and save lives is actually causing accidents, then it’s not in the least bit absurd to require operator certifications for these systems.

Four Electrics said:

“I don’t believe any amount of education will prevent drivers from paying less attention while on AP, because the brain relaxes in involuntarily in response to a reduced workload, and that cannot be completely controlled consciously.”

Absolutely correct. We can’t change human nature.

This dilemma will be resolved not by legislatures outlawing semi-autonomous-driving software, nor by courts assigning liability to the manufacturers of semi-self-driving cars. The courts and lawmakers move very slowly, far more slowly than the tech is evolving.

This dilemma will be resolved by the rapidly advancing capability of Tesla Autopilot, and similar systems from other auto makers; advancing to the point that only Luddites and permanent Tesla bashers will doubt that everyone on the roads is safer with Autopilot/AutoSteer fully engaged in Tesla’s cars.

Everyone will be safer, including those in all the other cars on the road.

It won’t change human nature. It can change human behavior.

Don’t underestimate the power of education.

What will the court say about a driver who was speeding and watching Harry Potter?

Exactly.

And those who call the Autopilot victims “idiots” should keep in mind that many accidents include two or more vehicles and people other than the one at fault can die. What if some “idiot” takes a nap with Autopilot engaged and his Tesla plows through a dozen kids at a bus stop?

What about texting-and-driving IDIOTS? Any solution, other than banning texting features from cellphones?

I think they definitely need to be more insistent that hands remain on the wheel, not just near it. You can feel that something is wrong much faster than you can see it and react. I’ve felt this exact thing while looking at the internet radio options, as an example.

I drove with at least one hand, and more often than not both hands, on the wheel, and I still got the initial minor visual warning. It really wants some tension on the steering wheel and for you to be ‘actively’ engaged; otherwise it displays messages that progress to more and more annoying/noticeable methods.

I probably have 7,000 miles of AutoSteer/Pilot under my belt: 60% divided highway, the rest mixed, from rural roads to 45 mph suburban roads (on the latter I keep both hands fully on the wheel and treat it as “training the cloud” type driving).

turning off the autopilot feature would cost tesla money. instead, elon musk is relying on the fine print in the sales agreement. that’s not a particularly reliable strategy to protect tesla from potential litigation.

the other strategy that tesla seems to be employing is that autopilot doesn’t work unless the driver takes an active step to engage it. but the driver would not have been able to engage the feature had tesla not *sold* it to the driver. here it seems as though tesla is trying to imply that the autopilot feature was sold: caveat emptor.

can you imagine general motors deploying a feature in the manner that tesla has deployed the autopilot feature??? i can’t imagine such a thing.

judging from the article, it sounds like the michigan driver who had the accident in pennsylvania has an attorney.

He likely got the attorney right away, which explains why Tesla only got physical access to the car in the past 36 hours. I don’t think the driver realized how much logging was going on or how detailed it was. He ignored the increasingly prominent visual and audio warnings for roughly 40 seconds. He finally took manual control, then hit a rail, accelerated 42%, and then crashed. There are various articles and tweets on all this for you to self-verify. Hard to argue with raw electronic data.

Too bad; if he’s stupid enough to go to court, he will lose. Another liar.

It was in Europe for the most part they don’t look to profit from their mistakes.

a) Elon Musk @elonmusk, 20h [20 hours ago from 18:19 7/15/2016]: “@DanistopMe only just gained access to physical vehicle”

Elon Musk @elonmusk, 16h [16 hours ago from 18:19 7/15/2016]: “@pfierens Logs were downloaded for NHTSA and NTSB. Identical copies to all.”

b) Related to the PA crash. 11 seconds is a fairly long time. Count one-thousand-one, one-thousand-two … one-thousand-eleven and consider that time as a driver.

A Tesla spokesperson has told Jalopnik:

“We got access to the logs. Data from the vehicle shows that Autosteer was not engaged at the time of this collision. Prior to the collision, Autosteer was in use periodically throughout the approximately 50-minute trip. The most recent such use ended when, approximately 40 seconds prior to the collision, the vehicle did not detect the driver’s hands on the wheel and began a rapidly escalating set of visual and audible alerts to ensure the driver took proper control. When the driver failed to respond to 15 seconds of visual warnings and audible tones, Autosteer began a graceful abort procedure in which the music is muted, the vehicle begins to slow and the driver is instructed both visually and audibly to place their hands on the wheel. Approximately…”
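For readers who want the sequence spelled out, here is a minimal, hypothetical Python sketch (not Tesla’s actual code) of the escalation-then-graceful-abort behavior the spokesperson describes. The timings, messages, and function names are illustrative assumptions only.

# Hypothetical sketch of the hands-off-wheel escalation described above; not Tesla code.
# Assumed timing: ~15 s of warnings before the "graceful abort" begins.

def hands_off_escalation(hands_on_wheel, warning_window_s=15, dt=1.0):
    """Simulate the warning loop. `hands_on_wheel` is a callable returning True/False."""
    elapsed = 0.0
    while not hands_on_wheel():
        if elapsed < warning_window_s:
            print(f"[{elapsed:4.0f}s] visual warning + audible tone: 'Hold steering wheel'")
        else:
            # Graceful abort: mute music, begin slowing, instruct the driver, disengage.
            print(f"[{elapsed:4.0f}s] graceful abort: media muted, vehicle slowing, "
                  "driver instructed visually and audibly; Autosteer disengages")
            return "aborted"
        elapsed += dt
    return "driver resumed control"

# Example: a driver who never responds runs the full sequence to the graceful abort.
print(hands_off_escalation(lambda: False))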

it seems as though tesla is trying to imply that the autopilot feature was sold: caveat emptor.

Hmmm, no. Caveat emptor means “buyer beware”, and is a warning that products may not live up to the claims of advertising or packaging.

Tesla clearly labeling AutoSteer as “Beta”, requiring opt-in to enable it, and including warning screens about the limitations of AutoSteer makes it a case of “use at your own risk”. Sure, it’s a legal position. It’s also a rational and common-sense approach to warning drivers that this isn’t intended to make the car fully self-driving.

Let’s not forget that **every** time you enable AutoSteer/Pilot, it puts up a message telling you to keep your hands on the steering wheel. So a user can’t say they forgot, or didn’t see the original acknowledgment screen (e.g., a spouse saw it instead), which tells them to keep their hands on the wheel.

“no comment” commented:

“…can you imagine general motors deploying a feature in the manner that tesla has deployed the autopilot feature??? i can’t imagine such a thing.”

I can’t either. Tesla is the “young Turk” pushing the technology forward in these disruptive tech revolutions (both the EV revolution and the autonomous car revolution), and GM is one of the old dinosaurs holding things back.

Too bad he’s going to lose.

I think it’s telling that Google tried this experiment and failed: they trained their test employees extensively, and the employees still did amazingly stupid things while the car was driving. They trusted it way too much, even subconsciously. This led Google to abandon their Autopilot-like system in favor of a fully autonomous one, without even a steering wheel.

electric-car-insider.com

Judging by numerous YouTube videos, including one by Elon Musk’s wife Talulah Riley (since taken down), more education is needed. Willful inattention is a different problem than complacency.

Evaluating whether any given driver is able to maintain vigilance and engagement in driving doesn’t require a PET scan. There are eye-tracking technologies developed for driving that accomplish this quite well. Monitoring sharp steering-wheel corrections, or overcorrections, can also work; Mercedes has developed this technology.
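As a rough illustration of the steering-correction idea (not Mercedes’ actual algorithm, and with invented thresholds), a detector could flag sharp corrections that follow long stretches with no steering input:

# Illustrative only: flag a sharp steering jerk that follows a long "quiet" period,
# a crude proxy for driver inattention. Thresholds are arbitrary placeholders.

def overcorrection_events(angles_deg, dt=0.1, rate_limit_dps=90.0, quiet_limit_s=3.0):
    """Return sample indices where a sharp correction follows a long quiet period."""
    events = []
    quiet = 0.0
    for i in range(1, len(angles_deg)):
        rate = abs(angles_deg[i] - angles_deg[i - 1]) / dt   # steering rate, deg/s
        if rate < 2.0:                   # essentially no steering input
            quiet += dt
        else:
            if rate > rate_limit_dps and quiet > quiet_limit_s:
                events.append(i)         # sharp jerk after a long idle stretch
            quiet = 0.0
    return events

# Example: flat steering for 5 s, then an abrupt 12-degree correction in one sample.
trace = [0.0] * 50 + [12.0, 11.0, 10.0]
print(overcorrection_events(trace))      # -> [50]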

Tesla will educate owners, plus it will continue to make Autopilot better and better. It is also probable that people using Autopilot have heard of this fatality and will use the system more responsibly. I believe Elon’s approach is the fastest way to fully autonomous vehicles, which will save many more lives than they take.

Tesla’s approach is a dead end for full autonomy. They are doing only the basic robotic building blocks. An AI design, not a chips-and-code design, is needed.

what does “an ai design” mean? i mean, stuff is ultimately implemented in chips and code.

Good move Tesla. Better to educate owners instead of removing a feature that helps many drivers.

I lean libertarian. If someone wants to risk their life doing something stupid, fine. If a manufacturer wants to sell equipment to enable said stupidity, I have no problem as long as they disclose the risks.

But that’s not our world. If you tell an employee to monitor a hazardous industrial process and give him a comfy leather recliner with vibro-massage and headrest speakers, OSHA will ream you a new one. And no amount of “operator education” or caution signs will save you. Guess what? A driver’s seat is a leather recliner with vibro-massage. Unlike the industrial worker, however, a Tesla driver can kill me. Or my newly-licensed daughter. So forget libertarianism on this one.

I realize Musk builds companies by ignoring naysayers. But this is one time he needs to listen. Google, Ford, Volvo and others who actually test various approaches all agree: letting the car drive until something goes wrong, then requiring the human to take over in an instant, DOES NOT WORK.

Level 2 technology can improve safety by using the opposite of Tesla’s approach. Make the HUMAN drive the car and have the COMPUTER constantly monitor, sounding the alert when it senses danger and even taking…
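A toy sketch of the “guardian angel” division of labor this commenter advocates, with the human steering and the computer only watching; the sensor inputs and time-to-collision thresholds below are made-up assumptions for illustration:

# Toy "guardian angel" monitor: the human drives, the computer watches and intervenes.
# Thresholds and inputs are invented, not any manufacturer's values.

ALERT_TTC_S = 3.0    # warn the driver below this time-to-collision
BRAKE_TTC_S = 1.0    # apply automatic emergency braking below this

def guardian_step(gap_m, closing_speed_mps):
    """Decide what the monitor should do for one sensor sample."""
    if closing_speed_mps <= 0:
        return "monitor"                      # not closing on anything
    ttc = gap_m / closing_speed_mps           # crude constant-speed time to collision
    if ttc < BRAKE_TTC_S:
        return "automatic emergency braking"
    if ttc < ALERT_TTC_S:
        return "audible forward-collision alert"
    return "monitor"

# Example: 40 m gap, closing at 25 m/s -> time to collision 1.6 s, so the driver is alerted.
print(guardian_step(40.0, 25.0))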

It is funny that you believe your post about liberalism somehow proves Tesla is wrong. But what it actually proves is the core failure in Liberalism as a philosophy.

The failure is that when it is you and your children, it becomes “Forget Liberalism”. But when somebody else doesn’t want their children harmed by something you don’t care about, it is back to liberalism, screw them.

What about MY children, whom I want protected from being killed in an entirely avoidable accident, the kind that Autopilot has ALREADY repeatedly prevented? Are you the only person whose children matter?

Liberalism is most certainly not Libertarianism. A few letters makes all the difference.

My opinion is that if you lean libertarian at all, the government should leave Tesla alone. It is very easy to rationally see the endpoint.

David — ah, yes! Thanks for the correction. I definitely typed in the wrong word. More than once. Very embarrassing, but you clearly knew what I was talking about. So that’s good.

that reminds me of the declared-republicans who opposed obamacare until they were personally impacted by a serious health emergency. then they were giving testimony in favor of it.

society doesn’t work if it is all just driven by “me, me, my, my” libertarianism, which inevitably degenerates into social darwinism.

It’s a blatantly obvious legal dodge for Tesla to “require” that you keep your hands on the wheel for a feature designed to steer the car for you.

I’m waiting for someone to explain what possible purpose it serves to have your hands on the wheel during AutoSteer OTHER THAN to provide plausible deniability for Tesla.

The purpose is to keep Tesla’s beta system from killing you.

The problem is that if you use Autosteer as directed, it provides zero benefit. In fact, it increases your workload. With eyes on the road and hands on the wheel, steering is automatic – a subconscious process. Monitoring whether betapilot is working properly requires conscious thought.

This article touches on these issues:
http://www.slate.com/articles/technology/future_tense/2016/07/is_tesla_s_style_of_autopilot_a_bad_idea_volvo_google_and_others_think_so.html

Could the plaintiff (given he’s from Michigan) be GM or other auto-industry staff?

Tesla has the data and should publish it.

Across all those miles:
How many hazards were avoided?
How many warnings were given?
How many times was EAB activated?
Mapping passive data against driver actions, how many times would AP have done a better job?

Mine the data, Elon. Prove Tesla’s claims that AP and EAB drive better than humans.
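If such fleet logs were ever exported as simple event records, the aggregation this comment asks for would be straightforward; everything below, including the field names, is a hypothetical sketch rather than anything Tesla has published:

# Hypothetical aggregation over made-up log records; the record schema is assumed.
from collections import Counter

def summarize_autopilot_logs(events):
    """Count the event types the comment asks about, normalized per million miles."""
    counts = Counter(e["type"] for e in events)
    miles = sum(e.get("miles", 0.0) for e in events if e["type"] == "trip")
    per_million_miles = {}
    if miles:
        for kind, n in counts.items():
            if kind != "trip":
                per_million_miles[kind] = n / (miles / 1e6)
    return counts, per_million_miles

# Example with fabricated records: two trips, one hands-on warning, one AEB activation.
log = [
    {"type": "trip", "miles": 120.0},
    {"type": "trip", "miles": 80.0},
    {"type": "hands_on_warning"},
    {"type": "aeb_activation"},
]
counts, rates = summarize_autopilot_logs(log)
print(counts)   # Counter({'trip': 2, 'hands_on_warning': 1, 'aeb_activation': 1})
print(rates)    # events per million miles (tiny fabricated sample, so the rates are huge)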

not to worry, it will all come out in the inevitable lawsuits that are sure to follow.

The future is grim for self-driving cars: there will be more idiots, with ever better ideas of how to misuse Autopilot, but only Tesla is the focus. Texting and driving is among the biggest causes of road deaths, yet no one is talking about banning cell phones in cars. It’s hard to believe that the progress of technology depends on the imagination of idiots.