Tesla Sending Mixed Message On Autopilot Safety



Tesla Model S Goes Cross Country On Autopilot – Image Credit: Alex Roy

It seems Tesla Motors is confusing consumers when it comes to Autopilot capabilities and safety. The company will need to be increasingly clear about its self-driving Autopilot Mode in the future, following a fatal crash that may be blamed on the technology.

Fatal Crash Involving Tesla Model S In Autopilot

While Tesla continues to defend itself and tell media and consumers that the feature is not a replacement for human drivers, other comments, explanations, and videos may be inadvertently making consumers a bit too comfortable.

Chris Urmson, director of Google’s self-driving car program, said:

“Human drivers can’t always be trusted to dip in and out of the task of driving when the car is encouraging them to sit back and relax.”

The recent Autopilot crash in Florida is still being investigated, but no information has been publicized verifying that the driver was distracted or “hands-free”. It hasn’t been determined whether Autopilot was at fault, and that may never be agreed upon, but a DVD player and a laptop were found in the vehicle. The truck driver who hit the Model S said a movie was playing, but Florida investigators said the units were not powered up when police arrived, and a determination of whether they were active at the time of the crash could not be made.

In the end, it all comes down to whether or not drivers have a false sense of safety and will inherently make bad, “lazy” choices, coupled with who to blame for the accident legally. In Florida, Tesla is probably in pretty good shape legally, based on current laws. Since the government and insurance companies have not yet set parameters, and likely won’t for some time, the situation remains unclear.

According to Tesla CEO Elon Musk, the semi-autonomous, self-driving technology is a “hands on” system. When a driver turns on the Autopilot function, the dashboard indicates:

“Please keep your hands on the wheel. Be prepared to take over at any time.”

However, even though Musk has said that the company is being very cautious, he has also said that the system is about twice as good as a human. Both comments may be true, but people see the statistics and may think it means that the system is foolproof. Some people believe everything that they see and hear on the Internet and TV, and unfortunately, people tend to be lazy.

A popular video on Instagram by Musk’s ex-wife, Talulah Riley, shows her and a friend driving on a busy highway, hands free. Riley even has her eyes closed (note: it has since been removed).

While Tesla’s self-driving technology is likely the best on the road thus far, and has been praised by many, automakers must be very clear about what such systems can and cannot do. Audi spokesman Brad Stertz said:

“Kudos to Tesla for bringing out the system but you also need to be responsible and clear about what the technology is capable of doing.”

Tesla has used over-the-air updates to regularly adjust the Autopilot features for better safety. There are now speed restrictions, and visual and audible warnings. The car is aware when the driver is not “engaged”. University of Florida law professor Lars Noah explained:

“It sounds like (Tesla) they did a fairly good job of designing into the mechanism prompts and reminders about what the deal was, and for whatever reason, this guy was not paying attention.”

Source: Autonews




36 Comments on "Tesla Sending Mixed Message On Autopilot Safety"


No mixed message, no clarity needed. The law is quite simple: if you are behind the wheel of the car, you are in control of the car. If you run into something, it’s your fault.

Daniel, did you actually read the article? I totally agree with you that the driver is responsible, but to say there’s no mixed message (whether from Tesla or 3rd parties like Talulah Riley) is to have completely missed the point.

This wasn’t a helpful or well-conceived post. Tesla has NEVER promoted Autopilot as a system that allows you to totally relieve yourself of driving duties. People who post stupid-ass videos of themselves ignoring the road to show off how great Autopilot is are damaging to the brand and its reputation. It’s important to note, however, that even Talulah Riley is not a spokesperson for the company, so her actions shouldn’t be Tesla’s problem. The car is designed to be as safe as possible WHEN YOU FOLLOW DIRECTIONS on how it’s to be used. Autopilot is safe – but sometimes road conditions require human intervention and drivers should be there to guide the vehicle safely at those times. It’s not a mixed message, so I’ll paraphrase it: “Autopilot has better reaction time and awareness than humans when it’s working, but sometimes it doesn’t work and requires a human to intervene. Don’t be stupid, pay attention and be ready to take over”. See? No mixed messages there.

Sorry but you are totally and completely off base. Tesla has most definitely conveyed the idea that its Level 2 system is a hands off Level 3 system. If this is not the case, then why can you remove your hands from the wheel?

Ridiculous to say it’s a hands on system and then allow people to remove their hands. Simple solution would be to do what MB does with Drive Pilot. If you take your hands off the wheel you get a warning. If you don’t heed the warning the car stops. End of problem.

In order to pretend its technology is better than what many others have, Tesla has been willing to hype its Level 2 system as a Level 3 system, thus endangering its customers and, more importantly, other drivers on the road.

Musk may claim that “the system is about twice as good as a human”, but, at least if you measure only by fatalities, that is demonstrably false:

1. Teslas driven with Autopilot on have logged 130 million miles with one traffic fatality.
2. However, Teslas driven with Autopilot off have logged over *two billion miles* with the same number of traffic fatalities.

That’s a 15x difference in miles driven per fatality. Humans driving Teslas produce fewer deaths per mile than Autopilots driving Teslas.
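The arithmetic behind that 15x figure can be checked directly. A minimal sketch, using the mileage and fatality counts exactly as claimed in the comment above (not independently verified):

```python
# Figures as claimed in the comment above; not independently verified.
ap_miles, ap_deaths = 130_000_000, 1            # miles logged with Autopilot on
non_ap_miles, non_ap_deaths = 2_000_000_000, 1  # miles logged with Autopilot off

miles_per_death_ap = ap_miles / ap_deaths
miles_per_death_non_ap = non_ap_miles / non_ap_deaths

print(round(miles_per_death_non_ap / miles_per_death_ap, 1))  # → 15.4
```

The division itself checks out, given those inputs; whether a comparison built on one fatality in each bucket means anything is the question the replies below take up.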

Unfortunately there’s been quite a bit more than one non-Autopilot death. There’s been several that have made the news (3+) and we certainly don’t hear about all of them, just like how we wouldn’t have heard about Joshua Brown if it wasn’t for the Autopilot connection.

When a comment comes from someone suffering from a long term case of TES (Tesla Envy Syndrome), apparently caused by his short investment in Tesla stock, there’s really no need to bother reading the comment. We know it will be entirely composed of FUD and B.S.

A sample size of 1 isn’t significant, and only a FUDster or an idiot would form a conclusion about the whole based on that sample size.

Model S has been on the road for 4 years. Autopilot has only been activated for a year. Of course there’s a significantly higher amount of non-Autopilot miles. Especially when you consider the fact that Autopilot is meant primarily for highway driving, so your in-town driving is going to add significantly to the non-AP driving ratio as well. The point you’re trying to make is completely invalid.

It makes no sense to try and make safety stats with one death. You need a large sample size in order to do that.

Note that Tesla does not give figures about Autopilot safety statistics (because it will take years of accident data to obtain any meaningful stats) but about the Autopilot’s performance, which is completely different.

Performance is not about safety! It’s about how well the car stays within the requested lane and/or maintains the requested distance to the car ahead.

You guys are misunderstanding the notion of statistical sampling.

There are plenty of miles on the car in both modes to get an idea of deaths per mile. If they racked up a trillion miles and no deaths, would you be claiming there aren’t enough deaths to conclude if it is safe? LOL

The only mistake in his analysis is not knowing the true number of fatalities in each case, which would allow a meaningful comparison.

Also, to the other post above, can we stop claiming someone is a short seller every time they don’t blindly ignore negative details about Tesla? It really hurts your credibility, not theirs.

Using the miles/death ratio of Autopilot is absurd!

It matters very little how many million miles have been driven; it only takes one single event to transform Autopilot from being the safest driver (zero fatalities) to barely better than humans (about 50% more miles per death).

If tomorrow, out of bad luck, a tire bursts while driving under Autopilot and the car spins out of control and kills a group of pedestrians at a traffic light, the car would suddenly get the worst miles/death statistic.

This should be a red flag telling you that that miles/death ratio cannot be used as a reliable statistic, and it cannot be used as a proxy for the safety of autopilot.
It cannot be used to tell autopilot is safe.
It cannot be used to tell autopilot is unsafe.

But it definitely will fool lots of people because most people don’t understand how statistics work.
And every ad agency uses and abuses that fact for publicity.

ClarksonCote said:

“…can we stop claiming someone is a short seller every time they don’t blindly ignore negative details about Tesla?”

No, it’s not possible to stop something one hasn’t started.

And if you don’t realize that “Four Electrics” is one of the handful of resident perpetual Tesla FUDsters on InsideEVs, then you haven’t been paying attention.

Of course that’s not proof he’s a TSLA short-seller. There are other motives for spending a lot of time posting FUD about Tesla; perhaps he’s a Big Oil shill, or perhaps he’s politically motivated. But when someone makes Tesla bashing posts indistinguishable from those posted to the stock investor site Seeking Alpha, as “Four Electrics” does, then Occam’s Razor points in the direction of him being just another short-seller.

One thing is certain: He’s not giving his honest opinions.

“It really hurts your credibility, not theirs.”

Hmmm, well, it might be interesting to take a poll of the readership to see what they think of your credibility, after that comment.


“You guys are misunderstanding the notion of statistical sampling.

“There are plenty of miles on the car in both modes to get an idea of deaths per mile.”

Not at all; it’s you who doesn’t understand statistical analysis. That’s unfortunately one of several subjects regarding which the average person thinks he understands much better than he actually does.

It’s easy to show your fallacy. If there were two deaths instead of one, that would be a 100% increase in the rate just from that one data point! And that’s why a sample size of one (1) is very far from statistically valid.
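The uncertainty in a rate estimated from a single event can be made concrete. Below is a sketch (my illustration, not from the thread) of the exact Garwood 95% Poisson interval for one observed fatality, using only the standard library:

```python
import math

def poisson_upper_95(k):
    """Upper 95% (Garwood) bound: smallest lam with P(X <= k) <= 0.025."""
    def cdf(lam):
        # P(X <= k) for a Poisson(lam) variable
        return sum(math.exp(-lam) * lam**i / math.factorial(i)
                   for i in range(k + 1))
    lo, hi = 0.0, 100.0
    for _ in range(200):            # bisection: cdf is decreasing in lam
        mid = (lo + hi) / 2
        if cdf(mid) > 0.025:
            lo = mid
        else:
            hi = mid
    return hi

# For k = 1 observed fatality, the 95% interval for the expected count:
lower = -math.log(0.975)            # solves P(X >= 1 | lam) = 0.025
upper = poisson_upper_95(1)

print(round(lower, 3), round(upper, 2))  # → 0.025 5.57
```

Spread over 130 million Autopilot miles, an expected count anywhere between ~0.025 and ~5.57 is consistent with the one observed death – i.e., anything from roughly one fatality per 23 million miles to one per 5 billion miles. That range easily contains the claimed non-Autopilot figure, which is the sense in which a sample of one supports no conclusion either way.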

Although the various idiotic “Look Ma, no hands!” videos posted in various places on the Internet, showing Tesla car drivers grossly misusing the AutoSteer system, are not helping… At the same time, I think this is blaming the symptom rather than the cause; reversing the causal relationship. The reason that people are tempted to allow Tesla AutoSteer to take over driving more or less completely is because the system makes it possible. And that’s also why those people posting those stupid videos are doing so. If AutoSteer didn’t make that so easy, then people wouldn’t be doing it. While I applaud Tesla for moving boldly forward and deploying semi-autonomous driving features in their cars, at the same time I call on Tesla Motors to place more hard limits on the use of such features where they were clearly not intended to be used. If Tesla has the ability to limit use of AutoSteer to only divided highways, then it should do so, and not continue to allow so many drivers to use AutoSteer even on busy streets with two-way traffic. Also, Tesla needs to make sure drivers are alert and paying attention by forcing drivers to take the wheel periodically.… Read more »

You mean like this guy?


> While Tesla continues to defend itself and tell media and consumers that the feature is not a replacement for human drivers, other comments, explanations, and videos may be inadvertently making consumers a bit too comfortable.

That’s not Tesla’s fault, is it? Plenty of videos for other vehicle makes show people abusing their lane assist features (including sticking a can to the steering wheel, etc).

As far as I can remember, Tesla has always said that you need to keep paying attention. The manual states you should keep your hands on the wheel.

Tesla employees say the same thing, while the top engineers of AutoPilot have said the same thing during conferences/presentations.

This article does not make sense at all, and the misleading headline makes it look like another desperate attempt to ride the Tesla wave for more views.

I’m not asking for this site to be a Tesla cheerleader, but there are enough Koch-powered websites that insist on spreading misinformation. I expect better from InsideEVs, and at least stop resorting to these clickbait titles.

So Tesla offers a system – AutoSteer – where the car will steer left and right (to stay in your lane around highway curves) for you… but you are “supposed” to keep your hands on the wheel at all times.

This is an obvious legal dodge. It is no different than offering Adaptive Cruise Control but telling the driver that they must also have their foot covering the brake at all times when they use it.

It is an incredibly misleading technology that is frankly dangerous, and Tesla’s inclusion of “hands on the wheel at all times” fine print as an attempt for indemnity does not change this.

Exactly my thought.

Spider-Dan said:

“It is an incredibly misleading technology that is frankly dangerous, and Tesla’s inclusion of ‘hands on the wheel at all times’ fine print as an attempt for indemnity does not change this.”

I think this is a subject upon which reasonable people can have quite different opinions, but I think you’ve gone beyond mere opinion here, asserting something that’s factually incorrect.

By default, AutoSteer and other “Autopilot” features are turned OFF when the car is delivered. If the driver wants to activate AutoSteer, and some (most? all?) of the other “Autopilot” features, then he has to do so by choosing that selection on the car’s screen. In doing so, he is presented with a screen which requires him to verify that he understands he is required to maintain control of the car at all times, and that the driver is legally liable for any accidents.

This is rather far from a mere legal disclaimer buried in the fine print in the owner’s manual!

Now, that doesn’t mean I’m arguing that Tesla has completely absolved itself of all responsibility here. But Tesla does at least make its official position on the matter very, very clear.

Well, yes, you have to turn AutoSteer on to use it; I wouldn’t really call that a game-changing observation.

My point is that requiring the driver to keep their hands on the wheel to use a feature that steers left and right for you is a rather transparent legal dodge. It is there solely so Tesla can say, “Look, the driver is technically in control at all times!”

But maybe I’m being unfair. What would you say is the logical reason behind the requirement to keep your hands on the wheel at all times? You don’t need to steer, and you can pay just as much attention to the road with your hands off the wheel as with them on. So why are Tesla drivers supposed to keep their hands on the wheel when the car is steering itself?

This is the result of just plain stupidity. A Model S crashes in the Netherlands without Autopilot and the news is all over the place. People are not willing to take the blame for anything they do. All this media coverage is just plain disgusting. Why don’t we hear about all the other accidents that happen in other cars? Because it doesn’t increase their ad revenue.

Autopilot is the Takata airbag of autonomy. Like a Takata airbag, autopilot is supposed to make driving safer but, as implemented, makes driving less safe.

If you want to see how autonomy should be implemented, check out how the autonomous systems from Volvo, MB, Audi and a number of other manufacturers make driving safer.

In addition there are questions about the basic technology. AFAIK there has never been a case where a vehicle using adaptive cruise control from any other manufacturer has run into a tractor trailer on a straight road. A $20K Honda Civic with Honda Sensing technology wouldn’t make this mistake.

I don’t see the point in blaming the driver when the technology clearly failed.

And this, children, is how you make a statement that is 100% incorrect.

DonC said: “I don’t see the point in blaming the driver when the technology clearly failed.”

And I don’t see how you can take yourself seriously, much less expect anyone else to do so, when you make wildly biased and inflammatory statements before having the benefit of any actual facts.

It’s not yet known how much blame rests with the driver, and how much rests with the vehicle’s systems – but that will never stop some folks from fabricating their own version of reality to support their out-to-lunch opinions.

Well Don, according to Motor Trend and various other tests of the various OEM’s level 2 driver assist systems, Tesla is actually the best!


However, to serial Tesla-haters like yourself, no straw is too short to grasp when spreading anti-Tesla FUD.

In fact, your incessant FUD spreading here reminds me of the “oh my god there has been another battery fire in one of those EVs crowd”.

I don’t think hands on is that important in the first place and is a false indication of paying attention.

Presuming the driver was watching a movie, requiring him to keep his hands on the steering wheel would not have prevented that!

What’s the benefit of auto-steer?

Cruise control lets me keep my eyes on the road instead of constantly looking at the speedometer to make sure I don’t get a ticket. It also lets me move my right foot (within reason) for improved comfort.

Emergency braking systems, lane departure warnings, etc. have obvious benefits. But auto-steer? If I still have to keep my hands on the wheel and my eyes on the road, then what’s the point?

This is an honest question. Help me understand.

If auto-steer is part of a Level 3 system, where the car is capable of dealing with emergency situations, then it allows the “driver” to relax and it provides a safer ride.

It has no benefit as part of a Level 2 system where the car is not prepared to deal with emergencies. In this case it can compromise safety by giving the driver a false sense of security.

The automation level that can deal with emergencies without human interaction is actually Level 4, not Level 3. In fact, that’s the distinction between Level 3 and Level 4: whether or not it can deal with emergencies on its own.

Just my personal opinion, and based solely on what I’ve read and videos I’ve seen, with no personal experience driving a car with Tesla AutoSteer:

I think any benefits AutoSteer offers drivers of Tesla cars are marginal, at best… at least if used as Tesla instructs. The primary benefit seems to be to Tesla: gathering the data it needs to advance the system towards full autonomy.

Or to put it another way: I think AutoSteer doesn’t help much (if at all) in its current form, but by deploying it in this early form, Tesla gains the experience and data it needs to advance the system. Hopefully this will enable faster development of a more advanced version that will be useful for the average driver.

But I seriously doubt Tesla spokesmen would agree. From Tesla’s statements on the matter, they’re already claiming that using AutoSteer makes driving safer.

And maybe they are right. After all, they have the data… and I don’t. I can only go by what my common sense and personal driving experience (with non-autonomous cars) tell me; and sometimes common sense is wrong.

Good job GRA in using/quoting Consumerwatchdog.org, which is nothing more than an astroturfing-for-hire organization:


As I agree with the points they’ve raised (and had previously raised most of them myself elsewhere), who cares if they’re for hire, or who’s doing it? Not me.

This certainly brings up the question of who is paying the known for-hire astroturfing organization to throw hatchets at Tesla?

And how much of the negative anti-Tesla stuff we see in the press originates from the same source?

Confusing? It depends.

Do you think that “Cruise Control” will Control your RV while you go to the back and make a sandwich while Cruising down the highway?

Then you might be easily confused about what AutoPilot does.

(NOTE: Some people may be misinformed that Commercial Airplane Pilots watch videos or take naps while the plane is on auto-pilot. That is a false belief. Pilots are at the controls and alert and scanning the skies, even when auto-pilot is engaged.)

Nix said:

“Pilots are at the controls and alert and scanning the skies, even when auto-pilot is engaged.”

Even if no commercial airline pilot ever takes his eyes off scanning the skies to share a story with his copilot, or to talk to the stewardess bringing him coffee, which I rather doubt…

Even if that were true, the question isn’t how professional pilots actually use an airplane’s autopilot in real life. The question is what the word “Autopilot” suggests to the average Tesla car driver.