Watch Tesla Model 3 Navigate on Autopilot Almost Crash Before Off-Ramp

NOV 15 2018 BY MARK KANE

When it happens, you have just a moment to react.

Consumer Reports rightly noted that Tesla’s Navigate on Autopilot feature requires improvement, because currently it demands even more attention than using Autopilot without navigation.

One Tesla Model 3 driver recently shared a video of a dangerous situation in which Navigate on Autopilot mistook a small emergency pull-off area just before an off-ramp for the off-ramp itself. The driver had to take control as the car turned towards the barriers.

“I was driving southbound on Lake Shore Drive using Navigation on Autopilot on my Tesla Model 3 on Nov. 13, 2018. Based on my route, Navigation on Autopilot, which is Tesla’s most advanced driving feature yet, was about to automatically activate the right turn signal and exit the highway off-ramp. But shortly before the off-ramp, there was a small area of highway for people to pull over in case of emergency. Navigation on Autopilot, which requires Enhanced Autopilot and was engaged while I was on the highway, misinterpreted this.

Thinking that emergency area was the off-ramp, which was right before the actual Belmont off-ramp, Navigation on Autopilot quickly swerved into there while I was driving 45 mph. Holding the wheel already, I immediately disengaged Navigation on Autopilot and veered left to avoid hitting the curb head on.

Software version: 2018.42.4 ccb9715”

Categories: Tesla, Videos


89 Comments on "Watch Tesla Model 3 Navigate on Autopilot Almost Crash Before Off-Ramp"


First get off the phone, otherwise it’s an important edged case.

He was using Bluetooth; the call was routed through the stereo.

Hands-free phone use is still distracted driving. The improvement of being hands-free doesn’t help nearly as much as people think.

See “Hands Free is Not Risk Free”

https://www.nsc.org/road-safety/tools-resources/infographics/hands-free-is-not-risk-free

Seriously, 31 down-votes? I guess a lot of people don’t like to face the truth! 😯

Maybe you don’t have to defend everything Tesla or its cars do. Just a thought….

Stop making excuses for a Beta SW that has bugs.

Yet another case of “Blame Autopilot!”

http://blameautopilot.com/

Stop making excuses for those people who engage in distracted driving, by trying to blame Autopilot.

Dude, the guy recovered and took control. If he had crashed, then you would have a point.

Wow! You are like the Sarah Huckabee Sanders for Elon Musk!

You’re a bonehead! I suppose he shouldn’t talk to a passenger in a car or read a billboard either?

You didn’t bother to read the article that I linked to, did you?

As they say: “You can lead a horse to water, but you can’t make him drink!”

Before I scrolled down to the Comments section, I knew Pu-Pu would be coming to the defense of Autopilot and blaming the driver. The ‘Sarah Huckabee Sanders’ characterization by another poster here is brilliant and right on the mark.

I’ll have to suffer under the terrible burden of not ignoring reality, and actually being right. 🙄

Pushmi is 100% correct that talking on a phone, hands-free or not, is a serious distraction which leads to many accidents. Talking to a person in the passenger seat is much less distracting, for reasons not entirely understood.

This fact does not excuse EAP’s extremely poor performance, however. Tesla, and especially Musk, deserve every ounce of blame they get for Autopilot.

The entire point of Autopilot is so that drivers don’t have to be as attentive as when driving by themselves!

It’s “edge” not “edged”

Good bug report for beta software! Lake Shore Drive isn’t a highway, it’s a zoo through the city haha.

Those Tesla developers, always with the jokes!
Wow. Nice save though.
Robot version 1.0 is going to be interesting.

Strange, I thought the cars always correlated their position with satellite maps. That bus pull-off has likely been there for years…..

The configuration of Lake Shore Drive in Chicago changes quite frequently. During rush hours, traffic is routed into opposing lanes using only safety cones. As you can see, there is no shoulder on either side and the lanes are extremely narrow. The speed limit changes seasonally. Insane/moronic Uber drivers stop on the highway to allow passengers to hop the guardrails at North Ave beach. Nobody should trust any dynamic steering system on this road.

Tesla doesn’t use geofenced highly mapped routes. Just onboard sensors.

Not true for the new navigate on autopilot feature – it only works on specific roads. Other autopilot features also behave differently depending on whether or not you are on a divided highway. They have been getting more and more willing to support autopilot lane changes on surface streets.

What possibly happened is that the car was expecting to be in the lane furthest to the right. Once it saw the empty lane to the right, it assumed it had been in error up to that point and immediately tried to correct itself. If anything should be done, it’s to add open areas such as this one to a list of areas not to drive on unless parking.
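If that theory is right, the failure can be sketched in a few lines. This is a toy illustration only, with invented function names, an invented lane-counting convention, and invented zone IDs; it is not Tesla’s actual code:

```python
# Toy sketch of the hypothesized failure: the planner believes the route
# wants the rightmost lane, and "corrects" rightward as soon as a gap in
# the right-hand line makes the emergency pull-off look like a new lane.
# Everything here (names, convention, zone IDs) is invented for illustration.

def pick_maneuver(route_wants_rightmost: bool, lanes_seen_to_right: int) -> str:
    """Decide the lateral maneuver from the planner's current belief."""
    if route_wants_rightmost and lanes_seen_to_right > 0:
        # A "lane" appeared on the right (really a pull-off), so the planner
        # concludes it is one lane too far left and moves right.
        return "move_right"
    return "hold_lane"

# The fix suggested above: a list of areas that look like lanes but must
# never be driven on unless parking.
NO_DRIVE_ZONES = {"lsd_sb_belmont_pulloff"}  # hypothetical zone ID

def safe_maneuver(zone_id: str, route_wants_rightmost: bool,
                  lanes_seen_to_right: int) -> str:
    if zone_id in NO_DRIVE_ZONES:
        return "hold_lane"  # ignore phantom lanes inside flagged areas
    return pick_maneuver(route_wants_rightmost, lanes_seen_to_right)

print(pick_maneuver(True, 0))                            # 'hold_lane' on a normal stretch
print(pick_maneuver(True, 1))                            # 'move_right' at the pull-off
print(safe_maneuver("lsd_sb_belmont_pulloff", True, 1))  # 'hold_lane'
```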

A human driver on this unfamiliar road would have been confused as well. It is in a place where the offramp should be and looks like an offramp until you see the other curb.

A human driver would have seen the wall at the end of the emergency pullout and not gone by the lines on the road.

We have no way to tell if the car would have done the same.

What?! Didn’t you see the video? The car entered the lane!

But the car also could have pulled out on its own. We’ll never know, because the driver took control before that.

You seem to have missed the “until you see the other curb” bit.

You put too much faith in other drivers. Not everyone is on the good end of the bell curve.

The autopilot often sees obstructions as clear paths. It’s how it works, and why it collides with objects it perceives as open lanes.

More to the point, it ignores many or most stationary obstacles when the car is moving at highway speed.

That’s why drivers using Tesla Autosteer are advised to remain alert and be ready to take over at any time. Which means not practicing distracted driving such as making a Bluetooth phone call.

See “Hands Free is Not Risk Free”

https://www.nsc.org/road-safety/tools-resources/infographics/hands-free-is-not-risk-free

“it ignores many or most stationary obstacles”

Which is why it should not be released on end-user cars.
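For anyone wondering how a system can “ignore” obstacles at all: automotive radar measures relative speed via the Doppler effect, and returns whose speed over ground is near zero look just like roadside clutter (signs, barriers, overpasses), so they are commonly filtered out. A simplified sketch of that logic, with invented names and an invented threshold:

```python
# Simplified sketch of why radar-based systems discard stationary returns.
# A Doppler radar reports each target's speed relative to the car; adding
# our own speed gives its speed over ground. Near-zero ground speed is
# indistinguishable from roadside clutter (signs, barriers, overpasses),
# so such returns are commonly filtered out. Names/threshold are invented.

def is_tracked(relative_speed_mps: float, ego_speed_mps: float,
               clutter_threshold_mps: float = 1.0) -> bool:
    ground_speed = ego_speed_mps + relative_speed_mps
    return abs(ground_speed) > clutter_threshold_mps

# A car stopped in our lane while we travel at 29 m/s (~65 mph):
print(is_tracked(relative_speed_mps=-29.0, ego_speed_mps=29.0))  # False: filtered out
# A moving car ahead doing 20 m/s:
print(is_tracked(relative_speed_mps=-9.0, ego_speed_mps=29.0))   # True: tracked
```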

I agree; when I first saw this clip, I would also have thought this was the off-ramp. Autopilot should be able to handle the situation better than me, as I don’t have an integrated GPS.
Luckily, the AP can be updated with information about this danger zone. Glad nobody was hurt.

I’d agree that American roads are NOT ready for Autopilot, much less human drivers. They suffer from disrepair and from poor engineering and design. Glad to see that Cadillac restricts its Super Cruise feature the way it does, only trusting the best roads that have been meticulously mapped and tested. Until the roads are improved, Autopilot will never be perfect in situations like this.

If roads need to be “ready for autopilot” then autopilot will never be ready for roads.

Roads were upgraded with lane markings, speed limit and warning signs, stop signs and stop lights, during the motorcar revolution, all specifically to make roads function better for motorcar traffic.

Roads certainly will be upgraded during the autonomous car revolution, to make roads function better for autonomous cars. It’s only a matter of time until that starts happening.

In Europe, we have started removing road markings because it increases drivers’ attention.

If this was an emergency area, it should have been marked with a continuous line, not the dotted one.
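A continuous line would also give the software something to work with: classifying a boundary as solid vs. dashed from the gaps between painted segments is straightforward, and a solid boundary could be treated as uncrossable. A rough sketch of such a classifier (illustrative only; names and the threshold are invented):

```python
# Sketch: classify a lane boundary as solid or dashed from the gaps between
# detected paint segments along the road. A solid boundary could then be
# treated as "do not cross". Illustrative only; names/threshold invented.

def boundary_type(segment_spans: list[tuple[float, float]],
                  max_gap_m: float = 1.0) -> str:
    """segment_spans: sorted (start_m, end_m) stretches of detected paint."""
    for (_, end), (next_start, _) in zip(segment_spans, segment_spans[1:]):
        if next_start - end > max_gap_m:
            return "dashed"   # large gaps: crossing is permitted
    return "solid"            # continuous paint: do not cross

print(boundary_type([(0.0, 3.0), (9.0, 12.0), (18.0, 21.0)]))  # 'dashed'
print(boundary_type([(0.0, 30.0)]))                            # 'solid'
```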

Why is he using Autopilot on these weird city streets in the 1st place??? It’s still only for regular highway use.

What are you talking about?! The crash that almost happened was on the freeway! 3 lanes and a center wall separating the opposing traffic. What more do you want?

Lake Shore Dr has stop lights. It’s not a traditional highway. Hell, it has Drive right in the name!

I would NEVER engage autopilot on Lake Shore.

A guy here was told the same thing by Tesla salesmen and actually believed it would be a good idea to commute long hours on Autopilot.

He ended up disabled after crashing into a car stopped on the highway, the moment he relaxed and took his eyes off the road for a second.
https://www.wired.com/story/tesla-autopilot-crash-lawsuit-florida-shawn-hudson/

Autopilot is no safer than any cruise control driving on its own. Except that older cruise control and its marketing don’t create a false sense of security and don’t actively try to kill you, leaving only a fraction of a second to correct at the last moment.

The highway is actually the worst, because you get the impression that the car drives itself and you lose attention, but it can’t reliably handle the classic “cut-out” situation (when the car in front suddenly drives around an obstacle), nor can it reliably perform emergency braking for stationary objects at highway speeds.

“Autopilot is no safer than any cruise control driving on its own.”

Complete B.S. All the anti-Tesla FUD in the world can’t refute the fact — that’s fact, not “fake news” — that Tesla cars with Autopilot+Autosteer installed have a 40% lower accident rate than Tesla cars without that combo.

Autopilot + Autosteer isn’t perfect, but it’s certainly better, and in many conditions much safer, than driving without the combo. However, it’s better if the driver doesn’t allow his mind to wander away from driving, and considers whether or not it would be better to shut off Autosteer in some driving conditions… such as the one described here.

“He would relax during the long ride, checking his phone and sending emails.”
“Hudson says that at the time of the impact, he was looking at his phone.”

I’m just happy this moron hit a disabled car that was empty vs. one still full of innocent people.

I wouldn’t mind so much if Tesla only endangered their own customers.

Pushmi said – “Tesla cars with Autopilot+Autosteer installed have a 40% lower accident rate than Tesla cars without that combo.”

AP hardware added AEB, which by itself shows a 40% reduction in other cars. There is zero evidence Autosteer provides any net benefit. Other high-end sedans have lower fatality rates than Tesla.

This is what Tesla said about Q3 2018:
-Over the past quarter, we’ve registered one accident or crash-like event for every 3.34 million miles driven in which drivers had Autopilot engaged.
-For those driving without Autopilot, we registered one accident or crash-like event for every 1.92 million miles driven.

There is nothing there about the second group lacking AEB hardware. For two years, every Tesla made has included AEB. So your statement is a lie.

Not only that, but studies have shown that AEB reduces rear-end accidents by 40-50%, yet total accidents by only 5-15%. So your statistics claim is a lie as well.

Finally, your claim of Tesla endangering others is also a lie, as their crash rate is far lower than average. Fatality rates have huge statistical noise, which is why they have confidence intervals. None of those intervals are tight enough to claim significant superiority over Tesla (at least not yet).
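For what it’s worth, both sets of numbers in this exchange can be checked with simple arithmetic (the rear-end share of total crashes below is an assumed figure, used only for illustration):

```python
# Checking the numbers in this thread with simple arithmetic.

# Tesla Q3 2018: one crash-like event per 3.34M miles with Autopilot engaged,
# one per 1.92M miles without. Relative reduction in events per mile:
print(f"{1 - 1.92 / 3.34:.1%}")  # ~42.5%, the source of the oft-quoted ~40%

# AEB: a 40-50% cut in rear-end crashes moves total crashes only modestly.
# The rear-end share of all crashes below is an assumed figure (25-30%),
# used purely for illustration:
for share in (0.25, 0.30):
    for cut in (0.40, 0.50):
        print(f"rear-end share {share:.0%}, AEB cut {cut:.0%} "
              f"-> total reduction {share * cut:.0%}")
# Yields roughly 10-15% of total crashes, in line with the 5-15% cited above.
```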

In all fairness, he didn’t give Autopilot a chance to correct itself. He took back control right before Autopilot would have steered back to safety.

You are joking, right? Are you volunteering for a test?

Send me a Model 3… I’ll give it a whirl.

He did his job as a driver: paying attention.

Not the best way to put it (I don’t think he should have given Autopilot “a chance”); but I agree that an actual crash would have been very unlikely even without corrective action, contrary to what the sensationalist headline suggests…

An actual crash was quite likely. AP generally ignores stationary objects such as the light pole and guardrail, so braking was unlikely. It would have tried to follow the white line on the right, but the angle of that line was not designed to be navigated at 45 mph.
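A back-of-the-envelope check supports that: steering along the pull-off line at speed implies a lateral acceleration of v²/r, and plausible turn radii for that gore area give uncomfortable to undrivable numbers (the radii below are assumptions; the real geometry isn’t known):

```python
# Back-of-the-envelope check: following the pull-off line at speed implies
# a lateral acceleration of v**2 / r. The radii are assumptions; the real
# geometry of the gore area is not known.
v = 45 * 0.44704                 # 45 mph in m/s (~20.1 m/s)
for r in (40.0, 80.0, 150.0):    # assumed effective turn radii in meters
    print(f"r = {r:.0f} m -> {v**2 / r:.1f} m/s^2")
# 40 m  -> ~10.1 m/s^2 (about 1 g, near the limit of tire grip)
# 150 m -> ~2.7 m/s^2  (typical comfortable highway maneuvering)
```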

If the car followed that line in, why do you preclude the car from following the line out?

Just like the Cosmonauts before Yuri Gagarin, just not patriotic enough to hold their breath.

The real lesson here is that distracted driving is unsafe, and that includes use of a hands-free phone. It’s still distracted driving.

https://www.nsc.org/road-safety/tools-resources/infographics/hands-free-is-not-risk-free

* * * * *

There are always going to be areas where the delineation of traffic lanes and/or off-ramps is unclear.

Part of the motorcar revolution was changing our roads to accommodate motor vehicles by painting stripes on them, installing road signs, and stop signs and traffic lights. As part of the EV revolution, I expect someday confusing areas like this, and road construction zones, will have markers and/or short-range transmitters placed or installed to point self-driving cars in the right direction.

Distracted driving? The Autopilot made a bad move and he quickly took back control. He didn’t seem distracted at all to me…

Give it a break with your soapbox dude. He was extremely alert and proved that with his corrective actions. How is what he was doing any different than talking to a passenger in your car? Or do you consider that distracted driving and refrain from it as well? You’re out of touch with reality and must live a miserable life standing on your soapbox and preaching to those who are laughing at you.

He was making a phone call. That’s distracted driving, and that’s a fact. Now, why don’t you actually read the article that I linked to, rather than continue to expose your ignorance on the subject of distracted driving.

You and the other 34 people who down-voted my very relevant and very appropriate comment. Good grief!

See “Hands Free is Not Risk Free”

https://www.nsc.org/road-safety/tools-resources/infographics/hands-free-is-not-risk-free

“How is what he was doing any different than talking to a passenger in your car?”

Controlled experiments conclude talking on a phone is more distracting than talking to a passenger. Meta-analysis of traffic accidents shows conversations with passengers contribute to more wrecks, but that may be because they are more common.

Crash on Autopilot…wow!

He didn’t crash.

It was a name change suggestion.

And… still beta.

Everything at Tesla is in Beta.

Consumer Reports has a knack for making us all feel like crash dummies.

I didn’t buy Autopilot. It’s not fully sorted. Even so, I was delighted when my M3 received an over-the-air free trial. I’ve been using ACC, which works great and comes as standard equipment, as it should. Comparing Autopilot and Navigate on Autopilot has been fun and interesting. If you bought Autopilot, hopefully you read up on it. And in doing your research you realize Autopilot is a work in progress. Its cameras follow painted lane lines and sometimes prominent curbs. You are outright irresponsible if you use it on secondary roads or in town. As it stands, it’s freeway/highway only. Users should soon realize the center lanes are best. Problems occur when a right-side lane line has a break, such as an off-ramp or intersection. Another stumbling block for the camera is when the line temporarily widens for a pull-off area, as in this video example. When incidents such as these have happened to me, I have a grip on the wheel. I usually take over, but as some have said, Autopilot does catch these incidents and recovers in time. It is a game of electronic chicken that I, and most of you, do not want to test. With the constant…

The versions refer to hardware configuration, not to maturity of the software.

Exactly why autopilot is such a flawed concept. I have spent over 50 years honing my driving skills. Why would anyone expect a machine to develop that much experience, flexibility and judgment in a few short years?
Just learn to drive safely and sensibly.
I love electric cars and all the safety tech, but not handing over control to the machines.

Work for a fire dept or ambulance company; then you’ll see why autonomous driving makes sense.

The “machine” can be trained on inputs comprising much more than a few, or even 50, years of driving practice. That’s the strength of machine learning: huge training sets often let it get better results in spite of not being nearly as “smart” as humans. There are a bunch of tasks where AI is already more reliable, including certain image-recognition tasks, for example, or medical diagnosis.

Driving is not there yet of course — but it’s only a matter of time.

Add to that the AI’s superpowers: distraction-free 360-degree view, freedom from fatigue, multiple sensor types (camera/lidar/radar/ultrasonic), millisecond reaction times. Those can compensate for a lot of the absence of true, universal human intelligence. The latter still dominates, however. For how long is anyone’s guess.

Well, I’ll be glad when that actually describes the state of the art of semi-autonomous driving systems deployed in Tesla’s cars and other mass-produced cars.

The current state of the art is rather far from that; no 360° scanning more than just a few feet away from the car, and no active scanning with lidar or phased-array radar. Just very low-res Doppler radar and cameras.

“I have spent over 50 years honing my driving skills. Why would anyone expect a machine to develop that much experience, flexibility and judgment in a few short years?”

That’s probably similar to what Garry Kasparov or Lee Sedol must have thought or said.

Comparing your 50 years of experience with years of development on automated driving is not really valid. These are totally different things, and the “years of development/experience” are totally incomparable.

I’m shocked that Lake Shore Drive allows NoA, but it certainly shouldn’t, and the driver should have recognized it isn’t an appropriate highway for NoA!

Navigate on Autopilot is worthless. I simply don’t understand why it was even released. We have a Model X and I was initially excited about it, but now I choose to ignore it. Proposed lane changes don’t happen until you’ve slowed down 10 mph and sat behind another vehicle for 2-3 minutes. It often suggests a change to the far right lane on the Interstate, rather than the far left (passing) lane, putting you behind a semi or slower traffic. And it doesn’t change you back to the lane you came from, which is especially odd on the rare occasions it does send you into the passing lane. It’s a fail on every single level.

Sounds like you didn’t put it in Mad Max mode 😉

Other than that… It’s a work in progress.

“I simply don’t understand why it was even released”

The first launches of the Falcon 1 were failures too. Could SpaceX have saved a lot of money by simply skipping those launches and going directly to the fourth, successful launch? In theory, yes. In practice, no. Plodding along in the lab until you have the perfect product is generally not a good strategy. This applies to any product.

Releasing it will allow Tesla to gather valuable training data to improve it.

I just want to acknowledge that he got off the phone and filed a bug report right away; this guy is a good beta tester. Tesla needs more people like him behind the wheel.

Someone’s going to spin this as “Elon’s Murder Machine Makes Attempt At Another Victim. Story at 11”

Some trolls in this discussion thread already have.

When I tried Autopilot during the 14-day free trial in July, I noticed a similar problem. When traveling in the rightmost lane of a multi-lane divided highway, I noticed that when passing an on-ramp my Model 3 would lurch to the right towards the on-ramp. I think what’s happening is that at the point of the on-ramp, the system detects the loss of the right-hand lane marking and then interprets the right-hand lane marking of the on-ramp as its new right-hand lane marking. The system then tries to center itself accordingly, causing the car to steer abruptly to the right. It was kind of scary, especially with cars coming down the on-ramp. I reported this to Tesla.
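That failure mode, re-centering on whichever right-hand line the camera currently sees, can be modeled in a few lines (a toy model with invented names, not Tesla’s implementation):

```python
# Toy model of the lurch described above: a lane-centering controller that
# steers toward the midpoint of whichever left/right lines it currently
# sees. When the right line breaks at an on-ramp and the ramp's own line is
# picked up instead, the midpoint jumps rightward. Names are invented.

def steering_target(left_line_m: float, right_line_m: float) -> float:
    """Lateral offsets (meters) of detected lines relative to the car."""
    return (left_line_m + right_line_m) / 2.0

print(steering_target(-1.8, +1.8))  # 0.0: centered in a 3.6 m lane
# Right line lost at the gore; the ramp's line is detected 5 m to the right:
print(steering_target(-1.8, +5.0))  # +1.6: target jumps right -> lurch

# One plausible mitigation: reject implausible frame-to-frame jumps in a
# line's measured position before re-centering on it.
def plausible(prev_m: float, new_m: float, max_jump_m: float = 0.5) -> bool:
    return abs(new_m - prev_m) <= max_jump_m
```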

Are we sure it would’ve crashed if he didn’t take control?

No, but would you have risked it if you were the driver?

Arguably, making the driver think the car is going to have an accident is a failure of Autopilot, even if it doesn’t actually crash.

A few years back, I naively said that all an autonomous car had to do was avoid colliding with anything substantial and obey traffic laws. But as things have developed, it’s more complex than that. Developers of autonomous driving software need to take into account the human patterns of driving, and mimic them where that won’t cause danger, to avoid undue alarm and stress caused by the car driving in what appears to passengers to be an erratic or dangerous fashion… even if it may be perfectly safe.

What we don’t know, of course, is what would have happened if the driver had not taken back control, here. For all we know the system may have self-corrected. And besides, it’s not like a human driver has never made this error!

Stupid technology does stupid things. It’s not unexpected from a system that cannot differentiate between stationary things and moving things.

AP differentiates between stationary and moving just fine. The problem is it doesn’t know which stationary things are in its path and which are off to the side.

Kudos to the driver for reporting this weird edge case. What a weird detour, probably trips up some human drivers as well tbh lol.

Hopefully they’ll use this to cover this case and send the update to the fleet quickly, so we can move on to the next weird one to find.

Just yesterday a truck driver in front of me got confused by a comparable road setting. So I guess there’s still some way to go for the Tesla engineers before the car becomes a better driver than a regular truck driver…

There will always be tricky situations… For this particular case, I guess it would help to have a more accurate GPS location, together with a high-resolution map…

It would be interesting to know if Tesla’s AI is able to learn from the driver… Are they using some kind of “tricky places” overlay for their maps? Or is the AI just learning one more general behavior? I don’t know the details of how Autopilot is actually programmed, but I guess they will need some kind of overlay on top of the general map.

It is nice to see progress in the field of autonomous driving. By now I guess we all know about the different approaches different companies are taking. At various mining sites, autonomous haulage trucks have already hauled more than 2 billion tons of material. Waymo is commercializing a service. And Tesla is testing the edge cases… Let’s hope that all of these efforts lead to a better future.
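On the “tricky places” overlay idea above: one plausible shape for it is a simple geofenced lookup consulted before any automated lane maneuver. This is entirely hypothetical, coordinates included, since we don’t know how Autopilot is actually programmed:

```python
# Hypothetical "tricky places" overlay, as speculated above: a geofenced
# list of spots where fleet reports flag lane-detection trouble, consulted
# before any automated lane maneuver. We don't know how Autopilot is
# actually programmed; coordinates and names here are placeholders.
import math

EARTH_RADIUS_M = 6_371_000.0
TRICKY_PLACES = [
    (41.9400, -87.6390, 150.0),  # (lat, lon, radius_m); placeholder values
]

def near_tricky_place(lat: float, lon: float) -> bool:
    for plat, plon, radius in TRICKY_PLACES:
        # Equirectangular approximation; accurate enough at these distances.
        dx = math.radians(lon - plon) * math.cos(math.radians(plat)) * EARTH_RADIUS_M
        dy = math.radians(lat - plat) * EARTH_RADIUS_M
        if math.hypot(dx, dy) <= radius:
            return True
    return False

# The planner would suppress automatic lane maneuvers when this is True.
print(near_tricky_place(41.9401, -87.6391))  # True: inside the flagged zone
print(near_tricky_place(41.9000, -87.6300))  # False: well outside
```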

Did that truck driver also ram a parked fire truck with lights flashing?