Tesla Autopilot Software Crash While Driving: What You Need To Know

FEB 19 2019 BY STEVEN LOVEDAY

What would you expect to happen if Tesla Autopilot software crashed during your drive?

For the growing number of Tesla owners out there, this is important information to be aware of. While driving with the assistance of Tesla Autopilot or Navigate on Autopilot, there’s always a chance the software could fail, causing the system to crash. This is part of the reason Tesla tells you to remain engaged and in control of the vehicle at all times. But the real question here is, how does the situation actually play out?

Tesla owner Scott Kubo experienced the above and has shared it to make Tesla drivers more aware. While he was driving (Navigate on Autopilot activated), the technology suddenly stopped working. He says it disengaged and the hazard lights immediately came on, followed by an error message on the Model 3 touch screen that stated, “Cruise not available.”

Any time your car automatically turns on its hazard lights and displays an error message on the screen, it’s going to get your attention. However, the Tesla just reverted to normal driving mode for a few minutes, during which Autopilot was unavailable. Then, it came back on and functioned properly.

It’s critical to understand such technologies and be aware that there may be sporadic issues. Always remain engaged and pay attention to the car’s cues so that you’re not caught off guard.

Have you experienced something similar to this in your Tesla? If so, please apprise us of the details in the comment section below.

Video Description via Scott Kubo on YouTube:

What Happens When Autopilot Software Crashes While You’re Driving

Tesla autopilot suddenly turned off while driving. Possibly due to a crash and reboot of the autopilot software? This was with software v9 2018.50.6 Tesla Model 3, AP 2.5

The system suddenly disengaged and turn the hazard lights on, and came to a stop. An error message said “cruise not available.” For 2-3 minutes I was unable to engage autopilot and no surrounding vehicles were displayed by the media control unit. However, faint lane lines were still shown. After 2-3 minutes autopilot came back on as normal.

Categories: Tesla, Videos


19 Comments on "Tesla Autopilot Software Crash While Driving: What You Need To Know"


So, in other words, just drive the car.

(⌐■_■) Trollnonymous

“The system suddenly disengaged and turn the hazard lights on, and came to a stop.”

So did the car stop in the middle of the freeway, or did the driver realize something screwy happened and pull off to the side before stopping?

A little unclear there.

Re: pulled off to the side and stopped?
It’s not unclear if you just watch the video, though.

This also tends to sound like the driver got caught in autopilot jail, by going too fast or ignoring the nag.
In that case, I believe the autopilot did exactly what it should.

Yes! I have had this happen to me before. If I get a few warnings and let two notifications go too long, then I get almost exactly the same behavior. It seems to be new as of a couple of months ago. The thing here is that I don’t lose the surround view, but the car in this video does.

“Any time your car automatically turns on its hazard lights and displays an error message on the screen, it’s going to get your attention.”

It’s a good thing that Tesla also added a loud audible warning to roust the driver in case he’s sleeping . . . err . . . distracted. 😉

/s

So he just happened to be recording when this issue occurred out of the blue?

Software crash is vastly preferable to hardware crash 🙂

Interesting that he stops closer to the lead vehicle when driving manually, except when he backed off so the car on his right could merge.

Maybe in some cases. I wonder how it will compare to a (future) full self-driving car.

It has happened to me many times, especially when there were suddenly no painted lines on the road for a longer period of time, or during other weird occurrences. In a few cases, Autopilot came back after driving several miles on well-maintained roads. Sometimes it’s good to reboot the system. I’ve even rebooted the system while in Autopilot mode, and surprisingly, cruise control capabilities were still on while both display screens of my Tesla Model S were in total blackout. Just don’t panic – the car is still fully controllable in a system-reboot situation (similarly to your heart still working during a pass-out 🙂).

That’s not an AP crash; it sounds like the MCU rebooted.

Hm. I had this happen for the first time yesterday. I’m also on 2018.50.6, which I’ve been on for ~2 months now. I wonder if Tesla changed something on the server side that suddenly caused issues with 2018.50.6?

The only thing close to this that has happened to me so far is while driving on Autopilot on the I-5 near the I-605, where there’s a major freeway construction project going on. Navigate on Autopilot disengaged and a message came up saying that Autosteer was unavailable. This was most likely due to the car thinking I was on city streets instead of the freeway (outdated maps). As soon as I got past the section of widened freeway and the maps lined up with reality again, everything went back to normal.

If Ed and Alec are correct, in their comments above, then this isn’t at all a software crash. This is Autopilot acting exactly as programmed, in switching off and alerting the driver when the driver fails repeated tests to see if he’s maintaining control of the car.

However, there is a serious issue here: What will happen with cars equipped for Level 4+ autonomy if and when there is a software crash? What kind of fail-safe will they be equipped with? Ideally, they should safely pull the car to the side of the road and park. But how can they do that safely if the main self-driving system has failed?

If cars are constantly sharing data wirelessly with nearby cars, perhaps a fully self-driving car would be able to rely on what nearby cars “see” in order to avoid any collision while navigating over to the parking/emergency-stop lane.

Redundancy is the answer. At the end of the day, it will be up to regulatory bodies to define what is good enough. This is why the claim that full self-driving will be possible with the current hardware in any car is just silly. No one knows at the moment what the legal hurdles will be.

I’ve had the entire Model 3 center console reboot while driving. It had been exhibiting unusual behavior, such as certain features being unresponsive. After a little while driving, the entire screen locked up, and some minutes after that it rebooted entirely. The car was drivable the entire time, though I was nervous, of course.