Watch Autopilot Save This Tesla Model 3 From Highway Crash: Video

FEB 8 2019 BY MARK KANE

Those who experience something like this on Autopilot tend to come away grateful.

Tesla Autopilot is an advanced driver-assist system, and among its advantages one stands out as especially important – quick reaction time – which in many cases can save the vehicle from an accident.

Tesla Model 3 owner Amit Patel shared a short video of just such a situation, captured while the car’s Navigate on Autopilot was engaged on the highway.

“I was driving on navigate on autopilot when the lady next to me suddenly decided to come into my lane. Luckily autopilot swerved and avoided a potentially scary collision.”

We’ve seen similar Autopilot reactions before, and they are important, especially since we as drivers are often surprised when someone tries to enter our lane.


37 Comments on "Watch Autopilot Save This Tesla Model 3 From Highway Crash: Video"


I shake my head at all the folks that think Autopilot and future autonomous driving is the devil. Human error causes thousands of daily accidents nationwide and yet we focus on the exceptions, not the norm.

“I shake my head at all the folks that think Autopilot and future autonomous driving is the devil.”

Agreed.

But I also shake my head at people (or machines) that linger in blind spots.

He was not lingering; he was passing on the right side, as you are supposed to do.

Not sure if he was lingering or not, but it still wouldn’t hurt if someone tweeted Elon and suggested that autopilot be programmed not to linger.

This lingering thing happens all the time in Oklahoma where I drive a lot. People hang out in the passing lane even though that behavior is explicitly illegal in OK. There are even signs posted. Other drivers have no choice but to pass on the right. Which is not only dangerous, it is also illegal.

And when OK people pass, they do it at the speed limit. So, their differential to the car being passed is feet per minute not miles per hour. It therefore takes a mile or more for the pass to be completed. Very irritating.

Doesn’t Tesla have a ‘Mad Max’ mode where passing and other maneuvers are done more quickly?

I am from Oklahoma, though I live in Florida now, and I can vouch for the accuracy of the above. Though I would rather deal with that than nutty Florida drivers.

Love it… Good song!

The Tesla driver was in the left lane passing traffic, not lingering – exactly what you are supposed to do. Unless he was going less than the speed limit, the law is entirely on the Tesla driver’s side.

Don’t know why the driver on the right came over, but had they hit the Tesla, the dashcam evidence would have convicted them. Was it a road-rage incident? The Tesla was not passing fast enough, so the guy gets in the right lane and takes a run at the Tesla? It looks like that is what is going on.

I was speaking about poor driving habits in general.

There should be no argument here. Lingering in people’s blind spots, while not illegal, is asking for trouble.

The video cuts in as the Tesla Autopilot is casually passing on the right – nothing wrong with that. Or is there? I would argue this: if this were a human driving, there would be no issue here and no one would bat an eye. And rightfully so. We’re stupidly bad drivers, and our laws have no choice but to account for this – because the alternative is that we make a law for every grey area where stupidity is possible or where safety depends on whether we are paying attention or not, and well, there’d be so many laws to account for every conceivable instance of possible stupidity that no one could legally drive!

So, while ‘casually passing’ on the left might be legal and technically safe, it does not mean that the driver is aware of what he’s doing and might, in fact, just be coincidentally passing due to a shift in the wind. In the video, they are driving downhill. If it’d been a human driving, the video could imply that he’d been ‘lingering’ for miles on the level pavement…

….Continued

Practicality is not an issue here. We can afford to question the Autopilot and not worry about the practicality of taking decisive action to do everything possible to ensure that stupidity is not what we are seeing in the video. When it’s a human driver, though – not the case. Practicality intervenes because it’d mean too many laws and no one could drive. With Autopilot, though?

Just a few lines of code saying: Never, ever, ever, ever linger in the blind spot of another vehicle unless absolutely necessary!
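
In planner terms, that rule might look something like the minimal sketch below. To be clear, everything in it – the types, field names, and thresholds – is invented for illustration; it is not Tesla’s actual code, just one way such a heuristic could be expressed:

```python
from dataclasses import dataclass

@dataclass
class CarState:
    lane: int          # lane index, increasing left to right
    position_m: float  # longitudinal position along the road, metres

# Hypothetical threshold: how long we tolerate sitting beside a car.
BLIND_SPOT_TIMEOUT_S = 3.0

def in_blind_spot(ego: CarState, other: CarState) -> bool:
    """Rough check: ego is in an adjacent lane, roughly alongside
    the other car's rear quarter where mirrors don't cover."""
    adjacent = abs(other.lane - ego.lane) == 1
    overlap = -6.0 < (ego.position_m - other.position_m) < -1.0
    return adjacent and overlap

def should_clear_blind_spot(ego: CarState, neighbors: list[CarState],
                            time_alongside_s: float) -> bool:
    """True once ego has lingered in someone's blind spot past the
    timeout; a planner would then speed up or drop back to clear it."""
    lingering = any(in_blind_spot(ego, car) for car in neighbors)
    return lingering and time_alongside_s > BLIND_SPOT_TIMEOUT_S

# Example: ego has sat 4 s slightly behind a car one lane over.
ego = CarState(lane=1, position_m=100.0)
other = CarState(lane=2, position_m=103.0)
print(should_clear_blind_spot(ego, [other], time_alongside_s=4.0))  # True
```

A real planner would of course weigh this against surrounding traffic, speed limits, and the safety of the adjustment itself, but the rule is cheap to express.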

This video merely reminds us that, as a species, we have been conveniently ignoring some prudent and necessary precautions (enforcement? programming?) that have yet to be implemented.

(Sorry for the edits and ‘continued’ comment. Somehow (human error?) the comment got posted before I was finished, and I had to race the clock to edit and lost. Driving with my blinker on, I guess, and then an inadvertent lane change made it look like I was paying attention right before the crash.)

So do I. Clearly machines will be much better at this than any human – eventually. But people who think they know enough to proclaim at this point that AP makes us safer are no better. They are as irrational, and their conclusion as poorly founded, as the other lot.

The only data I’ve seen on this comes from Tesla and is average miles between incidents when AP is engaged versus when it isn’t. Assuming Tesla is giving straight numbers (they might not), this still is not data that even supports the conclusion “AP makes us safer”, much less proves it.

At first, Tesla used the average for the entire car fleet for “AP not engaged”, but after many pointed out why we should expect that average to be much lower for Teslas than the fleet regardless of AP (no old cars, few young drivers afford $100k+ vehicles), they did come back with some numbers for Teslas without AP engaged. The difference dropped a lot, but the two numbers still potentially contain tons of other biases. For example, there’s generally a MUCH lower incidence rate on highways than on other roads. If AP does most of the highway driving and the…
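
To make that last point concrete, here is a toy calculation – every number below is invented, not Tesla’s – showing that even if AP were no safer per mile on any given road type, it would still post better fleet-wide “miles per incident” simply because it is engaged mostly on highways:

```python
# Toy illustration of road-type confounding in "miles per incident".
# All numbers are invented; per-mile risk is identical with and
# without AP on each road type.

HIGHWAY_INCIDENTS_PER_MILE = 1 / 5_000_000  # highways are safer...
CITY_INCIDENTS_PER_MILE = 1 / 1_000_000     # ...than city streets

def miles_per_incident(highway_share: float) -> float:
    """Average miles between incidents for a given driving mix."""
    rate = (highway_share * HIGHWAY_INCIDENTS_PER_MILE
            + (1 - highway_share) * CITY_INCIDENTS_PER_MILE)
    return 1 / rate

# Suppose AP is engaged mostly on highways; manual driving is mixed.
print(f"AP (90% highway):     {miles_per_incident(0.9):,.0f} miles")
print(f"Manual (40% highway): {miles_per_incident(0.4):,.0f} miles")
```

That works out to roughly 3.6 million miles per incident for the AP mix versus 1.5 million for the manual mix – about a 2.4x gap with zero actual safety difference on any individual road type.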

That’s a fair and well thought out response. A lot to consider, to be sure.

Fair points. I don’t think people fail to consider these matters; I just think many are ready for the future and realize the potential in autonomous driving.

I think Tesla is gathering all the required data in order to compare all such results. The amount of data they are gathering is immense – on the latest earnings call, Elon mentioned something like every other company gathering such data was only getting 5% as much as Tesla was. Machine learning applications combined with all this data are going to bring it to market sooner than people might think. My only concerns are the effects of bad weather, social acceptance, and regulatory approvals; those could push it out further.

Just another example of how autonomous driving technology is more about safety than just about humans not driving: calculated, unemotional responses to conditions vs. emotional knee-jerk reactions.

Exactly. The car with autonomy isn’t ever tired, distracted, or impaired. It’s always silently watching in every direction and paying attention.

Autopilot should also lay on the horn and flip a giant foam finger 🙂

In all seriousness though, it should definitely honk in this situation. I wonder if that’ll come someday.

I like the way you think

Two collisions could have happened. If that car had cut into their lane and hit the Tesla, the Tesla would have gone left and hit the barrier, or a human driver could have overcompensated while avoiding that car and hit the barrier instead.

The avoided crash probably would have involved all three cars and maybe some behind them. Once cars bounce off a wall back into traffic, there is no telling how the chain reaction would evolve.

In Dallas, these kinds of multi-vehicle crashes happen all the time, in broad daylight. People just follow too close.

It doesn’t look like Autopilot!! It doesn’t react that violently!

Correct – Collision Avoidance, not Autopilot.

How is collision avoidance not a function of Autopilot? That’s about as silly as saying “It’s not a dog, it’s a collie.”

The person who posted this to YouTube said:

“I was driving on navigate on autopilot…”

Somehow, sir, I think that Tesla car owner knows more about what happened than you do.

Elon Musk said all these cars will have Autopilot safety features as standard. Does that mean it doesn’t need to be activated to avoid a collision like that? I’m doubtful.

From what he said at the launch in March 2016, I think there’s no room for anything less than all of the active safety features being enabled in all models, including the base model whenever that arrives.

But then again it wouldn’t exactly be shocking if Elon says one thing, but does something else…

It’s simply amazing how obviously you display your jealousy of Elon.

The driver of a Tesla car has to activate Autosteer for the car to steer itself. So far as I know, no Tesla car will actively steer to avoid an accident if Autosteer isn’t activated. Maybe in the future that will happen, but not at present.

It’s not really possible to tell from the footage whether it really is AP that swerves, but assuming it is, it’s only the second example I’ve seen of it intervening in a useful way – I’ve still seen far more interventions, enough to lose count, in which it contributed negatively, or was instrumental in bringing the dangerous situation about to begin with.

Will cars drive better than people? Definitely, and nearly infinitely! They can react a thousand times faster than even the most alert human. Even more important, they don’t get tired and aren’t distracted in the same way humans are (although computer vision can be distracted in a sense). It just doesn’t follow that we’re there yet.

I mention what I’ve seen because that, and generally a solid understanding of computers and programming, is all I have to go by. But even if all I had seen was cases where AP seemed to save the day, I shouldn’t be taken seriously if I started to state as a matter of fact that AP makes driving safer. Drawing conclusions based on anecdotal evidence is not serious, and I wish this website would cover stories like this more intelligently – and…

“Drawing conclusions based on anecdotal evidence is not serious…”

But much, much worse is having a perspective on reality that’s so warped, apparently by an almost insane level of jealousy of Elon Musk**, that you assert Autopilot “…was instrumental in bringing the dangerous situation about to begin with”.

**…judging by your “not even wrong” accusation a few weeks back, in response to a SpaceX concept video about commercial space flight, that Elon Musk was lying about the entire concept. The Force is weak in this one… but the Dunning-Kruger effect is strong!

I can’t imagine a decade is required. Tesla’s Autopilot features just need OTA updates to activate more capabilities. The capability is there; just more software advances are needed. I doubt any hardware changes are required. I have worked on enough software technology advancements over the years to know: the same computers and servers, running new software, sped up processes by 100+ times in some cases. Not to mention all the hardware improvements we see in a decade. Compare your 10-year-old smartphone with your latest. No comparison, really, in what they can each do.

There have been several (many?) videos posted to IEVs showing a Tesla car on Autosteer swerving to avoid an accident, but most of them showed only what an alert human driver would have done under the circumstances.

This is one of the very few which almost certainly is a case of Autopilot + Autosteer saving the car from an accident. I doubt a human driver would have reacted that quickly, or would have steered a correct course away from the intruding car without sideswiping the concrete barrier.

An attentive driver would have easily avoided this as well. I have done it a few times. In fact, one time a big semi almost ran me into the median. He waved later as a sign of apology. It gave me a big scare…

I guess Autopilot did better than a driver who is inattentive or distracted by their personal electronics.

Why did the Corolla brake while trying to cut off the Tesla? The driver should have sped up and gotten out of the way. Then again, what do you expect from a Toyota driver? Some of the worst drivers on the road today are Toyota drivers, because they treat their cars like appliances and have no interest in driving – hence the poor driving skills/experience.

Autopilot does not automatically get you out of someone’s blind spot. If Autopilot learned from the driver, that would be ideal.