Tesla Model S Crashes Into Back Of Van As Adaptive Cruise Control Fails To Stop Car – Video


Even if the car reacts correctly thousands of times on its own, there’s a chance that some situation will trip up the system, resulting in a crash as seen in the video above.

The Model S had active cruise control engaged, yet it didn’t stop for the van on the shoulder; the Tesla’s sensors were instead locked onto the car that veered right to avoid colliding with the van.

Editor’s Note: The YouTube video by the Model S owner has since been taken offline (perhaps due to legal/fault issues for the parties involved). We will re-host it if it becomes available; full details are still below.

Update 2:  A GIF of the accident has now been made available (GIPHY via Electrek)


The Model S owner who was involved in the wreck uploaded a video of the accident (above) to YouTube, along with a rather lengthy description:

None of the safety-systems worked correctly:

1. The TACC (active cruise control) did not brake as it normally does
2. The collision avoidance system (AEB) did not perform an emergency brake
3. The forward collision warning came on way too late; it was set to the normal warning distance
4. The TACC was actually speeding up just before I hit the brakes

Yes, I could have reacted sooner, but when the car slows down correctly 1,000 times, you trust it to do it the next time too. My bad…

In normal operation, the AP slows down as soon as another car puts one wheel over the line into your lane.

The whole front of the car needs to be replaced, including a parking sensor and a steel beam.

I was in contact with Tesla Europe, but they could not provide me with any useful information. They just stated that “all systems worked as expected”. Well, certainly not how I expected them to work…

No automated system is failsafe, though, and in the end it’s still the driver’s responsibility to be alert and to avoid being the cause of an accident such as this.

Commenter Viktor tells us that Tesla specifically warns of this situation in the manual:

“Warning: Traffic-Aware Cruise Control can not detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object, bicycle, or pedestrian is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death. In addition, Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately.”

via Reddit

Categories: Crashed EVs, Tesla, Videos



141 Comments on "Tesla Model S Crashes Into Back Of Van As Adaptive Cruise Control Fails To Stop Car – Video"


Driver is an idiot.

For expecting Tesla’s systems to work?

The safety problems they have had are concerning, especially with the AEB (or rather the lack of it).

I expect a lot of improvement before I get my Model 3.

No, the driver is an idiot for trusting 100% in autopilot. I don’t know how many times Tesla has to say that the current autopilot is NOT a substitute for keeping your hands on the wheel and paying attention.

Sure, driverless cars are getting better, but they’re not there yet and any driver who diverts their attention away from the road because they trust the system… IS SIMPLY AN IDIOT. There’s no two ways about it.

“I expect a lot of improvement before I get my Model 3.”

Yeah, ‘expect’ is a bit of a self-entitled word to use. You do realise Tesla are ahead of all other automakers at the moment in the autonomy department, right?

Accept the fact that autopilot will be what it will be when you get your Model 3. If it’s not so much better than today’s version, then maybe it’s because it’s not as easy to develop as you think.

“No, the driver is an idiot for trusting 100% in autopilot. I don’t know how many times Tesla has to say that the current autopilot is NOT a substitute for keeping your hands on the wheel and paying attention.”

I find this reasoning rather simple or harsh.

The last time Volvo warned about the “false sense of security” these systems create, everyone bashed Volvo. In the last accident, the owner didn’t trust the auto-brake system and braked on her own, and people called her stupid for NOT trusting the system, since intervening disables it. Now, with this accident, we also complain that owners are idiots for “trusting the system”?

So, which way should the owner behave?

That is exactly what Volvo stated: unless you have full automation, the “autopilot” gives the owner a false sense of security.

That is exactly the problem here. If you can’t trust the system 100%, but stepping in manually disables the system, then what is the system good for?

The person who braked on their own, and then stopped braking was an idiot too.

If you start braking because something is in front of you, keep braking. Don’t take your foot off and think the car will finish what you started.

“If you start braking because something is in front of you, keep braking. Don’t take your foot off and think the car will finish what you started.”

Has that case been confirmed?

It sounded like she braked too late because she perceived that the car wasn’t braking on its own.

Did Tesla release the information on the time between the brake pedal pressed, speed of the car at the time and the time collision happened?

From what I know, Tesla only stated that she pressed on the brake which disabled the auto brake feature.

The problem here is that the auto-braking safety feature does not engage until it has to, but by that point it’s too late for the human in the equation to react fast enough to avoid the incident. That being the case, why even have the CAS built into the vehicle? If it were not equipped with it, I am pretty sure the operator of the Model S would have braked long before the avoidance system kicked in. Nice idea to have, but it needs a lot of tweaking.

No, driver is not an idiot.

This is what you should expect to happen:
– when the system works: the number and severity of collisions will be reduced
– when the system fails to work: drivers, having become accustomed to the system working, will become less attentive and the number of collisions will increase.
– overall: the number and severity of collisions will be reduced.


Known as the automation paradox.
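The trade-off in the three bullets above can be put in rough arithmetic: safety improves overall as long as the crashes the system prevents outnumber the extra crashes complacency causes. A minimal back-of-envelope sketch, with every number invented purely for illustration (none comes from Tesla or any real study):

```python
def net_collision_rate(base_rate, system_catch_rate, system_usage,
                       inattention_penalty):
    """Expected collisions per million miles with a driver-assist system.

    base_rate           -- collisions per million miles with no system
    system_catch_rate   -- fraction of would-be collisions the system prevents
    system_usage        -- fraction of miles driven with the system engaged
    inattention_penalty -- multiplier on base risk from driver complacency
    """
    assisted = base_rate * inattention_penalty * (1 - system_catch_rate)
    unassisted = base_rate
    return system_usage * assisted + (1 - system_usage) * unassisted

# Hypothetical: the system prevents 90% of collisions, but complacent
# drivers would otherwise crash 1.5x as often. The net effect is still
# a large reduction, which is the commenter's third bullet.
with_system = net_collision_rate(1.0, 0.90, 1.0, 1.5)
without = net_collision_rate(1.0, 0.0, 0.0, 1.0)
print(with_system < without)  # True: fewer collisions overall
```

The point of the sketch is only that both bullets can be true at once: individual failures like the one in the video increase, while the aggregate rate still falls.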


Disagree. The driver is severely at fault here.

The person at the controls of any vehicle is called a “driver” because that is what they should be doing: driving.

Tesla’s systems are designed to *help* keep the car safe when the driver misses something, but the driver should still be *fully engaged* in the act of driving.

Certainly, watching for stopped vehicles in front of you, and avoiding collisions with them, is one of the primary priorities of a successful driver.

I’m not saying that Tesla’s system couldn’t have done better here, but that doesn’t excuse the very poor driving clearly exhibited in this case.

yep. Driver is a human. All humans are idiots. Tesla should expect this.

Autopilot sucks.

Tesla is depending on too much legal weaseling. Referring to the manual to blame the driver when the system is clearly defective reminds me of the opening of “Life, the Universe and Everything,” where destroying houses or planets is justified because the plans were on display at the local planning office or on Alpha Centauri.

What sucks is FUDsters using InsideEVs as a megaphone for short-seller serial Tesla bashing, motivated by nothing but greed.

What sucks is you accusing everyone who has a critical opinion of Tesla or a Tesla feature of being a Tesla FUDster, Tesla short seller, and/or Tesla basher.

Although I must compliment you on accusing this commentor of being a Tesla FUDster, short seller, and basher in only one short sentence!!! Usually, it takes you a huge block of text to call someone all three of these pejoratives amongst a very long rant. Thank you for the brevity! 😀

Sometimes I wish Tesla weren’t a public company, as it gives people an easy way out rather than confronting their own cognitive dissonance: obviously that person is a short seller. But of course, when they encounter praise for Tesla, they never dismiss it as coming from a long who is trying to pump the stock.

What the hell are you yapping about now? Maybe what you just wrote makes sense to you, but it sounds like a load of gibberish to me. No doubt it’s anti-Tesla, though, since we all know you’re just that fool Three Electrics with a new name.

Actually, I like the Model X rather a lot. So much so that I bought one. Don’t let my objectivity get in the way of your ad hominem bullying and name-calling, however. I suppose this is where I’m supposed to point out, baselessly, that you’ve been turned into a pro-Tesla troll by your massive holdings in TSLA stock? I’m sorry; I guess I’m very bad at your game.

Um… perhaps if you read it again its meaning might become clear. It certainly is to me!

Are you always this angry?

“What sucks is FUDsters using InsideEVs as a megaphone for short-seller serial Tesla bashing, motivated by nothing but greed.”

I find your labeling of anyone who criticizes Tesla as a FUDster unfair and unjustified.

Tesla fans called the lady in LA an idiot for taking over control, yet she crashed because she felt the system wasn’t acting, and stepping in disabled the system. Now, this owner trusted the system, but it failed. So, if the system doesn’t work 100% of the time, then it can’t be a reliable system, since the owner would have to make a split-second decision on whether to trust it or not.

If that is the case, then it is reliable enough to be trusted.

It is NOT reliable enough to be trusted.

Disagree. There was a similar situation back when autopilot just came out (and the person was using TACC, no Autosteer yet). I looked at the manuals of BMW and Mercedes, and they say exactly the same thing explicitly (the system does not react to stationary objects).

ACC radar is simply not designed to react to stationary objects, especially when the car it is following leaves the lane.
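To make the comment above concrete, here is a hedged sketch (not Tesla’s or any vendor’s actual code) of why a Doppler-radar ACC tracker commonly drops stationary returns: a target’s ground speed is the host vehicle’s speed plus the radar’s measured relative speed, and anything with ground speed near zero looks identical to roadside clutter such as signs, guardrails, and bridge supports. The threshold value and function names are illustrative assumptions.

```python
# Hypothetical clutter-rejection threshold (m/s); real systems tune this.
STATIONARY_THRESHOLD_MPS = 2.0

def select_acc_targets(own_speed_mps, radar_returns):
    """Keep only radar returns that appear to be moving vehicles.

    radar_returns: list of (range_m, relative_speed_mps) tuples, where
    relative speed is negative when the target is closing on us.
    """
    targets = []
    for range_m, rel_speed in radar_returns:
        ground_speed = own_speed_mps + rel_speed
        if abs(ground_speed) > STATIONARY_THRESHOLD_MPS:
            targets.append((range_m, rel_speed))  # moving: track it
        # else: indistinguishable from roadside clutter, so it is ignored,
        # which is why a parked van gets no braking response
    return targets

# At 20 m/s, a lead car doing 18 m/s (relative -2) is tracked; a parked
# van (relative -20, ground speed 0) is filtered out entirely.
returns = [(40.0, -2.0), (60.0, -20.0)]
print(select_acc_targets(20.0, returns))  # [(40.0, -2.0)]
```

This is why the manuals quoted elsewhere on this page all carry the same stationary-object warning: the filtering is a deliberate design trade-off, not a bug unique to one manufacturer.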

I disagree with you. It’s my opinion that when using Autopilot (TACC is on and Autosteer is on), the TACC radar should detect and automatically brake for stationary objects that are between the left and right painted lines.

I think the main reason most TACC systems behave this way is that the makers don’t want false positives where the car slams on the brakes just because someone is over the lines (when the driver can easily steer to avoid). Rather, it errs on the side of continuing (as a logical extension of regular cruise control).

It is unclear in this case whether autosteering (or even the ACC) was active. I’ve seen Tesla owners comment and say that autopilot can steer around obstacles it recognizes (and the owner would know if it detected the stationary car, since it would show up on the display).

What car do you drive? We can talk about your marvelous wheels compared to Tesla , I promise we will troll Tesla together, please let us know.

Commenter is a Tesla fanboi.

Who is using InsideEVs to promote his long position on Tesla stock…

This is clearly the thing that Tesla writes about in the Autopilot instructions.
“Warning: Traffic-Aware Cruise Control can not detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object, bicycle, or pedestrian is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death. In addition, Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately.”

Thanks Viktor. I’ll add that in!

So it’s autopilot, but not in all cases. Use it and relax, except be alert for the specific situations where the system will not work and will instead run over a person.
Are you not better off just driving?
Having to memorize a list of situations where the system does not work is far more stressful, and more likely to cause issues, than being fully focused on the whole task.


Agreed. My e-Golf certainly initiates emergency braking when it sees stationary objects, and it’s got no AP, just ACC, lane assist, and collision avoidance that, very importantly, includes stationary objects. Having a Model 3 reservation and being a big-time Tesla fan, I think they need to change the system so it also reacts to stationary objects like trees, parked vehicles, buildings, etc. It could hurt their image in the long run if they don’t improve.

If you don’t like the feature, don’t pay extra to get the option in your M3.

Done. See how easy that was?


Nice find!

I’m glad Tesla is aware of this issue, but this feels like an attempt at victim blaming. Tesla cannot hype autopilot and then bury these kinds of caveats in the manual to cover their collective asses.

Of course they can and pretty much EVERY company on this planet does it in some form or another.

They’re hardly going to sell their cars if they shout “check out our autopilot system, it doesn’t work all the time and if you overly-rely on it you might have a crash, but it’s cool nonetheless!”

Common sense, really. Companies advertise the pros but the cons are buried away in the manual, small print, etc. If you think Tesla can’t get away with this, well, they can because it’s not illegal unless it’s deliberately misleading.

Oh, and yes, victim blaming…

You do understand it kind of IS the victim’s fault, right? Tesla have only said like a billion times that autopilot is not a replacement for being observant and prepared to take control at any time. They’ve only said a billion times you can’t just put autopilot on and stop paying attention to the road.


You called?

Yes, autopilot functions almost exactly the same as all the other systems: ie, it does NOT mean that you can check out of paying attention and like any driver assistance there will be edge cases and issues as it progresses.

Anyway, facts and logic will never stop the haters who keep posting their negativity about Tesla here ad nauseam.

^^ nicely played!

In case people think this is Tesla only, they should spend some time looking at the manual for ACC in other cars.

For example:

“The system does not decelerate when a stationary obstacle is located in the same lane, e.g., a vehicle at a red traffic light or at the end of traffic congestion.”

“DISTRONIC PLUS does not react to: • people or animals • stationary obstacles on the road, e.g. stopped or parked vehicles • oncoming and crossing traffic As a result, DISTRONIC PLUS may neither give warnings nor intervene in such situations.”

“Adaptive Cruise Control does not react to people or animals, or small vehicles such as bicycles and motorcycles. It also does not react to slow moving, parked or approaching vehicles, or stationary objects.”

The owners manual is a glorious thing, indeed, for it allows a salesperson to tell you something which isn’t true, and then later, when the s–t hits the fan, to have it be your fault! That’s some trick.

I believe they all say the same thing because they all use technologies from Mobileye. Tesla does, too.

Yes, a lot of them do … except Tesla has some MAJOR differences. I bet you know what they are… and you’re just not sharing.

At 4 seconds you can hear the warning sound for the driver. It is a very difficult situation to program a machine for. You can see that the Tesla was following a car. That car moved to avoid a stationary van. So the Tesla saw TWO cars in the same lane: one stationary and the other moving. The Tesla reacted by asking the driver to take control. You cannot blame the programmer for that. It is a rare case, and surely the computer will learn from it, but as it is, I would say well done Tesla, because the program asked the driver to take control and the driver was the one who was slow to react.

I’d like to see stereoscopic cameras to help with depth perception. I think they would have seen this stationary object much sooner than when you hear the Tesla start chiming.

The further apart the cameras the better; as long as the software can make sense of the data.
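The baseline intuition in the comment above follows from the standard stereo relation depth = focal_length × baseline / disparity: for a fixed depth, a wider baseline produces a larger disparity, which is easier to measure reliably. The numbers below are purely illustrative, not from any actual Tesla or other automotive hardware.

```python
# Basic pinhole stereo geometry: depth (m) from pixel disparity and back.
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    return focal_px * baseline_m / disparity_px

def disparity_for_depth(focal_px, baseline_m, depth_m):
    return focal_px * baseline_m / depth_m

f = 1000.0  # assumed focal length in pixels

# A van 50 m ahead: disparity with a narrow 12 cm baseline versus a
# hypothetical 1 m baseline. The wider rig gets ~8x more signal to work with.
narrow = disparity_for_depth(f, 0.12, 50.0)  # 2.4 px
wide = disparity_for_depth(f, 1.00, 50.0)    # 20.0 px
print(narrow, wide)
```

At 2.4 px, a measurement error of a single pixel swings the depth estimate dramatically, which is why wider baselines (or lidar, as discussed below) help at highway distances.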

Computer software doesn’t perform well at translating camera images into virtual 3D images which the software can map and use for collision avoidance. That’s why Google’s self-driving cars use lidar scanners.

Video cameras should be restricted to giving the driver better views. I really like the backup camera in our new car. That’s what it’s for; for showing images the human brain is good at interpreting. Computer programs… not so much.

That’s how Subaru eyesight works right? I wonder how well their tech works.

Subaru and Tesla’s technologies are both sourced from the same vendor: Mobileye.

Maybe current computer software has difficulty, but with deep learning algorithms I think it easily could decipher the camera data.

If Google knows what a cat looks like, then a car can tell how far something is.

@SamEV – Mercedes had a stereoscopic system, but I don’t know how well it worked compared to other systems (or in addition to them). I think the image processing software is progressing much faster, and I wouldn’t rely on old information. I would also trust Silicon Valley programming for this type of stuff over German programming.

Interesting observation. Food for thought. 🙂

The beta test continues.

Maybe there’s a good reason why other car companies deliberately adopt ADAS slowly and test longer.

You should see how the adaptive cruise works in the i3. Tesla is not the only one with issues…

Collision avoidance should clearly override “normal” ACC in this case.

There’s a reason that there are two systems in place.

Or let’s (once again) “blame it on the driver” because Tesla can do no wrong?

The system needs to improve, and I’m sure it will, but so long as a human sits behind the wheel and is directed to stay in control, the computer and Tesla should not be made to shoulder the entire blame.
There is a video on this site showing a driver who appears to be asleep behind the wheel of a Model S in autopilot mode. Everyone, I’m sure, has seen it. If the car were to crash because it was in autopilot, would Tesla be at fault? Clearly the driver was inattentive.

Tesla could install sensors to prevent that situation. Volvo, MercedesBenz, and GM are testing motion sensors that detect when a driver is falling asleep. The Volvo system “works by having small sensors located on the dashboard determine the direction a driver is looking, how open their eyes are, and their head position and angle. The sensors monitor these actions through the use of small LED lights that illuminate the driver with infrared light, which aren’t visible at all to the driver.”

I wonder whether governments will soon require these motion sensors for autonomous driving cars.



tftf continued his serial Tesla bashing:

“Or let’s (once again) ‘blame it on the driver’ because Tesla can do no wrong?”

Let’s (once again) highlight a very rare case where a Tesla car has failed at doing something difficult, and use that rare event to promote Tesla bashing by pretending that’s commonplace, instead of a rare outlier.

And it’s more than a bit hypocritical of someone who never writes a genuine or honest post to complain about people who think Tesla can do no wrong.

“…a rare outlier.”

So a car parked on the side of a road because of an accident etc. or halted is a rare outlier?

Yes indeed. In my daily driving I interact with 100’s of vehicles each day. This situation has happened perhaps a handful of times over the 30+ years that I have my driver’s licence. If that isn’t an outlier, then what is?

As a driver in a densely populated area, I encounter this situation NUMEROUS times a year. In NYC, many highways don’t have shoulders, and therefore there is nowhere to pull over if your car breaks down or gets into a minor fender bender.

Look, I don’t have a dog in this fight, but assuming you drive almost every day and encounter hundreds (if not thousands) of cars each day you drive, then “numerous times each year” is still an outlier.
300 driving days at, say, 200 cars encountered per day is 60,000 data points. If numerous times each year means 2 or 3, or maybe even 5 to 10, that is still outlier territory when compared to 60k data points.

If you hit cars in these situations, one per year is already too many.

Only if your goal is to achieve zero accidents, which human driven cars certainly don’t. The preliminary data from Tesla shows that Autopilot driven cars get into accidents at a rate that is 50% lower than human driven Teslas. Even if you have an agenda against Tesla, it’s really hard to argue with data.

Exactly correct on tftf.

A stock manipulator who refuses to ethically disclose that he is short on Tesla while he serially spreads his self-serving anti-Tesla FUD here.

Let me know which system is designed better in your opinion (quote from TMC):

“However, people assume that Tesla Automatic Emergency Braking is the same as Volvo’s and Mercedes’.

Tesla Automatic Emergency Braking is not an accident avoidance system and does not brake to a halt. It’s programmed to reduce the speed and thus reduce the force of collision.

Meanwhile, Volvo and Mercedes Automatic Emergency Braking is designed to come to a halt regardless of what the driver does (whereas Tesla blames the driver for applying the brake).”

Agreed, there was so much the driver could have done to prevent the accident. The AP is not supposed to be 100% foolproof; to avoid trouble, stay alert, steer away from the obstacle, and the day is saved, everybody happy. So agreed, own fault, now pay the bills.

Tesla needs to be aware that winning a legal battle won’t help them win a narrative war.

Exactly. Blaming your customers never works.

Lidar easily handles this situation (and others). Tesla chooses marketing over technology, using the term “autopilot” without the equipment to back it up.

They need to be pro-active and prepare for a 60 Minutes “expose” on how rich folks in Teslas are putting the rest of us at risk. Quoting some legalese in the owner’s manual won’t cut it.

You know.. As a side note. Why in tarnation was that van parked there to begin with? Unless they had a breakdown, that is a horrible place to be, no cones or anything. Even if all of the vehicles are human-controlled, it is only a matter of time before somebody runs into it, or another accident is created as people try to go around.

After the impact, you can see there is a car with its right flasher on, directly in front of the van. Perhaps they had a fender bender shortly before the Tesla came along.

There’s blame to go around here. To me, the expected behavior of the car would have been for it to stop completely, since there was a stopped vehicle in its lane and it wasn’t given any command to go around it and the driver made no attempt to steer or merge. So the driver is at fault for not steering around the car as he should have, and the car is to blame for not coming to a stop.

Regardless of what car the Tesla is tracking or what odd situation was ahead, these systems should never allow such an easily prevented low speed collision with a stationary object directly in the path of travel.

I don’t think the camera recognized the van as a vehicle until it was too late. It had a hard time picking it out against the overpass and bridge supports. The van is painted with diagonal, asymmetric coloring, which probably didn’t help the camera’s software pick it out as a vehicle. Most vehicles are symmetrical.

I think the Teslas are purposely trying to destroy their ugly nosecones, hoping for a nose-job. 😀

Seriously though, this at least is more data. The more data the better. Sucks for the people that do the beta testing and find the weak spots in the system. Hopefully no one gets injured in the process. I like to think that autopilot has prevented more injuries than it has caused, even in this early stage.

And that is why you don’t rely entirely on AutoPilot. There has to be a human at the wheel and paying attention.
This year.
Next year… Who knows.

I’m afraid this is going to become like the media attention to the very few battery fires in EVs. It’s going to give some people — probably a lot of people — the wrong impression that the newer tech is less safe, because the media never runs stories about fires and accidents with the old tech.

I’d like to see statistics on how often the ACC (Adaptive Cruise Control) system fails to brake the car to avoid an accident, as compared to how often human drivers fail to brake to avoid an accident.

Did you realize that there are (or should be) TWO systems involved, not just ACC?

As for system two: why doesn’t Tesla’s system bring the vehicle to a COMPLETE HALT in this situation, as competitors’ systems are designed to do?


This case illustrates the fundamental problem with autopilot. People become complacent and are not capable of maintaining situational awareness and regaining control of the vehicle in the short time necessary to avoid a collision.

This problem is not going to go away.

It’s only a fundamental problem if the _overall_ effect is worse than before.

As with many individual incidents that appear in the news media, they say nothing about the overall trend.

That’s simply not true. Human psychology doesn’t work like that.

Suppose that you had a car that could instantly deploy foam throughout the car in case of a collision, which eliminated over 99.9% of collision fatalities and injuries. However, there is a 1/10,000,000 chance that the system can randomly malfunction and deploy+ignite the foam, causing you to be inescapably trapped as you quickly burn to death.

Objectively, it’s worth the very small risk of being burned to death to get a 99.9% chance of surviving any other accident. But in the real world, no one would use that system.

“Complacent” started when he saw the van ahead. Using AP, I wouldn’t leave this situation to the car. Nobody lets their AP merge them back into traffic, after tolls, etc. It doesn’t work.

What makes me have some sympathy for him, is that the car appeared to accelerate once the other car moved out of the way. He may have had less time than he thought, because of TACC’s mistake.

Running “beta”, I’d size it down to two things you worry about. Dumb things the car doesn’t see, and having to deal with wrong actions it makes and how strongly those actions might happen (steer/brake/accelerate). So far, I’ve never had an action I’d assess as too strong, or one that limited my reaction time too much. YMMV.

Tesla’s Emergency Braking is only designed to lessen the impact of a crash (i.e., slow you down), and if the driver steps on the brake pedal, the emergency braking will disable.

I see this as the driver’s fault, .. and a reminder to all of us to read and think carefully about how your car’s advanced safety features operate, .. what they will and will not do.

You were just describing the design flaw that makes the Tesla unsafe.

When crashing into a vehicle is not a system malfunction but actually how the code was written, it’s even worse.

It would be better to have no AEB system at all than one that doesn’t do what it is supposed to do and actually prevent crashes.

That is an engineering fail. Most companies hire human factors engineers and user experience engineers to avoid precisely this kind of issue. Watch this video on the kinds of design criteria that are critical: https://www.facebook.com/Vox/videos/487210271466580


The car wasn’t traveling over 50 mph. My concern is why the forward collision avoidance system (AEB) did not make an emergency brake in a timely manner.

That should take precedence over any and all computer or stupid human trips — it didn’t.

Simply put, the collision avoidance programming didn’t work.

Who did the braking? I don’t see or hear heavy braking at all.


Emergency Avoidance System certainly did NOT and that’s the FAIL.

People can say “hey, he had a foot on the brake.” Emergency avoidance should override this. Backup systems have this override, and it appears other companies do too, which makes sense.

Car preservation/collision avoidance is paramount and should override the “stupid human behavior” that we are all guilty of from time to time.

As much as I love Tesla and hope to get into a M3 soon, they failed on this one and need to correct the programming priorities. Fast and sleek autopilot is nice, but safety should be #1.

It’s a double-edged sword. If a car can brake hard on its own when it thinks there is something in the way… what happens when there is a false positive? Fine, your car brakes hard in the middle of the freeway for no reason, but the guy behind you may run into you.


Let’s say autopilot is NOT in the equation. The emergency system should stop the car to avoid the collision.

Sudden braking at times is necessary. In this case it was warranted, and the system failed to do it. Any subsequent hit from behind would have happened regardless of a sudden brake or not.

You can tell at the very end of this video that the collision cascaded to another vehicle parked ahead, too. It happens.

Regardless of the antecedent, there was a failure of the emergency collision avoidance to engage properly to avoid the collision. That is a failure of that particular system to perform the task.

Well, most likely the systems failed, but there IS also the possibility that the owner tapped one of the pedals, cancelling the emergency braking. It wouldn’t be the first time someone tried to wiggle out of responsibility, you know.


If you think Tesla’s system will stomp on the brakes and avoid a frontal collision — think again!

All Tesla says it will do is LESSEN the impact. And if your foot is on the brake it won’t do anything at all.



The programming priority and safety premise are wrong, hence the collision happened = failure of the intent of the program.

Did the program work as programmed? Probably. It’s just a BAD PROGRAM, since the intent of collision avoidance was not realized = failure.

RTFM won’t help when the programming intent is wrong.

This isn’t autopilot.

This is the failure of a safety feature that didn’t activate in a timely manner.

So why do Volvo and Mercedes (and others) emergency braking systems bring the car to a COMPLETE HALT (or at least try to) in these situations?

And what if Tesla changes the system again overnight with a new update?

As the driver in the linked video noted (see his comments on YT attached to the video), he wasn’t even going very fast when the accident happened – lucky for him.

Perhaps because every manufacturer doesn’t want to assume liability for every frontal collision …? That’s what you’re asking for, .. isn’t it?

Volvo Collision avoidance System Fail

Only, the video in the article is not about detection of pedestrians/animals but about a LARGE van on the roadside.

Wow, .. really? Ok, thanks.

I’ll go back and watch it again.

Meaning it’s a totally different problem / software AI problem.

If I were you, and if I were smart (definitely not the case), I would argue that it IS the same software/system problem… in both videos the very same issue “comes to light.”

The Volvo in that video didn’t have collision avoidance systems. The idiot driver THOUGHT it did.

Moral to the story … never put your life on the line under shady circumstances.

Another loss for the Software Jockeys. Safety is a mere 3-5 software revisions away and no telling how many crashes.

Tesla needs to stop using its drivers as guinea pigs.

“Tesla needs to stop using its drivers as guinea pigs”

What a retarded comment. Tesla isn’t forcing its drivers to use autopilot. Heck, Tesla’s plastering the ******g screen with warnings and beeps and whatnot to make sure the driver is holding the wheel and reading the road.

Seriously, autopilot is an optional feature. Guinea pigs don’t have a choice in the matter. Tesla drivers do.

Don’t confuse TeslaHater 54 with any facts. He is a long-time serial Tesla basher on multiple forums.

100% right.. Tesla IS using their buyers as guinea pigs. And you don’t have to be a Tesla hater to see that.. just have functional grey matter between your ears.

AutoPilot is for when things are boring. As soon as the driver sees anything abnormal, they should take control. Don’t push the limits of the system. Someone will get hurt.

100% right. Human fault for using the system in a way that is dangerous, i.e. when road conditions are not safe enough to warrant use: heavy rain, snow, ice, traffic, winds, dust, fog, smoke, etc.

Where is the evidence that the car was on Autopilot or TACC?

The pre-collision system is always on and produces the beeps, but that doesn’t necessarily mean that Autopilot or TACC was activated.

A while back I test drove an i3 REX with the Driver Assistance Package, which includes the ACC. Wanted to see how it worked, as I had never tried it before. I had it set, and the traffic I was behind moved over a lane. The system didn’t react at all to stopped traffic ahead and instead actually accelerated. When it became apparent that we were about to have a “bad day,” I stomped on the brakes hard to stop us before the imminent crash. In all that was a bit of good news – at least the magnificent Gen X salesman sitting in the front seat never had to look up or interrupt his texting.

The idea that auto pilot can’t or won’t fail is absurd. As the sun rises and sets, there will be both software and hardware failures with autonomous driving systems and people will die as a result. But because of the threat of lawsuits, all such failures will be blamed on the driver.

Any so-called autonomous system that requires constant human oversight and split-second intervention on the part of the driver to override the autopilot’s screw-ups is not really autonomous.

You might as well just turn the damn thing off and drive yourself. The lawyers are gonna have a field day with this one.

This looks like the perfect situation Ford spoke about when they decided to skip Level 3 automated driving and move directly to Level 4: “Ford wants to skip Level 3 because it presents one of the biggest challenges with this technology: How to safely transfer control from the computer to the driver, particularly in an emergency. It’s a balancing act, one that requires providing drivers with the benefits of autonomy—like not having to pay attention—while ensuring they are ready to grab the wheel if the car encounters something it can’t handle. Audi says its tests show it takes an average of 3 to 7 seconds, and as long as 10, for a driver to snap to attention and take control, even with flashing lights and verbal warnings.”

In this situation, all the driver had to do was touch the brake, which told the system he was taking over, but the driver did not brake enough. Just maybe the system needs to have its own fail-safe, where it won’t run into anything… period. Because no matter what the instructions/warnings about the system say, the expectation is that if its auto-braking is active, there is an implied level of confidence…

I don’t understand the point of having a full range of sensors if it cannot prevent this. It sets up some really dangerous crashes. Automated stopping has to be more aggressive if auto-pilot is to succeed. Current implementations of lidar are expensive, and obnoxious looking, so not sure if that’s the answer.

Aggressive braking can lead to aggressive rear collisions.

Exactly, it is not that simple. Some of the worst wrecks happen when someone stops so suddenly that they get rear-ended at high speed.

If someone’s braking hard, they’re doing so for a reason. If I was slamming on the brakes to avoid rear ending someone in front of me, or to avoid hitting a kid running into the road, being rear-ended is the LEAST of my worries.

The best solution to avoid being hit from behind when being tailgated is to increase the gap to the vehicle in front and give yourself more stopping distance.

Thanks for the driving tips, mom.

No auto-braking system currently brakes that aggressively because in most situations it would be better to hit the car in front of you at slower speeds than to get rear-ended at higher speeds and be pushed into oncoming traffic.

The driver is free to brake more aggressively if they feel the situation warrants it.

And so autonomous driving will deal with dangerous situations by crashing into objects? Doesn’t sound like this technology will ever succeed if that’s your answer.

It must be the van or the driver’s fault … it cannot be the faultless, ahead-of-everyone Tesla system, right?

On the other hand, I’d expect other manufacturers to have a similar experience at some point …

I would encourage everyone to try to be fair. For those that jump to blaming the driver: would you be so forgiving if this were a non-Tesla product? For those that jump to blaming Tesla: all technological advancement has risks and learning curves.

The key question is: is NET safety enhanced? Is human freedom enhanced? (Being able, at least eventually, to travel more freely even with physical disabilities or in adverse conditions.) Human drivers are not infallible either. I’d just encourage everyone to be as fair and impartial as possible, and not simply line up with your tribe/team.

Automation is tough for just these sorts of reasons. Humans, by nature, stop paying attention when something seems to be taking care of itself, especially during tedious tasks. Expecting the human to actively monitor has been a challenge in aviation, and that is a far easier task to automate than driving in traffic with other vehicles, potentially at far different speeds, just feet away.

The lesson here is that you should always take over when there is an unusual situation (stopped car, construction, bicyclists, lane obstructed, road debris, etc)

The system is called auto-pilot for a reason. A pilot in an airplane would similarly take over controls if an unusual situation was developing.

This is clearly the driver’s fault, and I would love to hear the arguments of how an emergency braking system should have been used to deal with a situation where the driver should have taken over control and merged with traffic in the right lane.

The YouTube video has been made private.
Anyone know if the video has been re-uploaded somewhere else as public?

The video shows the model S owner was negligent/at fault, .. so posting it online — not a smart move. I’m gonna guess his lawyer advised him to take it down.

Oh really, the video didn’t even show car data or his legs / hands.

Read the description posted in the article; the owner clearly thinks otherwise.

Just goes to show that true autonomous driving is a long way off.

While having the car drive and control itself is neat, if you have to constantly watch it and override it when something unexpected happens, I might as well just drive the car myself…

Whatever Tesla calls this mode of driving, it should not have used the word “auto” as in autonomous. They should have used the word “assist” instead as in “assisted driving mode”.

They use “autopilot” as in the same system as a plane. The pilot has to constantly watch it and override it when something unexpected happens, but it reduces fatigue because the pilot no longer has to deal with small adjustments.

Same idea with “autopilot” here. Unfortunately, some consumers have the wrong idea and just read the “auto” part.

Yeah even when most of America gets true autonomous driving, Europe will be quite far behind because our roads are generally far more complex and chaotic. I’d like to see an autonomous car drive through London rush hour traffic and make programmed decisions how to avoid obstructing oncoming traffic when the road narrows ahead, to avoid parked cars, to predict where a bicycle is about to hop into the road in front of you, etc.

Two things about all these autonomous systems make me nervous.

First as a motorcyclist I worry that the cameras won’t detect me properly and slow down.

Second, it’s kind of like getting a bicycle with suspension; ride for a couple of years with suspension and when you go back to not having suspension, you struggle. It has the tendency to make drivers worse by over-reliance on it.

Most likely scenario:

Driver has gotten too complacent with Tesla’s safety features.
Driver is waiting for car to stop automatically for him.
Driver hears multiple beep audio warning … so he assumes car will stop.
Driver (lightly) presses brake which disables “auto braking”.

/The way I understand Tesla’s “connected” system, .. they have probably already analyzed the whole event and know exactly what happened.

I’m sorry that it happened to be a Tesla this time that happened to suffer a failure, but I am not upset at all that these systems are not flawless. I’m planning on buying a BOLT soon, and the only feature I do NOT like about it is that all kinds of effort are being put into making the vehicle drive autonomously. When this system is perfected, those of us who enjoy just driving electric cars will be effectively prevented from doing so. Insurance rates of course will go sky high for anyone insisting on driving themselves.

It’s rather analogous to the “Smart Meter” situation in the parts of the country that have them, which incidentally is 50% federal-government funded, and many utilities (mine, British National Grid, included) want state taxes to pick up the remaining funding. Of course, the consumer in some places must pay $18 per month to have a ‘non-smart meter’ installed at their home. The obvious question is how the utility survived when all meters were ‘dumb’. Of course, PG&E as an example was forced to admit that the ‘opt-out pricing’ was not due to any actual costing, but merely this is the…

Hmmm…interesting point – I could see that problem happening with insurance rates about a decade from now, if that.

Right now, though, insurance rates might go up if your car does have any sort of partially autonomous features.

“When this system is perfected, those of us who enjoy just driving electric cars will be effectively prevented from doing so. Insurance rates of course will go sky high for anyone insisting on driving themselves.”

Whatever happens with ACC, we can be sure that the auto companies will dodge any liability or financial responsibility and say “The driver is guilty because he failed to override the autopilot.”

We can be sure the auto companies who are now pushing ACC will disavow any legal responsibility when it screws up and kills somebody.

The beloved insurance companies will of course have their lawyers crawling all over this rubbish heap trying to figure out a way to raise rates. The likely outcome will be a surcharge for anyone who still has the balls to drive down the freeway with the autopilot turned OFF.

Very interesting that the manual actually warns against this exact circumstance.

This is exactly what happens when there is a parked car on the shoulder. Other drivers in normally functioning cars will move to the right. What Tesla is saying is that Autopilot will work only if the driver in front collides with the stationary object.

You can’t expect the driver to memorize the user manual and keep reciting it while driving. Tesla’s Autopilot is a wannabe, as described by Volvo. It is trying to do too much with too little hardware. It is laughable that some people are trying to defend it in a case like this crash.

Here’s a good laugh for you:

Same as Tesla, except this was a controlled test, not real-world driving. Tesla does its testing using real-world drivers.


That was a product demonstration under completely controlled circumstances. Major embarrassment for Volvo.

… and again. The Tesla collision avoidance most likely did not fail — for all the reasons discussed, which you choose to ignore.

Not you obviously.

I always read my new car’s manual and for something complicated like AP I would consult it frequently until I had it totally down.

Next you will be saying: “now _I_ am responsible for memorizing which is the brake and which is the accelerator??”

Who reads the manual?

Better question, what’s a manual?

I don’t have time to read through all the (usual) drivel in the comments here but – how do we know AP was actually switched on in this incident?


Issue isn’t about AutoPilot here.

Issue is that the EMERGENCY Front Collision Avoidance system failed to engage sufficiently to avoid the accident. Whatever the antecedent (autopilot, driver inattention), the system failed to brake appropriately and avoid the accident.

Lol what a fail