UPDATE 2 – Tesla Model S Rear Ends Stopped Fire Truck – Drive Log Released

[Image: Tesla Autopilot crash]

It comes as no surprise that a Tesla Model S ran into another stopped fire truck, since these vehicles are not programmed to stop for stationary objects.

***UPDATE 2 – May 17 – Logs of the incident have now been released. We’ve embedded them (via Tweet) directly below:

***UPDATE – The driver of the Tesla admits to being on her phone at the time of the crash. She says Autopilot was activated.

Interestingly, if any other vehicle ran into a parked fire truck, we probably wouldn't hear a word about it. However, Tesla's Autopilot semi-autonomous technology is under much scrutiny these days after the recent Model X fatality, followed by other recent incidents. Perhaps the more interesting part is that people are reporting that it is Autopilot-related, even though this information has not yet been verified.

Related: Tesla Autopilot And Other Braking Systems Are Blind To Parked Fire Trucks

Read Also: Tesla Says Autopilot Was Activated During Fatal Tesla Model X Crash

Reportedly, the Model S was traveling 60 mph and didn’t brake for a fire truck that was stopped at a red light. Police have no idea if Autopilot was engaged, but anyone driving 60 mph and not applying the brakes when approaching vehicles clearly stopped for a red light is a major problem.

Even if Autopilot was engaged prior to the incident, it was surely the driver’s job to come to a stop at a red light … right? The police statement said:

“Witnesses indicated the Tesla Model S did not brake prior to impact.”

The crash happened in South Jordan, Utah, a suburb of Salt Lake City. Authorities are still trying to figure out the cause of the crash and have said it may be days before they have any further information to share. South Jordan Police Department Sergeant Samuel Winkler reported:

“For unknown reasons, the Tesla failed to stop for the traffic at the red light and ran into the back of the Unified Fire Authority vehicle at 60 miles per hour.”

The only information the authorities have released thus far is that the driver didn’t seem to be under the influence. At the time of the crash, the roads were wet from light rain. The collision resulted in air-bag deployment and the driver’s right ankle was broken. The fire truck driver was unharmed.

Information from the driver’s statement to the police should be released this week. We will keep you apprised as more details become available.

Sources: KUTV, FOX 13, Business Insider


324 Comments on "UPDATE 2 – Tesla Model S Rear Ends Stopped Fire Truck – Drive Log Released"


“Funny that a Tesla crashes and it makes the news”.
It’s all their fault. They shouln’t have hyped that crap beta technoly named Autopilot. And then there’s that nonsense that those cars are safest in the world. LOL

What does this wreck have to do with autopilot? I don’t think that info has been released yet, and the truck was stopped at a traffic light, where autopilot shouldn’t be used anyway.

I really hope we don’t start hearing about every Tesla accident.

Tesla rams into back of parked fire truck. No braking or speed reduction detected before impact. Driver was obviously distracted. Odds that AP was on while the driver was playing with his cell phone? Almost a guarantee. We should find out later today for sure.

The Bolt has adaptive cruise, right? So do millions of other cars. They likely would have done the same thing, and indeed have done the same thing, but only Tesla gets any attention in this regard. It doesn't take AP to do something like this. I mention ACC because steering was not involved in this case; it was more the AEB and ACC aspects of AP.

No ACC on Bolt EV.

I have it on our i3 and I don’t zone out..ever. It doesn’t always work so I pay attention.

Of course you don’t zone out. You have to steer the I3. Kind of makes it hard to stop paying attention to the road.

You could easily just keep your foot flat on the floor off to the side if you were overconfident and using it wrong, just like you could be looking at your phone with Autopilot if you were overconfident and using it wrong.

Oh, that is rough; I thought it was at least optional? Most makes offer it even on lowly models now.

It does have automatic emergency braking and pedestrian detection though. I wonder if it might have emergency braked for this parked fire truck.

As long as it's not a fireman.

Wrong, but please continue the apology tour.

Actually, all adaptive cruise controls have an issue with stopped vehicles and false positives. Do a search on the technical reason, but it's something to do with fixed objects suddenly appearing (i.e., a car ahead moves out of the way to reveal a fixed object, and whether the sensors can detect it in time). My Ford would sometimes just start braking in an open lane; I suspected it thought it saw something. Teslas do it too. Autopilot is just their brand name for adaptive cruise at this stage. IIRC, it warns you every time you engage it. Ford doesn't, but it does disable in rain, as rain blocks the sensor.

Mercedes can stop for stationary vehicles.


Only up to a certain speed, and then it doesn’t. Tesla says its system may not work with stationary objects over 60 mph.

Mercedes' system works up to 130 km/h. That's over 80 mph, another thing Tesla is behind on…

No, Mercedes SAYS it can….in most instances.

But very few vehicles on the road have BOTH auto steering and ACC. The combination of the two enables irresponsible drivers to not pay attention for longer periods and still ‘get away with it’. It is hard to disregard the road ahead of you for very long when you have to steer.

Fair point, but some of these people are candidates for “Darwin Awards” and Tesla is becoming the target even in their cases.

The boy in Ft. Lauderdale had been issued a 112 mph speeding ticket shortly before the horrific wreck… in a 30 mph zone. The 21-year-old girl in Utah admitted to using her cell phone while going through an intersection on Autopilot.

The “level 3” conundrum doesn’t look closer to a solution, anyway.

No, the reason I didn't buy a Bolt was to wait until GM equipped it with ACC. The Volt can be ordered with it, but it's hard to find one with ACC. It's hit and miss. Most Volts, Premier and LT, do not have ACC, so to get your color with ACC means a special order.

So it was the driver's fault then, right? You can bet your ass that I don't play with my cell phone with my ACC on while flying down surface streets at 60 mph.

This accident has nothing to do with Autopilot. To claim otherwise would be like saying that if he was drunk and passed out, it would be Autopilot's fault; even though the driver was negligent in being drunk and operating Autopilot on a surface street that isn't a divided highway… well, it's Tesla Autopilot's fault.

No, Autopilot or even the intelligent cruise control would have stopped the car here if either had been engaged. That part works fine. What sometimes happens is Autopilot warns you that the place is not appropriate for it and disengages, but even when that happens, it keeps the cruise control on, which would have stopped the car here. So for me this is only driver stupidity.

Even the Unintelligent Driver failed to stop.

I’ve been driving our i3 for 3 ½ years almost always with adaptive cruise control (ACC) on and it has never failed to stop for stopped traffic ahead. I am confident that our i3 would have stopped were this fire truck stopped ahead.

Twice our i3’s ACC has failed to recognize a white box truck traveling in the same lane ahead, and it occasionally brakes needlessly when driving on a windy road with vehicles parked along the curb. I am always prepared to take control, so if it fails to recognize a stopped vehicle ahead, I’ll be ready.

Box trucks seem to be a problem for Tesla Autopilot, too. Once in a while it takes a lot longer to see and track a box truck versus other vehicle shapes. I think one has to be mad not to pay attention when on Autopilot. I feel like I pay a lot closer attention to the road when on Autopilot because my mind does not have to focus on anything else.

Yes my i3 is way better at ACC than my Tesla.

And what other ASSumptions have you pulled out of your posterior this morning Mental MadBro?

You know, I think you missed your true calling in life–instead of being a paid troll by a sleazy GM Stealership you should have been an Ambulance Chaser since it better reflects your personality.

Of course, you could have never passed the Bar so I guess your chosen profession makes sense.

Just like I said, except it was a she. When will the madness stop?

The driver was drunk on the Elon Reality Distortion Field and believed that her unique high-tech gadget straight from the future could actually drive itself and might even have functional AEB because "Elon said so!". It is all the driver's fault; she was driving under the influence /s

But a modern car would normally have an emergency braking assistant.

Emergency braking assist still requires the driver to actually hit the brakes. It just pre-pressurizes and boosts pressure to the brakes if the driver makes a rapid movement off the accelerator and on the brake.

Unless you mean collision avoidance, which is a different technology.

No, the AEB that safe, modern cars have does stop automatically. Mercedes, Volvo, Subaru, etc.

And most exempt some situations, like a vehicle that was already stopped when it was first seen.

Au contraire, several of them stop automatically. Not every time, not reliably, as they need to avoid false positives where the car would emergency brake when there's nothing to stop for, which can of course create dangerous situations in its own right.

Plenty of demos have been done in car shows on telly, for many years now.

The surprising thing here, to me, is the author's assertion that Tesla's system isn't designed to detect stationary objects. I find that difficult to believe.

It was Tesla’s own statement last time this happened:

“Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”

Volvo’s language is similar:

“Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed. The driver must then intervene and apply the brakes.”


It’s not that it’s not designed to detect stationary objects, it’s that it’s designed to ignore them due to all the false positives that would result if the unreliable detection system was relied upon.

We are a very long way from having a reliable SLAM system in semi-self-driving production cars. Autopilot+AutoSteer is an automated lane-following system with a limited ability to detect and react to large moving obstacles. How large? Elon Musk says moose-sized.

It’s not even remotely a full self-driving system.

“operates between speeds of 5 to 80 kph (3 – 50 mph)”

I assure you that my Tesla stops for stationary vehicles multiple times every single day.

AEB does not equal EBA

Two different things. Like lasagna and hot dogs.

All of them fail to detect sometimes, which is why you pay attention.

I have a 2018 Leaf with it, and when testing it out (with my foot hovering over the pedal), it works most of the time.

One time, however, the ACC failed to see the bigass truck in front of me (maybe the odd trailer didn't reflect the radar well) and it *accelerated* towards it without the AEB kicking in.

It’s strange when it happens, but these systems do mess up sometimes.

Haha voted down. 6 people don’t understand the difference in braking technologies.

That is your wife, beside you, hollering “STOP!”

It’s also why our xB has the speedometer in the center… so she can tell me how fast I’m going!

Wife Pilot. 2.0 for those who have remarried.

No production car has an automatic braking system designed to react to stationary objects, period.

If they did, they would have so many false positives that the car would be constantly stopping and the driver would soon shut it off out of frustration.

As the article correctly says: “the vehicles are not programmed to stop for stationary objects.”

The problem isn’t that Autopilot doesn’t work as designed. The problem is that some people assume Autopilot is a fully functional self-driving system, despite all the very clear warnings Tesla puts in the owner’s manual and onscreen every time AutoSteer is activated.

Let’s blame Autopilot for everything! 🙄 Did you get pregnant? Blame Autopilot!


You can use ProPilot and Honda Sensing on local roads; they have stop-and-go.

They have the same exemption that it won’t see a vehicle that was stopped when it came into “view”.

Nonsense. This test is explicitly performed in the 2018 Euro NCAP and all the upmarket cars can do it.

Yes, but only at slow speeds!

70 km/h, I think. Mercedes claims up to 100 km/h to avoid collision.


No, both high and low speeds.

Mercedes says its system doesn't work over 80 mph; Jaguar says its system only works up to 50 mph.

You’ll hear about any accident where a normal car with AEB would have stopped, but this thing plowed headlong into large, solid objects.

I challenge you to find an authoritative citation backing up your assertion. No production car traveling at highway speed would have automatically stopped for a vehicle in its lane.

It’s just bizarre how perfectly ordinary traffic accidents involving Tesla cars get reported as news. If it was literally any other brand of car, we wouldn’t be having this discussion.

The anti-Tesla trolls certainly are succeeding with their FUD here! 🙁


“where autopilot shouldn’t be used anyway”

Yep. Tesla needs to geofence the solution to supported roadways.

60MPH and the driver survived???? There’s your headline!

Agreed, but I don't think a 60 mph impact into a rigid object is actually survivable; your internal organs wouldn't survive the deceleration. The damage shown in the picture doesn't suggest such a catastrophic impact either. I don't think things went down as the story suggests.

It depends on how rapid the deceleration is. Airbags and crumple zones increase the distance over which the occupant is decelerated, and therefore decrease the force of the deceleration.

One of the reasons Tesla cars are so safe is the improved front crumple zone.
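The crumple-zone point is just kinematics. As a rough back-of-envelope check (assuming constant deceleration, with illustrative crush distances that are assumptions, not measurements from this crash), average deceleration is a = v² / (2d):

```python
# Back-of-envelope: average g-load stopping from 60 mph over different
# crush distances, assuming constant deceleration a = v^2 / (2 * d).
# The distances below are illustrative assumptions only.

G = 9.81   # standard gravity, m/s^2
V = 26.8   # 60 mph expressed in m/s


def avg_g_load(stopping_distance_m: float) -> float:
    """Average deceleration in g over a given stopping distance."""
    return V ** 2 / (2 * stopping_distance_m) / G


print(round(avg_g_load(1.0), 1))   # ~36.6 g over a 1 m crumple distance
print(round(avg_g_load(0.1), 1))   # ~366.1 g into a nearly rigid stop
```

A tenfold difference in stopping distance means a tenfold difference in average g-load, which is why a crumpling front end (and a fire truck that shifts on impact) can turn a 60 mph crash into something survivable.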

And indeed, the real story should be “Driver survives after hitting fire truck at 60 MPH!”

In the modern era, all 4-star and 5-star vehicles are expected to achieve similar results.

It was a fire truck that shifted, not an immovable wall. Big difference in deceleration distance.

Autopilot was engaged at the time of the accident, driver looking at cell phone.

It has now been CONFIRMED that the car was using Autopilot. Autopilot is clearly being overhyped and oversold as much closer to self-driving than it is.

Autopilot has definitely been released; I'm not sure where you have been for the past few years. The problem with Autopilot is the name. Tesla should not have called it Autopilot, because people mistakenly believe that it is full autonomy.

Tesla says Autopilot-driven Teslas are safer than human-driven Teslas.

It does not matter how many people die in Autopilot crashes due to Autopilot's fault, as long as it's less than the number of people who would die in crashes due to human fault.

It's cold logic and sound facts, not ape-like emotions, that decide. Well, that is, if you care about actual safety…

“Tesla says autopilot driven Teslas are safer then human driven Teslas.” and big tobacco and big oil and monsanto and blah, blah, blah. think for yourself.

How about this: Stop Chasing Headlines!

It should read: “Tesla Owner Ignores Red Traffic Light, Survives Crash Into Stopped Fire Truck!” Sub head: “Only Injures Ankle, While Half Way Into Truck!”

Or: “Stupid, Ignorant Man Somehow Able To Buy Tesla, Unable to Make Next Payment, Crashes Car To Avoid Embarrassing Loss Of Job!”

Are Autopilot Teslas safer than similar cars from other manufacturers (ICE or not) that do not tout "Autopilot"? That's the interesting question.

More interesting. But really, if the question is whether AP works, scientists should study the data to control for whatever is possible to control for. I imagine the best would be to compare Teslas on the same stretches of road with and without AP engaged, over a long time period with varying conditions, and look at accident rates as well as the seriousness of the accidents. But it's always hard to infer causes. For example, it could be that the "AP disengaged" group performed better, but because they are more careful and drive very slowly! Perhaps the "AP engaged" group fared worse, but would have fared worse still without AP?! How can you know for sure? (They'd probably use personality tests or some other method of measuring risk-taking tendencies, but it's easy to see how this can quickly become rather demanding in resources, and thus expensive.)

Tesla however hasn’t let anyone review the data, as far as I know. Maybe nobody’s asked either. But I do think it’s unreasonable to claim to know the answer, especially if you’re not at Tesla and thus must be doubted by default on this matter.

Researchers did ask to review Tesla's data regarding their Autopilot claims. I don't know if it was NHTSA or NTSB, but they checked with Tesla and Tesla refused to release the information.

“Tesla says…”
Stopped listening to you after that. Anybody who quotes a Tesla statement as 'fact' needs to review the history a bit. You can trust what they say as much as you can trust FB saying they didn't use your data without your permission.

What Tesla says about their own cars and their own safety systems is certainly more relevant, and better informed, than some random guy posting a comment to social media.

That doesn’t mean we should believe everything Tesla says, but ignoring what the best-informed source of info has to say would be pretty stupid.

Ok, I might be safer, but what about the crew of the firetruck? Tesla needs to demonstrate that autopilot works on stationary vehicles.

What a strange thing to say. It’s kinda hard to demonstrate the car doing something it’s not designed to do! It’s not designed to jump over rivers, either.


It’s all cold logic and sound facts until your car is the one that suffers an autopilot failure.

First of all, “Tesla says” isn’t good enough. Big tobacco said most doctors recommend Camel cigarettes. And doctors are on average more driven by desire to do good than business people (who run companies, including car companies).

Second, even if the numbers were known, it’s not necessarily that simple. AP and humans may kill in very different ways, putting different people at risk. The drunk driver who runs off a cliff in the middle of nowhere in the middle of the night poses a different risk to the car with a sober but absent-minded driver speeding through the residential zone just after school ended.

And even if AP were so much better that nobody would want to ban it, that doesn't mean it can't be improved. If humans are much better in some situations, and we can force them to drive themselves in those situations, your "cold hard logic" says we not just should, but must do so.

Trying to equate Tesla with Big Tobacco?

WOW! Tesla bashing has reached a new low. Congratulations on your “achievement”, I guess??? 🙄

If a close family member died in a crash verifiably caused by AP, are you saying that you would not find fault and not sue Tesla? Would you accept that your loved one was still statistically safer and conclude they were just unlucky? I suspect not – despite your logical approach.

To be clear, accidents aren’t “caused” by AP. Unless AP ripped the steering wheel out of someone’s hands and drove off a bridge.

Not preventing an accident is very much not the same as causing an accident.

Yes, if a family member died in an accident that *they* caused and AP didn’t prevent, I would not sue Tesla.

This driver crashed his Tesla at 60mph directly into a functionally stationary object, 60-0 in 0.05 seconds, and limped away.

Seems there’s some evidence to support the claims of safety.

At least the battery didn’t catch fire! As the truck’s rear end sits higher than a regular car, there was likely no battery intrusion despite the huge impact.

Nope, it didn’t catch fire unlike your mental trolling here.

Safety would have been if the car would have recognized that it was about to hit something and brake, just like any normal vehicle with AEB would have done.

If the car crashes you have missed quite a few steps on safety already.

Naw! It won’t Brake, if you keep pushing the Go Pedal! Check my revised headlines commented above for other ideas!

But a Tesla is not a self-driving car. Hopefully, all cars will one day be self-driving and save more lives with fewer injuries.


Your expectations do not match contemporary reality. In general, cars are not equipped to do what you say.

Cars as expensive as this Tesla are.

You claiming it’s so does not make it so.

Give us an authoritative citation that shows any production car's AEB system stopping a car moving at highway speed in response to a vehicle parked in its lane.

AEB systems are not designed to do that.

“…brake, just like any normal vehicle with AEB would have done.”

I challenge you to find any authoritative source stating that AEB in any production car will work at highway speed in preventing collision with a vehicle parked in the lane.

The level of ignorance being expressed here regarding automatic braking systems is almost breathtaking! If you’re going to trust your life and limb to a car’s driver assist systems, then you should have a better understanding of how they work.

Why do you keep spamming that video? There's no evidence that it will avoid a collision every time; no statistics from real-world use, nothing.

And who are the idiots rating you up?

I’m impressed when you can limp away from an accident like this one!

All modern 4 and 5 star rated vehicles are expected to deliver similar results.

60-0 in 0.05 seconds? Let me make a wild guess: you’re just making a wild guess.

At this rate, the government will have to force Tesla to change the completely misleading Autopilot name. They are too stubborn to do it themselves, that’s for sure. Elon will still go out of his way to claim the FUD that AP is safer than no AP.

You claiming that something about Tesla’s cars is “FUD” is all we need to know that what Tesla says is true.

No Pushi, Bro1999 is definitely onto something here. Now, myself, I think something is happening at Tesla; I don't know whether for good or bad:

Item: Their design engineer is 'recharging' (extended leave).
Item: More than the normal turnover is happening with upper management.
Item: Why are no other EVs catching fire at the rate that Teslas are?
Item: Now Sudden Vehicle Acceleration is being reported with the Model 3. Why aren't other EVs reporting the same thing?


Of course it is AUTOMATICALLY the driver's fault, even though we are told that the average Tesla buyer is of 'above average skill and intelligence' (and also more well-heeled).

I know that legally, the $5000 (Or whatever it costs these days) AUTOPILOT is no such thing. But I can’t help believing that in normal daily use, many otherwise intelligent Tesla owners seem to think that the Autopilot is indeed an Autopilot. I have no idea where they would have gotten that idea.

I think, unfortunately, it’s gonna take an AP-enabled Tesla with the driver playing on their phone being involved in an accident where innocent bystanders are killed (occupants of vehicle being crashed into, etc…) before anyone does anything, gov’t or Tesla. That’s usually what it takes for people to finally have the political cover to take action. Sad, but true.

No, but when it becomes a school bus full of children, that's normally the death knell requiring a re-evaluation of the technology.

Why do you think crash avoidance technology in 18 wheelers is almost standard equipment these days?

Where was the term Autopilot most used before Tesla came around, genius?

Airplanes on autopilot don’t avoid objects, don’t land, etc.

There’s no legit reason to think Autopilot means you don’t have to pay attention and supervise.

In bad weather passenger planes have to be landed by the Autopilot. Humans are not allowed to perform the landing.

Onto a verified cleared runway, thank you very much

Another dim-bulb insulting me who is clueless. Autopilots in planes function as the pilot expects, and he is fully aware of their limitations. And it essentially does, in most cases, free the pilot from actively paying attention for an extended (i.e., 30-second) period. This case, amongst several, proves you cannot be distracted for 30 seconds.

You’re the halfwit with no concept of what autopilot means.

Autopilot on planes has NEVER avoided obstacles. If you set it on a path where there is a stationary object, the plane will fly right into it. If you autoland on an occupied runway, the plane will smash right into another.

You poor people are so mean-spirited. I've seen enough AUTOPILOT-ENABLED collisions to obviously see what the drivers THOUGHT the car would do. Only legally, it does not do that, and that is the only dung hill you guys have remaining to stand on.

Don't you have anything better to do than throw your silly crap around? I like talking about electric cars and the technical details surrounding them. I'm not here to placate malcontents who have no sense of decorum. When people insult me here, seeing as this is the 'neighborhood' I'm in, I just insult them back. Rather like when someone throws a snowball at me, I don't let others gang up on me; I throw the snowball back like any Red-Blooded American would.

Perhaps that is why I don't comment here as often as I used to, since it's becoming much too juvenile. And: Congratulations, Mint. It's because of hateful people like you that many people decide against EVs, if they think that you are representative of the creeps that drive them. You leave that sorry an impression on newcomers. I've only driven EVs for the past 7 years. But one… Read more »

Here’s’ the core reason why–

https://www.cnbc.com/2018/05/14/tesla-engineers-wanted-more-sensors-on-cars.htm

Bingo… Supercruise equipped cars have not been banging into medians and firetrucks.

Wrong. At highway speed, such cars are designed to react to moving objects, not stationary ones.

You’d better learn more about how Supercruise works if you’re gonna drive a car equipped with it.

Supercruise does not work in areas with stop lights, and it does sense stationary objects. Supercruise also tracks your eyes constantly so you cannot look away for long, and it will slow the car…

Supercruise cars are also equipped with LIDAR… something Elon is on an island in saying is not necessary for self-driving.
I bet he questions that thinking every time another Tesla crashes into another concrete barrier/fire truck. Lol

SC does not have on-board lidar. It uses lidar-generated static maps. Way different.

That’s because you can’t enable super cruise unless you’re on a highway.

Oh and there’s only like 100 cars out there with super cruise, not 300,000. So there’s that.


There’s more than enough blame to go around. Yes, the driver was an idiot and deserved that broken ankle, and I hope they have to pay for the damage to the fire truck. But I still can’t fathom why the radar system in the Tesla couldn’t see a stationary bloody fire truck in its way! I mean who programmed this thing, Tyrannosaurus Rex? The whole job of all of the car’s sensors is to continuously build up a 3D model of the environment and to decide where to steer. If the car decides, based on its model, that there’s no possible path forward, it should brake for all it’s worth before it hits anything! Anything less than that is criminally negligent engineering, IMHO. If the system can’t do that, they should pull the product from the market until it can.

“The whole job of all of the car’s sensors is to continuously build up a 3D model of the environment and to decide where to steer.”

That sort of SLAM system is what will be needed for Level 4 (or perhaps even Level 3) autonomous driving. Nobody has that in their production cars yet. Waymo apparently has it in their test fleet.

Tesla Autopilot does not include a SLAM system, any more than any car equipped with TACC or ABS. Autopilot+AutoSteer is an automated lane-following system with a limited ability to detect and react to large moving objects; nothing more.

You’re still assuming AP was even enabled. Let’s wait for some facts before the finger pointing starts.

In the end assistive technologies are just that. Nothing is replacing a driver. Responsibility for safety falls on the driver first, by law, in every state.

So the guy crashes at 100km/h into an immovable barrier and only breaks his ankle. Seems pretty damn safe to me.

A fire truck immovable ? LOL

It is a she. And that is expected for all 4- and 5-star rated vehicles with modern safety features.

The guy that crashed into a wall at high speeds in a Tesla burned to death in his car (along with his passenger) because the battery exploded into flames immediately after impact.
I know, with all these Tesla crash/battery fire articles lately, it’s easy to get them confused!

Do you know this for a fact or are you just making a false assumption?

Like you did with the previous accident, where the driver was extracted from the car before the fire started, despite you claiming instant fire?

To say it’s “all their fault” is obviously taking it much too far. But does AP tech cause some accidents that otherwise wouldn’t have happened? Almost certainly. Does it prevent far more than it causes? Tesla claims so, but oddly enough shares its patents more willingly than the raw data, so there is no independent third party analysis.

Doesn’t matter if AutoSteer was on or not, just Blame Autopilot for everything! Yeah, that’s exactly what the serial Tesla bashers do.

Did you get pregnant? Blame Autopilot!


I posted this comment two days ago and got a thumbs down under the first deliveries to Canada post with no comment apart from the text below.
Oh no not another Tesla crash:

It makes news because all the other cars running into stationary objects are driven (badly) by their occupants. Tesla is claiming 'Autopilot', people are taking it at its word, and the system crashes.
Also, how is "vehicles are not programmed to stop for stationary objects" possible? Such a programmatic lack of functionality is ridiculous.

Autopilot does not stop the car for stationary objects; you can read it in the Tesla manual. The problem is that ignorant people like you buy the cars and then do stupid things.

Russian troll, the elections are coming, move on.

“Interestingly, if any other vehicle ran into a parked fire truck, we probably wouldn’t hear a word about it.”

*We* wouldn’t hear about it, but the driver, if they survive, would have a court hearing, and their insurance would likely spike if they were found negligent. If they had done that after knowingly impairing their senses with alcohol or drugs, they could be looking at a crime and jail time.

These were issues that were hotly debated in the late 1990s, when the first autonomous vehicles made their appearances at CMU and Stanford campuses as research projects. Liability and inability to attribute cause were the number one reasons why the technology did not move forward for civilian projects back then. The situation is the same today, except that in the 2010s, we reward the kooky marketing genius who rushes headlong with a product and punish others who are focusing a lot more attention on waiting till the underlying technology matures.

This will all take one big lawsuit to put to rest. I’m so not a fan of the way autonomous tech has been marketed these past couple of years. This is not a problem unique to cars. As Ali Rahimi’s… Read more »

Nope. It went nowhere because computers were inefficient and slow back then. Even nowadays, chips for autonomous driving are top of the shelf and packed with the newest tech and… still hardly capable of braking for a fire truck. 😉

I remember when the folks at the local Tesla Service/Sales Center were egging me to sell my Model S and buy a new one with autopilot. The sales team is notably young. I’ve sold a lot of Tesla vehicles over the years and it hasn’t been because of autopilot, once. Roughly speaking, I know how this AI works. Or that AI. Or Waymo’s. And yes, it is exactly because of the DARPA Grand Challenge. I thought it was quite impressive that some geniuses from Stanford got a CUV to cover a medium distance in a couple of hours, off-road. It was adaptive, but static problem-solving. There was LIDAR. There were cameras and emitters. When all this autonomous drive tech. started leaking onto the roads, I had considerable doubts. The one thing that stood out as a possible win was parallel parking because that is again a static problem with adaptive systems. Then, famously, someone took a BMW i3 from a dealership and rashed the rims. Two things stand out: the geniuses underestimate the human brain’s adaptive capabilities in dynamic situations, and I have statistically underestimated how many drivers are distracted or impaired and plow into solid objects. I hope everyone…
Strongly agree. I’m a looooong time programmer, and I’ve been saying for some time that AV tech is and will continue to be the most oversold and misunderstood (by the consumers) feature in a mass market product in some time. As I keep saying, I live in the NE US, and within any 6-month period I typically encounter numerous events on highways that any current AV couldn’t handle well and in some cases not safely. What any car can do today without [1] a lot of V2V information sharing and/or [2] heavily instrumented (not merely properly marked) roads is severely limited. Talk to average consumers and you’ll find that they think something called “Autopilot” can do a hell of a lot more than it can. Many I’ve asked about this literally think it’s very close to “I get in my car, tell it to take me to work, and I can browse the ‘net while the car drives.” And the argument that, on net, it’s safer than the idiots we have piloting cars now, misses the point that heavy reliance on AV tech before it’s ready for prime time, especially in a mix of human and computer piloted vehicles, means…

The best, and best-informed, comment in this discussion. Thank you! 🙂

What would it take to add a V2V sensor in all existing vehicles so the new technologies could recognize their motion or lack thereof? Seems like the only way out to me.

I have called it a Nervous Teenage Driver too, but not Quirky, just not Confident, or really Capable, unless things are consistent and familiar! It still needs a driving instructor watching very closely (the actual driver)!

“Tesla’s AI drives like said teenager from the West Coast who has no concept of snow, large autumn leaves, or heavy rain.”

This is just contributing to the general misconception that self-driving cars perceive and understand the world like humans do, or that they ought to.

Autopilot+AutoSteer isn’t even remotely like a 15 year old human driver with a learner’s permit. There are a lot of things which a normal teenager driver, no matter how poorly trained, can do quite easily which are impossible for Tesla’s very limited semi-self-driving system.

That 15 year old, if he even glances at the road, will see a fire truck parked in his lane as a dangerous obstacle to progress, and will instantly understand that he has to control the car (with steering and/or braking) to avoid a very dangerous collision. Autopilot isn’t designed for that, not even remotely.

You’re right, I’m not giving teenagers enough credit. However, when you drive with autopilot, you have to behave like there’s a teenager at the wheel and it’s your car.

I do feel that in this particular accident, the autopilot must have frozen in fear and forgotten where the brake pedal was. Or, as I said, its collision eyeballs were below the front license plate, didn’t have sufficient visibility, and completely missed the fire truck in the FOV.

What bothers me the most is that Tesla will likely fix this issue with another algorithm for “trucks with very highly-positioned rear bumpers” rather than address the overall intelligence issue. How would a self-driving system fare against a station wagon with a 12′ ladder sticking out of the tailgate?

“…we reward the kooky marketing genius who rushes headlong with a product and punish others who are focusing a lot more attention on waiting till the underlying technology matures.”

If we wait until self-driving cars are “perfect”, then we will be waiting forever. There is no logical or sensible reason to wait until tomorrow, when even limited self-driving systems can be saving lives today!

Go Tesla!

“It comes as no surprise that a Tesla Model S ran into another parked fire truck since the vehicles are not programmed to stop for stationary objects.”

Why are they not programmed to stop for stationary objects? If the stationary object is in its path, and it is not steering to avoid it, stopping seems like an obvious programming requirement.

I think it is more complicated than that and comes down to avoiding slamming on the brakes every time you drive under an overpass or drive by a parked car on the street. Those systems need development still, pretty much every auto braking system has those same limitations.

AEB works pretty well already; this would not have happened in a Volvo or Mercedes today.

Exactly. The Tesla Kool-Aid drinkers have no idea how good the Volvo, Cadillac and Benz systems are.

This is a very common accident scenario – and real car companies have solved it.
Meanwhile Mr. Tweeter continues to deflect.

> Exactly. The Tesla Kool-Aid drinkers have no idea
> how good the Volvo, Cadillac and Benz systems are.

Volvo, Cadillac and Benz systems are nowhere near as good as Tesla Autopilot. I am on autopilot 90% of the time in my Tesla. I don’t think any Volvo, Cadillac or Benz driver can claim that.

But they can claim that their cars don’t crash needlessly into stationary objects.

> But they can claim that their cars don’t
> crash needlessly into stationary objects.

And you know that they don’t because you have not seen it in the news?

That is because they have AEB.

Well, I remember the YouTube video from a Volvo demonstration of AEB. Poor dude got run over right in front of the press. And somehow that still didn’t make news and only spread slowly on YouTube years later.

I’m pretty certain there have been failures in these cars as well. But maybe their drivers pay more attention. This appears to be a uniquely American problem, and I see plenty of YouTube clips in which Americans seem to think their Tesla can drive itself really well. Maybe it’s the big screen and hi-tech brand image? If people really believe AP is reliable, it will be much more dangerous than if they had a more realistic idea of how sophisticated it actually is, and how counterintuitive its weaknesses may be. Going head-on into stationary objects is after all something humans find easier than staying in the middle of the lane (though I’m told it doesn’t do that so reliably either).

It was with a car that did not have AEB at all. It didn’t spread because it was not Volvo’s fault, but that of some idiots trying to test a function without actually buying a Volvo that had that function. Brilliant. 🙂

Read the description of the video. AEB had been disabled on that car.

“But they can claim that their cars don’t crash needlessly into stationary object.”

Are you seriously trying to claim that cars from those auto makers never crash into stationary objects? Are you actually that clueless?

If you are, then you should never be allowed behind the wheel of a car!

No. The Mercedes S class can only avoid the impact altogether at speeds up to 100km/h. Above that it might crash into the stationary object but at greatly reduced speed.

How exactly do you know how good these systems are? Where are the tests and statistics?

I have a 2018 Nissan Leaf, and its AEB once failed to identify a truck I was nearing. This isn’t a Tesla-only issue.

Show us some data or we’ll all know this claim of yours is nothing but BS.

Tests have shown this…when?

This accident, and it remains to be updated as to whether or not the system was engaged (but let’s assume), shows the same issue that cost Mr. Brown of Florida his life: the sensor is quite low for radar, below the bumper. I really think the analytics should have picked up the relative shape of a high-backed vehicle. Yes, it’s not flat and vertical. Same with garbage trucks. The AEB capability is relying on one set of sensors and not all of them and is not aware of its own collision box. If you shoot a beam out from the top of the car and head straight forward, the distance to the target is, say, 20 feet. If you shoot a beam from eight inches off the ground, you might go all the way to the rear axle. The ultrasonics should have picked up the shape, but that’s not in range for a sudden stop. The camera also should have picked it up, but if they aren’t executing stereoscopic algorithms, that ain’t gonna get you useful shapes at large distances. Elon prattled about how LIDAR is unnecessary, and while true, that’s only conceptually so. Your code does not reflect…

That has nothing whatsoever to do with this accident. The low-resolution Doppler radar sensors which various auto makers use for their AEB systems, and Tesla uses for Autopilot, are not designed to detect stationary objects.

What part of Tesla’s autopilot is designed to detect stationary objects? And what is the doppler radar designed to detect? There are three sensor suites in Tesla autopilot.

“AEB works pretty good already, this would not have happened in a Volvo or Mercedes today.”

Flat wrong. If there is ignorance of how Autopilot+AutoSteer works, it seems there is even more ignorance of how AEB works… or rather, doesn’t.

You know, you’ve posted this link 8 times already. I think that’s enough (-;

The only ignorance is the one you are showing. Why are you so desperate to defend Tesla when their errors are so blatantly obvious?
And why are you trying to put down their competition when you know there are things they do better?

It’s just sad.

No, he’s right. There is a ton of ignorance on how these braking technologies work on this thread.

Literally every manufacturer can “do better”, but safety is first and foremost the onus of the driver, every time. If a manufacturer complies with the law on safety they’ve done what they need to. Any manufacturer that goes above and beyond the law in the name of safety should be applauded.

It’s not “an error”. The driver made the error, full stop. Get over it.

That’s where Lidar would help to let you know if the thing in front of you is actually in front of you and at what distance and height.

Yes, it must be because it’s too hard. But what does that tell you about the state of the technology?

I really think AP relies very heavily on mapping and recognizing the lane markings. It’s not all it ever does – there was that happy story where a Tesla saw trouble two cars ahead on the highway and began braking before the car directly in front, like a vigilant driver. But that was perhaps AP 1..?

As I keep saying, it is obvious this system is not ready for prime time. Legally, Tesla says the same thing, since you have to pay SERIOUS UNDIVIDED ATTENTION every second you drive the car. Sounds like the way I drive my car, and I saved $5,000.

With the crash-avoidance systems on Class 8 trucks, the object has to be in front of you in your lane; the system tracks as far out as 400’ and applies braking accordingly.

You can drive past overpasses and parked cars all day long, but if a 4-wheeler pulls in front of you too soon after passing you, you are going to be in automatic braking mode.

I don’t necessarily believe that it isn’t programmed to stop/slow down for stationary objects, considering Autopilot ‘sees’ vehicles in front of it that slow to a complete stop, and responds in kind.

Yeah, this is more like I would’ve expected too, John. Autopilot should respond to traffic jams, etc. and slow down and/or stop, like what you’re suggesting. To me, it does come as a surprise that another fire truck was rear-ended because of this.

These systems can all track a moving vehicle to a stop, they have issues with stopped vehicles that weren’t being tracked. I think primarily from lack of resolution on Radar, Lidar can resolve the detail or cameras could too. The image processing has to see the truck.

Mercedes has solved this problem, and their system works from 30 to 200 km/h.

This is not true with Tesla Autopilot. It stops for cars stopped in your lane, no problem.

Cars but not fire trucks?

> Cars but not fire trucks?

It is possible that there may be a problem with recognizing certain unusually shaped vehicles. I have noticed that my Tesla has trouble with box trucks and landscaping trailers, although it does recognize them eventually, just not as early as the vast majority of other shapes. I do not remember if my Tesla has ever had to stop behind a fire truck, though.

Also, we do not even know whether the autopilot was on in this case. We also don’t know if this particular car had up-to-date firmware. Tesla has made a tremendous amount of improvement to the autopilot over the last few months and even the last few weeks.

It should be able to avoid driving into a stationary object no matter what its shape is.

Perhaps it only recognizes SEXY cars.

No, it stops for cars which Autopilot “sees” as slowing and stopping. It’s the change in velocity which Doppler radar is designed to detect.

Not vehicles parked in the lane, which are ignored just like all stationary objects are ignored.

The problem is that non-moving objects, especially big ones, can also just be background “noise”. So if adaptive cruise control hit the brakes every time it saw a stationary object, you’d have tons of false positives.

I wonder why AEB didn’t try to slow the car down a bit, though. Usually those systems have their own radar input, so the car should have hit the brakes. But not even AEB is foolproof.
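The clutter-filtering problem described above can be sketched in a few lines. This is a simplified illustration of the general idea (filter radar returns whose ground speed is near zero), not any manufacturer’s actual code; the names and the 1 m/s margin are assumptions for the example:

```python
# Each radar return is (range_m, closing_speed_m_s) relative to our own car.
# A return closing at exactly our own speed is stationary in the world frame
# (overpass, sign, parked truck) and gets discarded as background clutter.

CLUTTER_MARGIN = 1.0  # m/s; assumed threshold for "looks stationary"

def moving_targets(own_speed, radar_returns):
    """Keep only returns whose ground speed is clearly non-zero."""
    return [r for r in radar_returns
            if abs(own_speed - r[1]) > CLUTTER_MARGIN]

own = 26.8  # ~60 mph in m/s
radar_returns = [
    (120.0, 26.8),  # stopped fire truck: closes at our full speed -> filtered out
    (80.0, 5.0),    # slower car ahead: small closing speed -> tracked
]
tracked = moving_targets(own, radar_returns)
```

The stopped truck is indistinguishable, by closing speed alone, from a harmless overpass, which is why such a filter trades missed stationary obstacles for fewer false alarms.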

“Autopilot” should at the very least be geofenced so you can only use it on highways. It’s really not intended to handle stop lights and so on, so why even allow anyone to use it in those environments?

ACC isn’t geofenced either, and it has the same limitations as Autopilot on surface streets full of stopped cars that aren’t already being tracked. ACC can’t reliably detect stopped cars that weren’t actively being tracked while moving along with traffic.

This accident happened on Bangerter Highway which has a few stoplights on it. It’s a dangerous road.

Especially if you are in a firetruck with a Tesla behind you…

What would happen to any other car and its occupants if it ran into a parked fire truck at 60 mph?

As long as it’s not a 1990’s KIA, they’d probably survive too assuming everyone was wearing their seatbelts. 😉

If it had AEB, the occupants could walk away.

Nothing, because the truck will absorb most of the energy.

A brick wall will absorb most of the energy when a car hits it at 60 MPH, too. Doesn’t help any at all, now does it?

The question isn’t what absorbs the energy. The question is how abruptly the human body is brought to a stop, and whether it encounters something hard or sharp along the way.

Air bags and crumple zones reduce how abruptly the human body is brought to a stop in an accident. Tesla cars, with their superior front crumple zones, are among the safest cars on the highway when it comes to front-end collisions like the subject of this article.

“A brick wall will absorb most of the energy.” This shows you know nothing about STRAIN ENERGY. A brick wall (that remains intact) will absorb extremely little energy. Clueless about physics and engineering. In my younger university years I was given plenty of homework problems to calculate the strain energy received (as well as the resultant heating) by normal mechanical items such as reciprocating compressors, vibration tables, etc.

It is unclear what damage the fire truck sustained, since I haven’t seen detailed pictures of the contact area. But any buckled parts absorbed much energy. Presumably the fire truck didn’t move much, so there is little kinetic energy transferred to the fire truck, but a considerable amount of strain energy would have been absorbed by the fire truck’s ‘flexing’ tires due to the stress imposed.

More PUSHI Drivel. “The question isn’t what absorbs the energy.” and, “Air bags and crumple zones reduce how abruptly the human body is brought to a stop.” While doing this they are converting kinetic energy into heat, and contrary to your nonsense, are indeed absorbing the energy.

More idiocy: The fire truck didn’t move much. Therefore, assuming a ‘smooth deceleration’, ABSOLUTELY ALL of the Tesla’s kinetic energy (and that of the woman traveling the same speed) had to be removed in a split-second. At the ‘completion’ of the collision the kinetic energy of the TESLA is effectively zero.

Dim-Bulb lesson #1 : If the car or fire truck DOESN’T absorb the energy of the High-Speed woman, then she herself is the only thing left to absorb it. If it is of any significant amount, she will die.
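To put a rough number on the energy these comments are arguing about, here is a back-of-the-envelope calculation. The ~2,100 kg curb weight is an assumed round figure for a Model S, not crash-report data:

```python
# Kinetic energy of a ~2,100 kg car at 60 mph (about 26.8 m/s).
# KE = 1/2 * m * v^2; all of it must be dissipated (crumple zones,
# airbags, the truck's structure) if the truck barely moves.
mass_kg = 2100
speed_m_s = 26.8
ke_joules = 0.5 * mass_kg * speed_m_s ** 2  # roughly 0.75 megajoules
```

Whatever share each structure absorbs, the total is fixed, which is the point both sides of this sub-thread are circling.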

I find it unfortunate that folks are so quick to dismiss a technology based on the exception and not the norm. Used properly, Autopilot is an outstanding tool. Used PROPERLY. I say that because it still needs the driver to pay attention. Just like regular cruise control, and adaptive cruise control. If a driver can’t handle the responsibility then they can simply disable the feature under the ‘controls’ within the car. That being said, Autopilot is the first real big step toward the inevitable autonomous driving. As I’ve said before, perfect is the enemy of good, and it isn’t highlighted any more boldly than with Autopilot. Every technology has to endure growth to reach the ‘perfect’ stage; the process is the same every time. Unfortunately, naysayers do not want to entertain the benefits that will come from autonomy and give it time to reach maturity, only focusing on the downside, which again, like all technologies, exists. But the upside is conveniently left out, which includes preventing accidents like I witnessed last December when a geriatric man entered the wrong direction on the highway I was traveling on and took out the vehicle directly behind me. Never made the news. But the…

Market it as “adaptive cruise control” the way the Germans do and there will be no issues. When you market it as “Autopilot”, you can’t fault people for not reading the fine print. It’s irresponsible marketing.

While I agree that the term can be misleading to some (or many), I feel the obligation lies with the purchaser of the product to understand how the product works. Maybe a solution would be having some kind of ‘training’ that the purchaser has to complete or read through before Autopilot is activated within the car. If it’s a matter of simple ignorance of how the system works and its limitations, maybe that should be addressed. Although it’s already an unwritten understanding that vehicles are massive objects that are very dangerous to control at times, which many folks already dismiss.

Tesla already does this. The owner is required to read and check a disclaimer the first time before autopilot is used. Usually happens at the time of pickup and a rep goes over it with the customer. Then when it’s activated each time there is a screen message that reminds the driver to keep their eyes on the road and hands on the wheel at all times.

Obviously, this is a case of
“Not Good Enough!”

Tesla could, and I think (and have shared before) SHOULD, have a “Tesla School”, with “Autopilot 101, 102, 201, and 202” level courses! You could not even order Tesla Enhanced Autopilot until you pass “Autopilot 101”, and couldn’t order Full Self Driving until you pass “Autopilot 102”!

Then, to take delivery, you need to pass “Autopilot 201” for the ACC or Enhanced Autopilot feature, and must pass “Autopilot 202” if you ordered Full Self Driving (which really is the biggest naming problem)!

Each course covers classroom time of 2-4+ hours, with lectures, videos, manuals, and tests! That is then followed up by 3-6 hours of in-vehicle training, in separate 1-2 hour sessions per day, allowing time for stuff to sink in, or for you to show you can’t remember “$hit!” A passing grade is only given for a score of 100%, but you can retake the course, or take it with different instructors!

That’s not what I meant. Having a basic tutorial on do’s/don’ts helps Tesla achieve ‘due diligence.’ I always love folks who enjoy smashing courteous discourse.

It is unwise to make the lowest common denominator the standard – that way progress would be much harder. By your logic, cruise control is dangerous too – it doesn’t control my “cruise”, it simply keeps my speed at a level!

You actually can fault them. Because when you buy a Tesla, you don’t get autopilot activated until you are told its limitations and read and check the disclaimer. Then every time you activate it on the road, the screen gives you a warning reminder to keep your hands on the wheel and pay attention to the road at all times. Then if you don’t, you get audible and visual nags to remind you to do so.

So you can fault people for willfully ignoring the big flashing print that reminds them constantly that they are using it wrong. These are the same people who fasten their seatbelt behind their back so they can avoid wearing their seatbelt and not have the warning ding sound as they drive

The name is maybe giving people too high expectations, yes. But it is actually a bit like an autopilot in an aircraft – which also requires constant human monitoring.

Nissan calls it ProPilot, which seems to me to suggest rather more capability than autopilot. But they’ve been more restrictive about letting go of the wheel and “nags” people when they are abusing the system, telling them they’re not using it correctly. I’ve seen multiple YouTube reviews already where Nissan gets criticism for the system requiring the driver to have at least one hand on the wheel. To my mind, this shows that people do not really want to use the systems the way they should be. And maybe that means the systems aren’t really good enough for public roads yet.

When it comes to road safety, it’s not what happens when everything and everyone is working according to spec (and make no mistake, that’s important), but what happens in the real world.

John said:

“Hopefully the powers that stand to ultimately lose because of the advances in autonomy (insurance carriers, trucking industry, taxi lobby, the uninformed/ignorant)…”

John, I appreciate your very thoughtful and perceptive essay. But I think you are mistaken to say that insurance carriers oppose self-driving cars. As I understand it, insurance companies are among the forces pushing hardest for development and deployment of autonomous driving tech.

Interesting note: The Utah Governor just signed into law H.B. 369, which now allows the direct sale of electric cars by manufacturers of electric cars.

While CT rejected similar legislation that would have granted Tesla the ability to sell cars directly in that state.

So would that apply to Nissan and GM and others with EVs, or is this a handcrafted Tesla-only law?

Does this state clearly “Battery Electric Vehicles”, or “BEV’s” 😀, or is it just vehicles with batteries and a plug?😳😱😭

So how is this different from when I’m following a car in front of me and that car stops, in which case the autopilot is supposed to bring me to a complete stop behind it? Just that the autopilot ‘saw’ that car coming to a stop? If that’s the case, they should really “PUT THAT ON THE BOX”.

It is on the box. Problem is people don’t read it. Tesla should geofence it to supported roadways and conditions.

Autopilot limitations from the manual:

“Traffic-Aware Cruise Control is a beta feature.
Traffic-Aware Cruise Control is primarily intended for driving on dry, straight roads, such as highways and freeways. It should not be used on city streets.

Warning: Do not use Traffic-Aware Cruise Control on city streets or on roads where traffic conditions are constantly changing.

Warning: Do not use Traffic-Aware Cruise Control on winding roads with sharp curves, on icy or slippery road surfaces, or when weather conditions (such as heavy rain, snow, fog, etc.) make it inappropriate to drive at a consistent speed.

Auto steer is a beta product.

Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic. Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present.”

Hence why I think Tesla School needs the AP courses I mentioned above, as requirements both to order, and to take delivery of, cars so equipped!

A safety beta feature. Doesn’t sound comforting.

“Tesla should geofence it to supported roadways and conditions.”

Say what?

How are you gonna “geofence” AutoSteer to places where a fire truck might park? That could be literally anywhere on any road!

Autopilot (not AutoSteer) is always on in cars equipped with it, so there’s no point in trying to “geofence” that.

I’m now convinced that we won’t see the Model 3 in Europe until they have significantly improved the software.
It would fail the Euro NCAP test in the driver assist category. The test includes emergency braking for a stationary car and emergency braking for a stationary car protruding into the lane.
I don’t think Tesla wants to make it official that their “tech” car can’t do what everyone else can.

The Ampera-E/Bolt passed the Euro tests with flying colors. Only blemish was no seatbelt fastened alert for rear passengers.

That was the old test. The Model S and X passed that as well. The only EV that has done the new 2018 NCAP according to the website is the Nissan Leaf. It also lost points for the rear seatbelt sensor. But it did passably on the driving assists, like braking for stationary cars, crossing pedestrians, and bicycles. All of them, day and night.

It did not pass with flying colors. It got 4 stars, which is basically a fail. And it didn’t score really well in any category.

Only reason it got 4 stars was the previously mentioned lack of rear seatbelt indicators. A very minor issue which can easily be corrected.

And unfortunately for your employer, GM has pulled out of the largest Western auto market in the world.

Largest Western auto market?

Its called Europe.

Correct your statement to Largest Western Auto Market not named USA… 15M vs 17M

Lmao, that will be the USA, with 15 million cars and trucks a year. Real, always bringing nothing to the conversation.

Bolt AEB is slow speed only. It isn’t active AT ALL at 60 mph.

I don’t rely on it, but I have observed my i3 detect a stationary car at an intersection, with the BMW braking successfully.

Mine has as well. But there are other times where it has failed when I thought it would work. Like when there is a curve in the road and the car is not squarely and directly in front of me. When the car ahead stops, the radar isn’t able to pick it up as well because it is off center, and it reads it as a car on the side of the road until the very last second, when you’re close enough but it would be too late.

I even did a test in a drive through line. Wanted to see if it would stop before hitting the guard rail as the drive through line curved sharply around the corner. It didn’t and I had to manually brake.

Your nuanced response is what so many of these idiots don’t get. Just because a system passes a test or demonstrates that it works on a video doesn’t mean it has a 100% success rate.

I’ve come across situations where my car is on cruise control and the car in front of me is stopped many times in my life, probably once every few thousand km. Teslas have driven 6 billion miles, and Autopilot has around half a billion miles under its belt. They would come across this situation tens of thousands of times.

So a couple rear enders make the news, and these clowns think they know Tesla’s AEB is crap compared to others, that they couldn’t pass an NCAP test, that a single video of a Merc is proof to them that their system is superior, that BMW is superior because their i3’s AEB worked once, etc.

Not one single person on this forum has the slightest clue about which manufacturers have better or worse AEB than Tesla.

We need more humble and intelligent opinions like yours.

Going 60 mph on local roads with intersection lights is a red flag for me. I’ll do 55 if it’s a parkway, or 45 if it’s a 4-lane road in a non-residential area.

Why is Autopilot failing on stationary objects? Are ProPilot, Honda Sensing, and Toyota’s system failing as well? Is there a stop-and-go feature on Autopilot?

Most modern systems, like Cadillac’s, Volvo’s, and even the new Subaru AEB system, would easily have detected the stopped object ahead and braked.

It’s another area where Tesla is 5 years behind the curve.

It’s amazing how many times the same wrong info is being repeated in this discussion.

AEB systems may be able to detect a car slowing and stopping, but not one already parked. When the car with AEB is moving at low speed, it may even be able to detect parked cars, although probably not reliably.

No AEB system in any production car, from Cadillac or anybody else, detects parked vehicles when the car is moving at highway speed.

If you disagree, then cite your source.

Check on the NCAP homepage under best in class. They all manage to brake for a stationary object at higher speeds.

No, you don’t know that.

I already know Nissan’s ProPilot can fail to detect on occasion after just 3 months of use. People above know BMW’s system can fail on occasion. You have evidence that other systems are better?

Of course you don’t. You just have a massive ego that drives you to make claims like this.

This is complete non-news. Autopilot is for use on the highway. It’s not meant to be used on city roads where there are traffic lights and such.

Then why does Tesla allow AP to be activated on local roads if it’s only meant for highway use?

So AP can’t even tell if it is actually on a highway? If I can turn it on, it is meant to work.

As is only reasonable for any function! If I step on the accelerator, obviously the car should only move if it’s safe. And if a kid tries to start a car, it shouldn’t turn on, because it’s obviously unsafe. Same if I turn the wheel in a dangerous way.

I’m not sure Tesla is sufficiently clear – always, not only in their disclaimers – about the limitations of their tech. But… your “argument” couldn’t be much weaker if you tried.

Well of course an automated lane-following system cannot tell if it’s on a highway or not! Differentiating between “highway” and “non-highway road” is something that is perceived on a human level of understanding of the environment, vastly beyond the capability of robotic “brains” which are, at best, about as smart as an average insect.

Or you do what Cadillac did and 3D map freeways and the system can only be used on those roads.

And yet there are hundreds of YouTube videos of people showing AP use in these unsupported conditions and people believe they can ignore the mfg warnings thanks to the hype and marketing about “neural networks” and advanced AI and machine learning.

> This is complete non-news. Autopilot is for using on the
> highway. It’s not meant to be used on city roads where there
> are traffic lights and such.

This is very much not true. I use my Tesla Autopilot on local roads all the time. It works exceptionally well!

I find it a little disturbing that I had to get to the end of the article before I found out if the driver was ok or not.

Also I think it’s pretty impressive that a Tesla can drive 60 mph into a stationary object without braking and only get a broken ankle out of it.

“not programmed to stop for stationary objects”.

Looks like the Apologist Force is out in full regalia this morning.

Bad news: the rest of the world thinks this super-smart car with the super-smart CEO should be able to "see" stuff like red lights and large trucks stopped in its path.

Some people have their reasons for crazy driving! Maybe this driver's fail was that she survived?

Tom Dually said:

“[quote] not programmed to stop for stationary objects. [unquote]

Looks like the Apologist Force is out in full regalia this morning.”

Gee, I wasn’t aware that making a factual statement made one an “apologist”. 🙄

Maybe the real story here should be just how many people don’t understand that Tesla Autopilot+AutoSteer is merely an automated lane-following system, and isn’t even remotely close to being able to actually replace a human driver.

Just like an airplane’s autopilot can’t be expected to take off or land a plane, and isn’t to be relied upon in an emergency situation.

Currently in the US there are about 150 deaths for every 10 billion miles traveled; this is the information missing from the news. Although Tesla drivers have been involved in some accidents that have resulted in deaths, Tesla vehicles are still much safer than other vehicles on the road, and although they will never reach zero accidents, Tesla will continue working on improvements.

I believe we are up to three total deaths involving AP, which would mean Tesla would need more than 200 million AP miles to be safer. Per Elon, Tesla surpassed that mark in 2016.

The last time I heard how many miles Tesla vehicles have traveled, it was 7.5 billion miles. Yet vehicles on the road today average 150 deaths for every 10 billion miles traveled. So tell me which vehicle is safer.

You're correct, but how many of those accidents involve vehicles put into service after 2012? That's the real data point to see if Teslas are safer compared to like cars. Sure, an '80s, '90s, or early-2000s car is less safe.

You need to know whether Autopilot, or at least Tesla's intelligent Cruise Control, was engaged or not. Totally different story if not. I've driven a beloved Tesla Model X 100D since last September and use the Cruise Control almost all the time, and it has never failed to timely and smoothly decelerate and stop when such a vehicle was stopped ahead of it. Sounds like the driver had neither engaged, and was not driving in any way she should have. Why make such headlines without decent facts??? This is just not cool.

Plus, what model year was this Tesla Model S? Was AP1 or AP2 equipment installed on it? Did anyone even think to ask that question? The S 60, S 85, and S P85 were all out before such equipment was installed. Which version, if any, would this car have had?

The word “Autopilot” should not be in this discussion at all! Nor AutoSteer, which is the real question, since you can’t turn Autopilot off.

We don’t know whether or not the car had AutoSteer engaged, so 90% or more of the comments here are entirely off-topic.

Apparently Teslas cannot see red…

Nor signs, which is something Waymo can do.

Or, even license plates:
TESLA AI Algorithm “oooh, cool, I spy with my Musky Eye a license plate. Ooooh, fun, it’s getting bigger. And even bigger…. Wow, that thing’s YUUGGGE! ouch!”

“Nor [traffic] signs, which is something Waymo can do.”

You’re making an apples-to-oranges comparison.

It’s true that the semi-self driving system that Tesla puts in its production cars isn’t as advanced as what Waymo puts in its test cars, but we know that Tesla has test cars with more advanced self-driving systems.

However, on the plus side, Autopilot+AutoSteer is quite reliable at reading and following speed limit signs.

IMO, for AVs to really succeed, we need vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication. If the fire truck had been at an accident scene or fire, it could be broadcasting to the network to reroute traffic or to slow the fvck down. If the traffic lights had comms, a connected car would know to stop and not be reliant on 20-something programmers playing Sims with human lives.
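For what it's worth, the idea above is easy to sketch. This is purely illustrative: every name and field here is invented, and real V2V deployments use standardized message sets (e.g. SAE J2735 over DSRC or C-V2X), not ad hoc structures like this:

```python
from dataclasses import dataclass

@dataclass
class HazardBroadcast:
    """Hypothetical emergency-vehicle broadcast; field names are made up."""
    sender_id: str   # e.g. "UFA-Engine-21"
    lat: float
    lon: float
    advisory: str    # "SLOW", "STOP", or "REROUTE"
    radius_m: float  # how close a receiver must be before it should react

def should_react(msg: HazardBroadcast, distance_m: float) -> bool:
    # A connected car inside the advisory radius honors the broadcast
    # instead of relying solely on its own sensors.
    return distance_m <= msg.radius_m and msg.advisory in ("SLOW", "STOP")

msg = HazardBroadcast("UFA-Engine-21", 40.56, -111.93, "STOP", 300.0)
```

The point is the architecture, not the code: the stopped fire truck announces itself, so the car doesn't have to perceive it at all.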

Well said!

We can expect future vehicle-to-vehicle communications designed to give priority to emergency vehicles, and to allow them to broadcast commands to nearby self-driving vehicles to slow down and/or stop and/or steer around them.

“programmed not to stop for stationary objects”

Sounds like bull, sorry. In my experiments with EAP on the expressways here in San Jose I have found an effect I’ll call “wet pants syndrome”, where the approach velocity is high and I hit the brakes because of that. The rule is simple: A good driver slows down before reaching stopped or slow traffic.

I highly suspect that the real answer is that it rejects high closing velocities in the adaptive cruise control algorithm. It isn’t, after all, an emergency braking system.

There is at least one manufacturer advertising EAB. I suspect Tesla is thinking about it, but it opens up a new can of worms. A rear ender caused by a false EAB would certainly end up being Tesla’s fault.

Amazing how many people are rejecting reality here.

AEB systems, including Tesla’s, depend on Doppler radar for detecting moving objects, and only moving objects. With the current state of the art of AEB systems, there is no attempt to locate the far more numerous stationary objects around the car.

“Doppler radars operate by transmitting a fixed frequency and looking for a change in that frequency caused by a moving object. If the object is not moving, there will be no frequency change, and the object will not be detected.”
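One caveat: that description applies to a stationary radar. On a moving car, what matters is relative velocity, so even a parked truck produces a Doppler shift. The numbers are easy to work out; this sketch assumes a 77 GHz carrier, which is typical for automotive radar (Tesla's actual parameters aren't public):

```python
# Monostatic Doppler shift: f_d = 2 * v_rel * f0 / c.
# What matters is RELATIVE velocity: a parked truck ahead of a car doing
# 60 mph still produces a shift, because the radar itself is moving.
C = 299_792_458.0   # speed of light, m/s
F0 = 77e9           # typical automotive radar carrier frequency, Hz

def doppler_shift_hz(relative_speed_mps: float) -> float:
    return 2.0 * relative_speed_mps * F0 / C

# 60 mph is about 26.8 m/s of closing speed on a stopped truck:
shift = doppler_shift_hz(26.8)  # on the order of 14 kHz
```

So the radar physics can see the stopped truck; the question is what the tracking software does with returns like that, which it can't distinguish from roadside clutter.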


Yes, so the Doppler radar didn't detect a MOVING object because this object was stationary. But the car was moving relative to the fire truck, even though the fire truck was stopped.

Congratulations, you have just repealed the theory of relativity. Time for a refund on that college degree, dude.

Why is autobraking not always on?
Why should it be possible to inhibit it?

As not all Teslas ever had AEB or AP equipment installed, we first need to consider that. Then, the driver: what the fvck was she doing driving at 60 mph on streets with traffic lights, and not paying attention, in the first place!!!

Doesn’t the Tesla come with AEB like Mercedes?

Interestingly if you google “vehicle runs into firetruck” and look at all the non-Tesla crashes, the make and model of the vehicle is never mentioned (in any of the half dozen I randomly looked at). Only Tesla gets mentioned by name! And there are dozens of stories on every Tesla crash. The others are typically only local newspapers or TV stations.

The driver was careless and not paying attention when the Tesla saw red!… lmao… Wake up and drive!

> It comes as no surprise that a Tesla Model S ran into another parked fire
> truck since the vehicles are not programmed to stop for stationary objects.

No idea what on earth you are talking about. Tesla Autopilot stops when there are cars standing still in the car's lane. I use this all the time. When I approach a red light and there are cars in front, I don't touch the brakes; Autopilot stops the car on its own. I have noticed that it has trouble identifying oddly shaped vehicles, such as landscaping trailers, etc.

Maybe it is not very reliable. So stuff like ramming a big red truck can happen…

No, your Tesla car detects the car in front of it slowing to a stop. That's a velocity change, which the Tesla's AEB system is designed to detect using Doppler radar, and which Autopilot is designed to respond to.

Your Tesla car can not detect a vehicle parked in its lane. Parked vehicles do not change velocity, and just like any other stationary object, are not detected by the Doppler radar.
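To make the claim above concrete: the radar hardware does return stationary objects (they close at exactly the ego speed), but ACC-style tracking software commonly discards anything whose estimated ground speed is near zero, since that bucket is dominated by signs, bridges, and parked cars. A minimal sketch of that filtering logic follows; the names and the threshold are invented, and this is only an illustration of the general technique, not Tesla's actual algorithm:

```python
def filter_stationary(targets, ego_speed_mps, clutter_threshold_mps=2.0):
    """Keep only radar targets that are moving over the ground.

    Each target is (closing_speed_mps, distance_m), as a radar might report.
    A stationary object closes at exactly the ego speed, so its estimated
    ground speed is ~0 and it gets dropped -- which is why a stopped fire
    truck can be invisible to this kind of logic.
    """
    kept = []
    for closing_speed, distance in targets:
        ground_speed = ego_speed_mps - closing_speed  # target speed over ground
        if abs(ground_speed) > clutter_threshold_mps:
            kept.append((closing_speed, distance))
    return kept

# Ego car at 26.8 m/s (60 mph):
targets = [(26.8, 120.0),  # stopped fire truck: ground speed ~0, filtered out
           (6.8, 40.0)]    # car ahead doing 20 m/s: kept
```

The trade-off is deliberate: without the filter, every overpass and roadside sign would trigger phantom braking.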

It’s amazing (and depressing) how many people fail to understand this. If you are trusting your life to a semi-self-driving system, then you really ought to learn what its limitations are!

Why are Tesla's accident prevention systems not working? Was it all hype? Automatic braking should be able to see anything slowly moving or stationary, especially given how many cameras and sensors Teslas have. Of course the systems can't overcome physics; slowing a car's mass down will take some distance, but no braking at all is really bad. And as for the driver, if she was playing with her phone, clearly we need a strict ban on phone use behind the wheel. People rarely have enough self-control to realize that they don't have to immediately answer calls or reply to messages. To me, especially being a car enthusiast, nothing is more important than driving when I'm behind the wheel. I take calls via hands-free Bluetooth only; no texts, no internet, no posting.

Next you are going to tell me that making Margaritas using my Tesblenda in the front seat is a no no. Geez…

“Automatic braking should be able to see anything slowly moving or stationary, especially given how many cameras and sensors Teslas have.”

You — and, sadly, apparently about 90% of the people commenting here — have a vastly inflated idea of just how much a Tesla car with Autopilot+Autosteer can “see”.

Perhaps this will help you understand how very little info a Tesla car gets from its radars:


As for all those cameras: Those are nice for seeing and “reading” very specific types of objects such as speed limit signs, but the software used to recognize objects from video camera images is much too limited, too unreliable, and too slow to identify and place every object in the environment, or even most of them.

All I’m reading here is that a Tesla Model S crashed into a massive steel object at 60 mph and the driver came out with a broken ankle… Please let me drive a car like that!

Most modern, full size cars will let you walk away with just a broken ankle…or less.

Come on, InsideEVs!! Either report all plug-in wrecks by manufacturer or none!!

Tesla engineers wanted to add extra sensors to track eye movement of drivers and/or more sensitive sensors in the steering wheel. Elon nixed both suggestions, saying it would cut into profit margins too much and/or be too “annoying” to owners.

I bet people such as Joshua Brown and Walter Huang would have appreciated Tesla being more “annoying” or willing to sacrifice a few dollars of profit to increase driver safety. Sad! Profit margins >>> owner safety. Nothing new there.
“It came down to cost, and Elon was confident we wouldn’t need it,” one of those people said. Executives conveyed there was pressure for each vehicle to reach a certain profit margin, according to the people familiar with the matter.

You have to grip the wheel fairly firmly, and only on the top, not the bottom, although it will allow single-handed driving (determined empirically). The follow-your-eyes thing is just a joke, please. Why not hook the driver up to a lie detector while you're at it?

GM doesn’t offer SuperCruise in the Bolt EV – for a similar reason. Vehicle cost control. The Bolt doesn’t even offer ACC as an option.

Last time I checked the Model S was not a cheap car.


Your post reads like a Trump tweet – except without the spelling errors. 😊

If MadBro posts something about Tesla, we can be sure it’s B.S. No doubt about that!

It would be stupid to put sensors on the steering wheel to ensure someone kept his hands on the wheel at all times. The whole point of AutoSteer is to be able to ride along without having to steer. Tesla has released a video of one of their advanced self-driving test cars where the “driver” has his hands in his lap for the entire drive, while apparently remaining attentive. Presumably that’s what Tesla is working toward for AutoSteer.

I do hope Tesla will add an eye-tracking system to make sure drivers pay attention to the road. I rather suspect that before long, all semi-self-driving cars will have that.

Manufacturers using "Pilot" to describe their 'autonomous' & assisted driving:
"Piloted Driving" (Audi)
"Speed Limit Pilot" sub-function (Mercedes-Benz)
"ProPilot" (Nissan)
"Pilot Assist" (Volvo)
"Auto-Pilot" (GM/General Motors, years ago)
"Auto-Pilot" (Chrysler/Imperial, years ago)
"Autopilot" (Tesla, currently)
"Piloted Drive" (Nissan)
"Co-Pilot" (BMW)
"Co-Pilot360" (Ford)

Too many ‘devil may care’ drivers out there.

What is astonishing is that anyone thinks the car can “see” and “understand” the world around it, if it systematically fails the simplest test – a huge, stationary object. I would think owners would test the system a little bit, while paying attention acutely and hovering a foot above the brake pedal, but apparently they don’t.

As far as I can tell, AP works mainly based on GPS, high-resolution but potentially outdated maps (when something has changed, it's no good that the map was up to date an hour ago), and recognition of lane markings.

But this sort of accident really is more funny than anything else when there are no injuries except to the owner’s wallet.

Sorry, I have seen no evidence that the EAP uses GPS at all (at least on the Model 3). It follows lane markings, as evidenced by the fact that it will in fact cross to the wrong side of a lane if you, say, follow a curved road that has a left turn lane. The speed adaptation is done by judging distance to the car in front, and in fact works without AutoSteer engaged at all.

There was a lot of discussion on Tesla boards about using the GPS to move over and take exits, but I have tried that a few times, and (at least on the Model 3) it makes no difference whatever if an exit the GPS is telling you about is coming up.

My standard (so far) with the EAP is to keep at least a hand on the wheel (required) and a foot over the brake (strangely, not required). The foot position is actually really tiring, and I suspect that is part of the problem. Again, there has been a lot of BS in this thread (not you, the whole thing). I believe the adaptive cruise control excludes objects based on closing velocity, since I…

At least when the Tesla catches fire the fire truck won’t be far away!

(This is not an attempt to make a statement about Tesla or AP. It’s called a joke.)

The car that just hit us is on fire! Lets put it out! Bummer that it broke the water pump, and in any case we aren’t hooked to a hydrant and cannot move the truck…

No, it’s not a joke. It’s a “concern troll” post pretending to be a joke.

The GM/Cadillac SuperCruise systems doesn’t get the love or attention the Tesla AP does. But, it has a solid system to verify that the driver is paying attention. And, it works great on pre-surveyed highways.

Tesla AP sounds cool because it should work anywhere without pre-surveying, but the truth is that it does not.

Right, and the manual very clearly states it should not be used in situations where this person engaged it. But there is so much fan-created fiction and marketing about advanced AI, neural networks, and self-driving that people don't understand the risks they are taking.

The difference is Super Cruise will not let you engage it in areas outside of where it's been designed for.

At least it didn’t catch on fire this time… If it did, at least the fire truck is already on the scene… LOL.

(dodging hot lava rocks thrown by Tesla fan bois)…

Where’s that Boring Co. flamethrower when we need it? 😉

So under what parameters does the automatic emergency braking work? Is there a speed threshold above which it doesn't work? Does it always exclude non-moving objects?

Emergency braking only works, when the Autopilot sensor doesn’t get scrambled by the indiscriminately random spots of the Dalmatian, riding in the back of the Fire Truck!

Not sure about the Tesla system, but AEB, as I understand it, is designed only to brake when a crash is clearly imminent. It is not meant to avoid a crash, but to reduce its severity.

Well, I learned something here. I thought automatic emergency braking would try to stop a car in almost all cases, but I guess it makes sense that it would not in some cases, because if it's wrong more than it's right, the result might be the car getting rear-ended after a false stop. I have not researched how these systems work. Autopilot may not have been engaged here, but I really think Autopilot is a bad name that should never have been used.

I wonder how common this exact type of accident (i.e., a personal vehicle rear ending a parked fire truck) is among all vehicles. Seems like it would not be that common.

Autopilot was engaged in this accident, news just broke, along with another Tesla fatality with fire in Switzerland… 🙁

The guy was from Tettnang, burned in Tessin in a Tesla

That’s not a 60 mph impact. 45, maybe 50.

A thoroughfare that introduces traffic lights into a stream of traffic is the real problem here. Or were they speeding?

You got to be kidding:

“ since the vehicles are not programmed to stop for stationary objects.”

Isn’t that exactly what you want from a self driven car? Stop for stationary objects, rather than slamming into them?

The car is moving relative to the stationary object, so there is relative movement, and it should be easily detectable.

The larger story is whether owners are abusing or misusing the hardware. It’s been widely acknowledged that AEB and ACC have trouble identifying the taillights of uncommon trucks. The truck wasn’t on the side of the road. It came to a stop at a light.

“It’s been widely acknowledged that AEB and ACC have trouble identifying the taillights of uncommon trucks.”

Then it is ABSOLUTELY FALSE ADVERTISING to say that it is a high-speed AEB system.

Fire truck likely saved lives being there. Otherwise the Tesla driver would have blown the red light at 60 mph and likely caused a multiple car collision with cross traffic.

Driver. Was a fool.

Utah has stoplights on roads posted for 60 mph?

Probably posted 55. Lots of those roads here in CO, 55 or 65 with stop lights. The road right in front of my work is 55, six lanes with stop lights. Traffic often goes 10+ over the speed limit.

It’s pretty easy to discover via Google maps that the posted speed limit is 60.

Very lucky she didn’t go to jail forever for vehicular manslaughter if a firefighter was at the rear there. Losing her Tesla seems like a good start.

No other carmaker would be so careless as to name their driver assistant Autopilot. People think it will drive their car like autopilot flies the planes they travel in. They deserve to be sued back to the Stone Age. A jury of common people would call them liars.

Another deadly Tesla accident just reported in Switzerland. Speeding driver hit middle divider, car flipped over and burst into flames:


Tesla needs to be very careful. If they can tell that a driver doesn't have their hands on the wheel for over a minute, but take no action to get their hands back on the wheel, they will have some lawsuits on their hands. I know they have some warnings, etc., but they need to make those far more frequent if they truly want people to keep their hands on the wheel. The fact that they can say she had her hands off the wheel for 80 seconds while driving at 60+ mph, yet have no functionality to prevent that, seems like a litigious situation. No reason to have your hands off the wheel for more than 10 seconds, and warnings should be very aggressive from about 10 to 15 seconds, no more than that.

+1. I can’t imagine it would be that difficult for the system to just refuse to engage if such actions (or lack thereof) were repeated. If the system continues to ask for driver intervention and the driver doesn’t intervene or tries to “beat the system” by jerking the wheel and going hands-free repeatedly, Autopilot should then timeout.

On the screen appears, "You get nothing, you lose, good day sir/ma'am."

Should be more like Super Cruise, which if it detects that your eyes are not forward and paying attention, after a few warnings will pull the car over to the side of the road and flash the hazards.

M3 Owned- Spark Leased - Niro EV TBD

Stupid human tricks. -SMH.