Tesla Driver Rear Ends Fire Truck: Blames Autopilot, Suspected Of DUI

Tesla Autopilot crash


The crash resulted in two injuries. No firefighters were hurt.

As is often the case in these accidents, the Tesla driver is claiming autopilot is to blame.

This is now the third accident involving a fire truck and a Tesla this year alone. Back in January, a Tesla Model S hit a fire truck in Culver City, followed by a similar incident in San Jose just a few months later. This week, a Tesla Model S crashed into yet another fire truck in San Jose, leaving two injured and a driver in custody on alleged DUI charges. All of the accidents in which a Tesla hit a fire truck – including this one – reportedly occurred while the vehicle was on Autopilot. Remember, however, automatic braking systems are not designed to stop for stationary objects. According to Tesla:

Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.

This particular accident happened around 1 AM Saturday on Southbound Highway 101 near Coyote Creek. San Jose firefighters were on the scene responding to an accident when a black Tesla rear-ended one of the fire trucks. The crash sent the car's two occupants to the hospital; we haven't yet received an update on their condition. Fortunately, the firefighters were left uninjured.

The firefighters took to Twitter to vocalize their concerns about a Tesla rear-ending one of their fire engines again:

Even though the driver claimed that he “thought the Model S was on Autopilot,” he was later arrested under suspicion of drunk driving and taken into custody. According to a now-deleted article by NBC, the police report notes that Tran (the Tesla driver) blamed Autopilot when talking to the police officers.

“Tran told officers “I think I had auto-pilot on,” but it was unclear whether the Tesla was in self-driving mode when it crashed into the firetruck. The collision sent Tran and his passenger, 26-year-old Yorleyda Londono of Monterey, to San Jose Region Medical Center with minor injuries.”

When Electrek contacted Tesla about the accident, the company responded with the following statement:

“Tesla has not yet received any data from the car, but we are working to establish the facts of the incident.”

A lot remains unknown about the incident in question. Furthermore, the DUI charge may lead the investigation in an entirely different direction from the driver’s Autopilot claims.

Source: Electrek | ABC

Categories: Crashed EVs, Tesla


69 Comments on "Tesla Driver Rear Ends Fire Truck: Blames Autopilot, Suspected Of DUI"


I wonder how many other cars with automated lane-keeping (such as Cadillac’s Super Cruise) also rear-end stationary emergency responder vehicles, but no reporter bothers to report it as “news” because it’s not a Tesla car?

I really wish that InsideEVs would — at least once! — run an article about the real reason that cars with automatic braking systems (Teslae or not) run into stationary obstacles… such as parked fire trucks. (In fact, I wish IEVs would run such articles regularly. But at least once would be nice!)

Gentle reader, you can read the truth behind this subject at Ars Technica:

“Why emergency braking systems sometimes hit parked cars and lane dividers”


We did run that. We’ve talked about it on multiple occasions. The systems are not designed to stop for stationary objects.



“The systems are not designed to stop for stationary objects.”

Hmmm. Maybe Tesla’s system isn’t designed to stop, but some certainly are: https://m.youtube.com/watch?v=txG5b5h90zE

Yes, there are certainly exceptions, especially for vehicles like the one in the video. But, even Volvo’s systems, which are supposedly some of the best in the business, offer almost the same type of disclaimer as Autopilot.

Interesting. I heard way back (10 years ago?) of a luxury car company that would have their salesmen drive a model right at the wall of the dealership to show it would stop automatically. But that’s anecdotal and I don’t have a reference. Made me think the problem was fairly solvable. A car should seemingly be able to identify a stationary object when it’s in the lane you’re in.

The right system shouldn’t have a problem distinguishing stationary objects in your lane from other stationary objects. Disclaimer: I’ve been an engineer working on defense radars for over a decade and am fairly familiar with their tech. Maybe cost causes the car radars to be a bit too simplistic.

What about LIDAR? I’d like to see some independent testing comparing the results of each technology and combinations of them.

Or instead of a complete LIDAR system, how about just 2 laser sensors up front? They cost $200 ea. If you see data coming back from the sensor showing an imminent crash, hit the brakes.

Yeah, it seems like a laser approach should be pretty easy. You can get distance finders at the store on the cheap. Guessing it’s probably the logistics of hills and all that fun stuff. Still, I would think this is an easily solved problem, especially since some vehicles have the capability.

I also wonder about Volvo, for example. Is their disclaimer just a legal catch all, or can they truly not handle this type of situation either?

Solid state lidar systems have dropped drastically in price. Some prototype self-driving cars use a series of 5 solid state lidar units to provide 360° coverage around the car. I expect to see this type of system in production cars before long.

I expect that even Tesla will give in and start using either a series of solid state lidar detectors, or else high-res radar arrays. Currently Tesla’s ABS depends on low-res Doppler radar, just like other auto makers. That’s cheap, but it ain’t adequate for fully self-driving cars… nor is it adequate for detecting fire trucks parked in a traffic lane on the highway!

But the sensors aren’t the only limitation. The self-driving car’s software needs to be able to perform SLAM: building a virtual 3D “map” of the area around the car in real time. No auto maker is putting such a system into production cars yet. Neither the hardware nor the software are up to the job.

“I heard way back (10 years ago?) of a luxury car company that would have their salesmen drive a model right at the wall of the dealership to show it would stop automatically.”

I would guess that’s anecdotal, because I seriously doubt that’s dependable enough for a dealership to risk wrecking an expensive car. But for the sake of argument let’s say it’s not.

From what I’ve read, some ABS systems will at least sometimes stop for large stationary obstacles when traveling at low speed. The problem here is that they start ignoring such obstacles when traveling faster than (from what I’ve read) about 35 MPH, because of the aforementioned “false positive” problem.

I won’t pretend to understand just why such systems wouldn’t have the same problem at low speeds, but perhaps it’s due to the shorter reaction time available when the car is traveling at highway speed. Computers can in theory react instantly, but when running a complex computer program with (I presume) millions of lines of code, even a computer-controlled machine may take some time to react properly.
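The speed dependence described above can be made concrete with a rough stopping-distance estimate. This is a back-of-the-envelope sketch; the friction coefficient and reaction delay are illustrative assumptions, not figures from any manufacturer:

```python
G = 9.81   # gravity, m/s^2
MU = 0.7   # assumed tire-road friction coefficient (dry asphalt)

def stopping_distance_m(speed_mph, reaction_s=1.0):
    """Reaction distance plus braking distance at a given speed.

    reaction_s covers the total detect-decide-actuate delay; the
    1-second value is an assumption for illustration only.
    """
    v = speed_mph * 0.44704         # mph -> m/s
    reaction = v * reaction_s       # distance covered before braking begins
    braking = v * v / (2 * MU * G)  # from v^2 = 2*a*d with a = mu*g
    return reaction + braking

# Roughly 33 m at 35 mph versus roughly 91 m at 65 mph: a late
# detection leaves far less margin at highway speed.
print(stopping_distance_m(35), stopping_distance_m(65))
```

The braking term grows with the square of speed, which is one plausible reason a system tuned to avoid false positives might only act on stationary obstacles at low speed.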


Some manufacturers have figured it out. Systems that are not designed to stop for stationary objects will never get the highest safety rating in Euro NCAP.

Subaru Eyesight does appear to stop for stationary objects in YouTube videos.

At what speed? At highway speed, as happens for all these reports about cars running into parked fire trucks? Or is the detection limited to low speeds?

Several auto makers’ ABS systems work (at least sometimes) for detecting stationary obstacles at low speed. That won’t solve the problem reported here.

Thank you for the reply, Steven.

The first linked article there does delve a bit into the technological limitations involved, altho not nearly as deeply as a few other articles have; articles such as the Ars Technica article I linked to. It also mentions that it’s ABS (automatic braking systems) in general which have this fault, not just Tesla’s.

The second of your linked articles doesn’t go into the actual cause at all. Rather, it incorrectly suggests the problem is a car which is “suddenly” revealed when an intervening car “gets out of the way”.

However, Steven, IEVs has run many articles on the subject of Tesla cars running into stationary obstacles, without ever mentioning just why this keeps happening. May I suggest, in what I hope is a polite fashion, that IEVs make it a policy for future articles on the subject to at least briefly mention the actual cause, with a link to an article which goes into more detail. Something like “In general, because of the ‘false positive’ detection problem, ABS systems (including Tesla’s) are not designed to react to stationary obstacles when the car is traveling at highway speed. Stationary obstacles including vehicles parked in a lane of traffic.”

My original article about the first crash carried the subhead, which also said that these vehicles are not programmed to stop for stationary objects. The other that I wrote compared Volvo’s statement saying the same things and talked about how it’s not just Tesla, but most cars. We continue to mention why it happens in our many articles about Tesla vehicles running into stationary objects and will be diligent to do so in the future.

“It comes as no surprise that a Tesla Model S ran into another stopped fire truck since the vehicles are not programmed to stop for stationary objects.”

“It’s not just Tesla Autopilot that fails to see stationary objects.” According to Wired, Volvo’s Pilot Assist system is much the same. The vehicle’s manual explains that not only will the car fail to brake for a sudden stationary object, it may actually race toward it to regain its set speed: “Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed. The driver must then intervene and apply the brakes.”

“It’s not just Tesla Autopilot that fails to see stationary objects. In fact, automakers’ systems are programmed this way on purpose.”

https://insideevs.com/tesla-model-s-rear-ends-another-parked-fire-truck/

It’s something…

Did you actually read the link I posted?

Your first linked article does go into the actual problem a bit, altho certainly not deeply enough for someone to have a good grasp of the limitations involved. Your second linked article doesn’t address the real problem at all; it’s just a distraction from the actual problem.

I added the link to the first paragraph of the post and also included the quote.

There’s a good chance Cadillac’s Super Cruise wouldn’t allow this because you have to keep your eyes on the road.

The Cadillac CT6, Super Cruise or not, did not run into parked objects in testing, but in the real world, who knows what will happen… Two lessons here: don’t drink and drive… and if using automation, do not count on it too much.

“The Cadillac CT6 Supercruise or not did not run into parked objects in testing…”

At what speed? Various auto makers have installed ABS systems which will sometimes brake for large solid obstacles (like brick walls) at low speed. But that detection gets shut off above ~35 MPH. That system wouldn’t help at all with cars — Cadillac, Tesla, or anybody else’s — running into fire trucks parked on the highway.

So far I have not heard of any Supercruise accident, but Supercruise is a much smaller sample size.

@David Green
A map-based system like SuperCruise might stop the car before hitting a fire truck stopped on a highway, since it should eliminate almost all of the false positives that non-map-based systems would encounter. Since it relies on maps, Supercruise would know the car’s path ahead of time, and know there shouldn’t be a stationary object in the lane ahead, or know the tree straight ahead is not in the car’s path because it knows there is a curve in the road ahead before the tree.

You’re suggesting that Cadillac Super Cruise is a fully developed Level 4/5 autonomous driving system.

We know that’s not true. In fact, Cadillac disables Super Cruise anywhere except on limited access highways… unlike Tesla’s AutoSteer.

I find it funny you would imply that a site which runs Evannex articles has any anti-Tesla slant.

heck this article read to me like the driver being under the influence should somehow take autopilot or tesla off the hook.

I see no such implication.

I don’t see that TheFlew implied any such thing. Perhaps the problem is in your inference, not his implication.

theflew who? my post was directed @you.

The reason this Tesla rear-ended the fire truck was that the driver wasn’t paying attention. So, this is not the car’s fault.

If I understood the article correctly, he was suspected of DUI and blamed the “Autopilot” for the crash. I am 120% sure Tesla tells the “Autopilot” users that they can take their hands off the controls, but must pay attention and be ready to engage in driving, when the “Autopilot” indicates a hazardous situation.

Yes. Always blame AutoPilot!


The reason is simple- the driver wasn’t paying attention. Has nothing to do with the cars systems.

It’s strange to me that my Model S will brake to a full stop when it’s in AP if the car in front of it is moving, but somehow can’t handle stopping (or even slowing) when it approaches a completely stopped vehicle? I was driving the other day in AP going 50 mph and rolled up on a stopped vehicle and had to take it out of AP and manually brake. It doesn’t make much sense to me, if anyone else out there can shed some light I’d appreciate it.

Thanks Steven!

Anytime, John.

It’s due to how most radar based ACC works. It is typically set to ignore stationary objects (as those can be a parked car, road sign, or other stationary objects on the side of the road that should be ignored). This is to prevent annoying false positives where the system unnecessarily slows or brakes when it shouldn’t. Basically it expects driver to be alert and always be ready to brake if necessary.

You have to think of ACC as a system that follows and tracks a target and tries to maintain a certain distance from that target. It is not a system that brakes for everything.
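The follow-a-target behavior described in the two comments above can be sketched in a few lines. This is a toy model of radar-ACC target selection, not any automaker's actual implementation; the speed threshold, field names, and "in path" flag are all illustrative assumptions:

```python
from dataclasses import dataclass

EGO_SPEED = 22.0  # our car's speed in m/s (~50 mph), assumed for the example

@dataclass
class RadarReturn:
    range_m: float    # distance to the detection, meters
    rel_speed: float  # radial speed relative to our car (negative = closing)
    in_path: bool     # crude "is it in my lane" flag

def pick_acc_target(returns):
    """Toy version of the target-selection step: drop any return whose
    speed over the ground is ~zero (signs, parked cars, roadside
    clutter), then track the nearest remaining in-path vehicle."""
    candidates = []
    for r in returns:
        ground_speed = EGO_SPEED + r.rel_speed
        if abs(ground_speed) < 1.0:   # stationary -> ignored by design
            continue
        if r.in_path:
            candidates.append(r)
    return min(candidates, key=lambda r: r.range_m) if candidates else None

# A fire truck parked dead ahead closes at exactly our ego speed, so the
# stationary filter discards it, while a slower-moving lead car is tracked:
truck = RadarReturn(range_m=60.0, rel_speed=-22.0, in_path=True)
lead = RadarReturn(range_m=40.0, rel_speed=-5.0, in_path=True)
assert pick_acc_target([truck, lead]) is lead
assert pick_acc_target([truck]) is None
```

Note that the parked truck is filtered out even though it is directly in the car's path, which is exactly the failure mode the article describes.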

That makes a lot of sense. A stationary object on the side of the road can be mere feet from an object directly in front of you, so from a distance it would make sense that AP can’t tell the difference in time to react. Thanks!

But why can’t it determine a stationary object in its path is a threat? Seems like a really basic thing to get down in order to achieve even partial autonomy.

(⌐■_■) Trollnonymous

Same reason the UBER car didn’t stop for the moving pedestrian running/walking their bike across the street…

They’re not ready for prime time. AP is not worth the money.

This is how adaptive cruise control works. Same for my Skoda. It detects vehicles that slow down or stop, but not parked ones. Only in the last few meters would it trigger the emergency brake when it detects the driver braking.

Exactly! Thank you. It gets a bit tiresome seeing all these articles and comments which state or at least imply that it’s only the ABS system in Tesla cars which fails to react to fire trucks parked on the highway.

The reason the same thing isn’t being reported for non-Tesla cars isn’t because other cars will brake for parked vehicles. The reason the same thing isn’t reported is because such accidents are too commonplace to be reported as news unless Tesla cars are involved.

Precisely, and this is what we keep saying over and over. These systems don’t detect stationary objects by design. Just like most other automakers’ systems, including Volvo, which is supposedly one of the best. I can’t find an article on our website about this type of situation that implies it’s only a Tesla issue. If you find one, let me know and I can update it.

There are at least two things which might be the cause of that:

1. The difference between an ABS system (sometimes) detecting large stationary obstacles when the car is moving at 35 MPH or slower, but disabling that detection at higher speeds.

2. If AP detects a car in front of you slowing to a stop, then that’s a change in speed which its Doppler radar will detect. (Differences or changes in speed are the only things Doppler radar detects.) Stationary obstacles don’t change speed, and therefore are not detected by Doppler radar.
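To make the Doppler point concrete: the two-way Doppler shift of a radar return depends only on the closing speed, f_d = 2·v·f_c/c. The sketch below uses an assumed 77 GHz automotive carrier frequency to show why a parked fire truck in your lane and a road sign on the shoulder look identical to a Doppler-only filter, since both close at exactly the ego speed:

```python
C = 3.0e8    # speed of light, m/s
F_C = 77e9   # typical automotive radar carrier frequency, Hz (assumed)

def doppler_shift_hz(closing_speed_ms):
    """Two-way Doppler shift for a target closing at the given speed."""
    return 2 * closing_speed_ms * F_C / C

ego = 25.0  # m/s, our own speed

# Both stationary objects close at exactly the ego speed, so their
# Doppler shifts are indistinguishable; the shift alone cannot say
# which one is actually in the car's path.
truck_shift = doppler_shift_hz(ego)  # parked fire truck, in lane
sign_shift = doppler_shift_hz(ego)   # road sign, on the shoulder
assert truck_shift == sign_shift
```

Separating the two requires angular resolution (or a map, or another sensor), which is where the cheap low-res radar the comment mentions falls short.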

It’s kind of iffy relying on ACC in my i3 to stop for stationary cars at an intersection. Sometimes it reacts, sometimes it doesn’t. Here in LA, the car in front of you will often move to the left, suddenly exposing a parked car on the right curb. Calling these systems autopilot might have deadly results.

Might? I think it has. Repeatedly. I mean this system is so good, a Tesla will drive itself across the country in a few months. Right? Maybe Tesla should somehow tell its dumbass AI what a fire truck looks like and to at least not hit those. Jeez… as if being a first responder wasn’t hazardous enough! Now they have to worry about the random Tesla missile being fired at them!

I thought the Tesla fleet was gathering data from driving on AutoPilot and learning from that data how to make AutoPilot better/safer. I guess that was just Tesla/Elon marketing bull💩.

Maybe bull crap, I don’t know, but I do know the Autopilot is a very, very slow learner. I’ll help- Fire truck bad! Very, very bad! Don’t hit fire truck! Steer away from the fire truck…

The fleet *is* learning. Trust me, no Tesla will ever hit that particular fire truck in that particular spot again!

If certain Tesla or other car owners who make use of advanced cruise control systems can’t distinguish between marketing wisecracks and the real capabilities of the “Autopilot”, the joke is on them – and it’s not funny when people get hurt.

False advertising is never good.

The article mentions three incidents this year in CA. There was also one in Utah. It was a fire/rescue truck, but I think it was just sitting at a stoplight and not stopped to provide emergency services.

Depending on radar solely is a big mistake. Radar has low resolution and cannot distinguish a small (piece of tinfoil) “bright” object from a large, not very reflective object like wood. In the case of Firetrucks, they are very boxy and when parked on an angle, the radar waves are deflected off to the side and not straight back to the radar sensors.

The solution is to add capability via additional sensors, and the one most commonly cited is LIDAR. However, I believe binocular vision via cameras would be significantly superior, as cameras have significantly higher resolution than LIDAR, and with pairs of cameras, triangulation to matched objects can be used to determine their distance.
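The triangulation the comment refers to is classic stereo depth-from-disparity: Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the pixel disparity of a matched feature. A minimal sketch, with assumed (not real-world) camera parameters:

```python
FOCAL_PX = 1000.0   # focal length in pixels (assumed camera intrinsics)
BASELINE_M = 0.3    # distance between the two cameras, meters (assumed)

def depth_from_disparity(disparity_px):
    """Stereo triangulation: Z = f * B / d.

    The hard part in practice is not this formula but reliably
    matching the same feature between the two images.
    """
    if disparity_px <= 0:
        raise ValueError("feature not matched or at infinity")
    return FOCAL_PX * BASELINE_M / disparity_px

# A feature seen 10 px apart between the two images is ~30 m away;
# a 6 px disparity puts it at ~50 m. Depth resolution degrades
# quickly as disparity shrinks, i.e. for distant objects.
assert abs(depth_from_disparity(10) - 30.0) < 1e-6
assert abs(depth_from_disparity(6) - 50.0) < 1e-6
```

Note how depth scales with 1/d: at long range a one-pixel matching error translates into many meters of depth error, which is part of the ongoing camera-vs-lidar argument in the replies below.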

Again, you are overstating the value of binocular vision. With a single moving camera, depth perception can be done just fine by simply comparing different frames. It’s a pure software problem.

We will never agree.

Then you need to adopt a more open mind on the subject, because he’s right.

They’ve got 8 cameras, you really think they can’t do binocular?

I don’t care how binocular or even octo-nocular the camera system is. Multiple cameras don’t help when the limitation is in the lack of reliability of optical object recognition software which processes those camera images.

Remember the very first fatal accident under control of Autopilot? Tesla said Autopilot “confused the side of a semi trailer painted white with a brightly lit sky”. Do you think 8 cameras is going to fix that problem?

Just one rotating scanning lidar detector (or 5 fixed solid state lidar scanners) would do what 8 or 18 or 80 cameras can’t do: Provide active, real-time distance-to-target data for objects around the car.

Spot on PP! For 3D accuracy LIDAR wins.

You say it yourself: the problem is in the software. The fix is improving the software. Adding complex (and actually quite problematic) extra sensors, that only simplify the software side, without adding much value beyond that, is basically a cop-out.

Camera images simply are not adequate for the task. Software processing of visual images from cameras is not reliable. It’s a very complex problem. We humans have a highly developed visual cortex in our brains which is the product of billions of years of evolution. Cameras and computers have nothing even remotely equivalent.

What dependable self-driving cars need is active, real-time scanning of the environment around the car. That requires active scanning with lidar, or high-res radar, or both. Cameras, even using “binocular vision”, are not the solution, despite what Elon Musk keeps claiming.

What makes you an expert on this?

Last time I heard, well-trained neural nets can do certain image recognition tasks *better* than humans already. Getting to that point with the sort of moving images involved here is clearly still a challenge — but very likely not an impossible one.

I think the only problem was for Tesla to call it Autopilot and then market it as something revolutionary. Stupid people will buy Teslas and do stupid things.

Interesting that the original tweet, which had a pronounced anti-Tesla slant, has been subsequently deleted and replaced with a more neutral one… Apparently I’m not the only one who thought the original tweet was not befitting a public agency.

When you keep having cars of the same brand plowing into you, I might have a slightly negative view of that brand as well.

You mean, when the news media keeps reporting only cars of a certain brand plowing into parked fire trucks, and ignoring when the same thing happens to cars of other brands, then one might develop a negative view of that brand.

(⌐■_■) Trollnonymous

The ignition deaths didn’t seem to bother you much from the GM brand…

It sounds like the fire departments should paint a target on the back of their trucks to score the impact.

(⌐■_■) Trollnonymous

My Tesla, when purchased, will never have this AP problem because there’s no way I will pay for AP.

Two drunks in the car:
1: That’s the 3rd red light in a row you didn’t stop for.
2: Oh, I thought you were driving.

We tried EAP (two week trial) this weekend in L.A. On several occasions, it slammed on the brakes because it thought a freeway overpass was a firetruck……. 😉

(Seriously, approaching overpasses at an up or down incline seems to confuse the hell out of it).