Vehicle Logs Confirm Tesla Autopilot Prevented Possible Pedestrian Death


Tesla Confirms Autopilot Prevented Crash

Autopilot Demo

Tesla’s Autopilot system has been in the spotlight as of late, mostly due to a recent fatal crash, but this is the side of the Autopilot story that doesn’t often make the news.

Tesla says that vehicle data logs confirm that Tesla Autopilot prevented possible pedestrian death.

The story, as told by a Tesla owner, is contained in the email above. Upon receiving the email, Tesla examined vehicle data logs and found that Autopilot did indeed prevent the crash that could’ve killed a pedestrian.

We don’t expect major media outlets to pick up this story, but it’s the untold side that helps paint the whole picture.

Yes, there have been a few Autopilot crashes, but the system has prevented far more accidents than it has caused, at least according to Tesla.

Tesla has data on tens of millions of miles driven on Autopilot, yet only a handful of Autopilot-related crashes have been reported (most later determined to have occurred when the car was not on Autopilot). We think it’s safe to assume that if Autopilot were responsible for more crashes than it prevented, Tesla would wisely disable the system. Tesla has not disabled it, which to us implies that it’s working at least as well as expected.

Now if only the general media would pick up both sides of the story. Is that too much to ask?

Categories: Tesla


60 Comments on "Vehicle Logs Confirm Tesla Autopilot Prevented Possible Pedestrian Death"

Will this be on FOX news tonight?

Keep dreaming. No chance anything positive about any EV will show up on Fox

Yes, with the headline “Tesla Autopilot nearly hits pedestrian!!!”


No, nor will the 330k-vehicle recall over a faulty wiring design that can cause airbag deployment to fail. Deaths have already been attributed to that failure, but those deaths don’t matter to the media, and neither will the Model S accident prevention make the news.

Is it too much to ask that the media report these events accurately?

This save was brought to us courtesy of Automated Emergency Braking with Pedestrian Detection.

This is a common technology, standard or available as an option on many cars. It comes standard on the Model S as an Autopilot Safety feature. AEB is a life-saving technology. No doubt. No one is disputing that.

The media controversy is the result of Autosteer, an optional Autopilot Convenience feature, which is an excellent lane centering system, but that’s all it is. It does not a self-driving car make, but people are using it that way.

Exactly. Automatic emergency braking has nothing to do with Autopilot. It predates Autopilot, is standard on many cars, and, unlike Autopilot, can’t be explicitly turned on for Tesla models. Furthermore, Tesla’s implementation is not the best out there. BMW and Subaru have better versions.

It’s strange to see Elon so factually incorrect; Autopilot had nothing to do with this, and he knows it, unless you are being really generous and consider AEB a subsystem of Autopilot.
I guess they are really rattled by the Autopilot incidents, claimed incidents, and media coverage.

Really? Maybe he doesn’t care because he is desperate to quiet the many critics of the Autosteer feature, which has lulled many Tesla customers into complacency and into misusing the feature due to Elon’s blustering and overstating of its capabilities.

Tesla breaks Autopilot into two sections: Safety and Convenience. AEB is considered part of Tesla’s ‘Autopilot Safety Features’. So yes, from a Tesla standpoint, it is still Autopilot that prevented the accident.

So, when Tesla’s Automated Emergency Braking fails and a Model S runs under a semi trailer, killing the driver, you say, “Well, that just shows Autopilot is too dangerous to use.”

But when the AEB slams on the brakes and saves a pedestrian from a serious and possibly fatal accident, you say “Well, that was the AEB and not Autopilot.”

So, is it just that you’re quite confused, or are you deliberately using a bait-and-switch argument to bash Tesla?

Reality check: All Tesla’s semi-autonomous safety features are considered part of the “Autopilot” package, whether or not they have to be enabled by the driver. That includes AEB.



Anybody who doesn’t +1 your post is crazy.

I never conflate AEB with Autopilot. Tesla owners get AEB whether they pay extra or not. It’s enabled whether Autopilot is on or not. AEB is a standard feature that many other high end cars without any Autopilot-like features have.

Most importantly, AEB does not lure drivers into complacency and distraction by claiming to be something it is not. Exactly zero people expect AEB to prevent a collision, because it doesn’t bring the car to a full stop; it only reduces the speed. Thus, while Autopilot contributed to a fatality, AEB could not have.

Autosteer didn’t kill the Florida driver, not even through its inaction. It was the automatic braking system that failed to recognize the truck. The Autosteer portion of the Autopilot system played no role in this death.

AEB only works at SLOW SPEED, not high speed, on all cars out there that have it.

The 2017 Volt has AEB. It only works BELOW 37mph. If the speed is higher than that, the software ignores stationary objects completely.

Saying AEB is equivalent to TACC detection is a stretch. They are two separate subroutines. They may use some of the same hardware, but they are not the same in action.

I know where you’re going with this. Well, the watching-Harry-Potter accident in FL was the product of a failure, whether it was AEB failing to recognize the middle of the trailer or the Autopilot. You cannot have it both ways: if you say FL was an Autopilot failure, then this near-accident in NY was an Autopilot success; if you say it was only AEB, then the same goes for Florida. Autopilot is composed of several systems, and AEB is a part of it.

AEB has one clear mission: avoid a collision. AP is more of a convenience, but the two can work together, because AEB is needed for AP, and AP can perhaps save AEB from failure by steering around an obstacle when there is too little time to brake, although AEB could be made to do that independently of AP. Tesla likely finds it easier to merge the two into one. They are still separate things, though: one is a safety enhancement, the other a convenience option for those who want it.

No witnesses, no corroborating evidence. Who’s to say this isn’t just a Tesla fanboi making this up?

Now if the PEDESTRIAN involved had written the post, that would be different.

Elon says he checked the logs. How dare you imply that Elon is lying. Every Tweet from Elon is 100% accurate and 100% the truth. Elon is always right!

Oops! Elon Tweeted that this happened in NY, but the email clearly says this happened in Washington DC. I guess Elon forgot to double-check the location where the incident occurred.

It truly amazes me, it really does, that a hardcore Tesla basher who constantly writes posts about Tesla containing half-truths and/or outright lies, and in your case, sven, someone who knowingly spreads a smear campaign based on fabricated evidence, would point the finger at someone else over a question of truthfulness.

One thing is certain: Despite the level of hype coming from Elon, we can certainly trust what he tweets far more than what hardcore Tesla bashers like you post.


Are you stupid enough to think that Elon personally checks logs?

There are cars on the road for 28 $ with pedestrian systems…

Actually Volvo has had this as a standard on all cars for many years.
But those cost more than 28$.

Brakes, brakes, brakes. Gimme a break from “breaks”.

There loosing they’re breaks. 😉

The only way to really tell if Autopilot is safer than non-Autopilot driving is not with anecdotes, good or bad, but to compare the death rate per unit of vehicle miles traveled.

Currently, we’re at about 140 million VMT driven with Autopilot. The estimated annual mileage death rate for all cars is about 1.3 deaths per 100 million VMT.

Thus, we would expect about 1.8 deaths in 140 million VMT if Autopilot driving had no effect, and we have one. So, it’s too soon to say whether Autopilot has had a meaningful impact on safety.

We need to examine the number of Autopilot fatalities beginning no earlier than something like 500 million VMT. Until then, we don’t really know.
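As a sanity check, the commenter’s expected-fatality figure can be reproduced with a couple of lines. The inputs below are the comment’s own numbers (140 million Autopilot miles, ~1.3 deaths per 100 million VMT), not official data:

```python
# Back-of-envelope check using the figures quoted in the comment above.
autopilot_vmt = 140e6            # vehicle miles traveled on Autopilot (per the comment)
fleet_death_rate = 1.3 / 100e6   # deaths per mile, all U.S. driving (per the comment)

expected_deaths = autopilot_vmt * fleet_death_rate
print(f"Expected fatalities at the fleet-average rate: {expected_deaths:.2f}")
# prints 1.82, matching the ~1.8 figure above
```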

This is how Musk wants you to think, but it’s completely bogus. I will stipulate a Tesla on Autopilot is safer than a 15 year old Camaro with bald tires driven in the rain by a teenage methhead.

The proper comparison is a similar late model luxury car. IIHS tracks driver fatalities by model. The Mercedes C class has ~0.07 driver fatalities per 100 million VMT. By this measure Tesla on autopilot is 10 TIMES WORSE. The autosteer sample size is way too small to be meaningful, but the early data favors Consumer Reports over Musk.
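For what it’s worth, the “10 times worse” claim does follow from the numbers given in the comment (one fatality in ~140 million Autopilot miles versus the quoted C-Class rate). This is only a sketch of that arithmetic, using the comment’s unverified figures:

```python
# Comparing the implied Autopilot fatality rate against the quoted
# Mercedes C-Class rate (~0.07 driver deaths per 100 million VMT, per the comment).
tesla_deaths = 1
tesla_vmt = 140e6
c_class_rate = 0.07 / 100e6      # per-mile rate from the comment

tesla_rate = tesla_deaths / tesla_vmt
print(f"Implied ratio: {tesla_rate / c_class_rate:.1f}x")
# prints roughly 10.2x, i.e. about "10 TIMES WORSE"
```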

Agreeing: isn’t the non-AutoPilot Model S supposedly already one of the safest cars ever? Really needs to be an apples-to-apples comparison getting rid of as many systematic differences as possible (e.g. you probably even need to constrain the comparison to have the same weighted average of road types and weather conditions where AutoPilot gets used).

Publicly available statistics are insufficient, and what systematics have been considered in Tesla’s public statements are not clearly stated.

Gene said:

“…isn’t the non-AutoPilot Model S supposedly already one of the safest cars ever?”

Good point.

Elon playing fast and loose with statistics in his claims about the safety of Autopilot is definitely a case of “Figures don’t lie, but liars figure.”

Fortunately, it seems pretty clear that Tesla is improving the capabilities of Autopilot (and AutoSteer) fairly rapidly… much more rapidly than the NHTSA’s study and analysis will be able to react to. However safe Autopilot/AutoSteer is (or isn’t) today, we can be pretty confident it will soon be even safer.

Leptoquark said: “The only way to really tell if Autopilot is safer than non-Autopilot driving is… to compare the death rate per unit of vehicle miles traveled.”

Well, that may be the easiest way, but it would actually be far better to include near-misses like this one. That would give you far more data on which to base a conclusion. The death rate alone is much too low, thankfully, to form the basis of an informed opinion at this time.

Of course, using data from all near-misses would be an ideal case, but not a practical one. Unfortunately, such near-misses are likely to go unreported. If Tesla really wanted to conduct a study, it might team up with Stanford or some other university, and poll Tesla car owners to ask if they’d like to opt in to a study. For study participants, if any car had an AEB incident, the owner could be e-mailed and asked to describe the incident. The study team could then assess whether the incident possibly avoided an accident, was a false positive, or fit some other category such as “undetermined”. Of course this would be partly or mostly subjective, but…

As noted above, this was AEB which is a great safety feature. No one has a problem with AEB, lane departure warning and similar collision avoidance systems which wait in the background, always watching out for danger and ready to take action when a wreck is otherwise unavoidable. Hurrah for AEB, hurrah for Tesla and the other carmakers who implement it.

None of this has anything to do with Tesla’s aggressive implementation of Autosteer, which is NOT a system that waits in the background but one that takes over the key driving function and encourages the driver to disengage.

If you’re gonna split hairs that fine, then the one fatal accident in which a Model S ran underneath a semi trailer wasn’t the “fault” of AutoSteer, either. AutoSteer and Automatic Emergency Braking are “two separate things” according to your criteria.

But actually they’re not independent at all. They both depend on the same sensor hardware and on the Autopilot software’s ability to “see” obstacles and initiate braking.

You don’t get to have it both ways.

You are correct, autosteer did not directly cause the Florida wreck. That was caused by the semi driver’s failure to yield right of way and the Tesla driver’s failure to watch the road.

My issue with autosteer is that it ***encourages*** drivers to take their eyes off the road. No matter what the manual and activation screens say, the feature itself encourages dangerous behavior. As such, it contributed to the accidents in Florida and Pennsylvania.

Contributing factors are sometimes ruled more at fault than the direct causes. The jury in the recent headline case here in TX assigned 25% of fault to the driver who rear-ended the victim’s car (direct cause of accident), 20% to the parent who didn’t properly restrain his 10 year old son and 55% to Audi whose seat structure failed (despite meeting the NHTSA standard for seat structures).

I disagree with that jury, but that’s beside the point. I care about safety. A feature which encourages Tesla drivers to take their eyes off the road makes the roads more dangerous for me and you. Such features should be disabled except in Level 4 cars.

The Tesla driver was using Autopilot to drive on a city street, and a pedestrian stepped in front of his car. There are two possible outcomes:

1) The Tesla stops before hitting the pedestrian, and Elon Tweets praising how well Autopilot worked in preventing this accident (even though it was actually Automatic Emergency Braking and not Autopilot that prevented the accident).

2) The Tesla hits the pedestrian, and Elon Tweets how the driver misused Autopilot in a situation it was not designed to handle: the Tesla driver was using Autopilot on a city street when Autopilot is to be used only on a limited-access divided highway.

Elon should be telling people NOT to use Autopilot on city streets, because it’s designed for use on divided highways. Elon should also be telling people that when driving on city streets with Autopilot off, the Automatic Emergency Braking system is designed to stop the car and avoid hitting a pedestrian in these types of situations (slow driving on residential or city streets).

Once again, sven, you are “pushing the envelope” hypocrisy to insinuate Elon Musk is lying in his tweet, while very obviously doing that yourself. In this case, your disinformation and mendacity are painfully obvious.

You know perfectly well, sven, that it’s AutoSteer which is not intended to be used on non-divided roads. Not Autopilot. Some Autopilot features are always on and cannot be turned off. And again, you know that perfectly well.

Do you really think you’re fooling anyone with such obvious B.S.? I think very few if any regular InsideEVs readers are that easily fooled.

My first thoughts were, pardon the point form,

-Driver was distracted by sirens and did not slow down.
-Driver had trouble seeing with glare in eyes and did not slow down.

I call BS on the glare; this was a well-lit city street, not some dark, unlit rural road on a moonless night.

Perhaps if you didn’t so casually and routinely post B.S. about Tesla yourself, sven, then you wouldn’t be so quick to accuse others of doing that.

The report in the article clearly says “…it was night time, there was a lot of glare from the headlights of oncoming cars…”

I see no good reason to doubt that report. On the other hand, there are plenty of excellent reasons to doubt everything you post which is even remotely related to Tesla Motors and its cars.

Bite me troll. Your ad hominem attacks are as lame as ever.

Have you never been in the big city, farmboy? Unlike rural Kansas, Washington DC has plenty of bright streetlights. On a main thoroughfare like New York Avenue, there is enough ambient light from streetlights that drivers don’t get blinded by the glare of the headlights from oncoming traffic.


The amount of street light is irrelevant if it doesn’t cause other drivers to turn their lights off.

I’ve driven on many late nights in the city and seen this.

sven sputtered:

“On a main thoroughfare like New York Avenue, there is enough ambient light from streetlights that drivers don’t get blinded by the glare of the headlights from oncoming traffic.”

Seriously, dude?

Now you’re making us seriously question that you’ve ever driven at night thru a large city. It only takes one single car with its brights on to dazzle you with glare at night, when it gets close to your car on any street with two-way traffic. The presence or absence of streetlights is pretty irrelevant to that.

How is it possible for you to not know that?

* * * * *

I dunno whether you’d consider the Greater Kansas City area, where I now live, to be a “big city” or not. It’s not among the largest 25 cities in the U.S., but it’s certainly much larger than the various small towns where I spent my childhood!

Fly-over state… Doesn’t count.

(I keed, I keed!!)

Yet more proof that what I said in another story is true.

Sven clearly does not own a car.

Perhaps it’s just me, but this doesn’t sound like Autopilot did anything. I’m betting that some collision avoidance system, just like on many other cars, was at work here. Or are we now calling anything a Tesla does “Autopilot”?

Don’t get me wrong, bravo but still doesn’t change the fact that Autopilot as it’s being marketed isn’t ready for prime time.

Short answer: Tesla Motors includes all the automated or semi-automated driver assist features under the “Autopilot” umbrella. This even includes ACC (Adaptive Cruise Control).

Quoting from an article on Tesla Autopilot:

The first milestone is ADAS or Automated Driver Assistance Systems. An ADAS system assumes that the driver is in control of the car for most of the time but will provide assistance or emergency capabilities. This includes the likes of AEP [sic] (Automatic Emergency Braking), Adaptive Cruise Control, Collision Avoidance Systems and similar features. This is something that is now part and parcel of most high end (and mainstream) vehicles with a select few (including Tesla) even having advanced ADAS like Emergency Auto Steering.

Step away from the twitta…

When a slow media day and a few handfuls of mobile device characters from a multitasking executive has the capacity to easily move markets, I wonder if auditors consider this a reportable risk.

Certainly a more professional corporate messaging effort could have made better use of this owner submission than Elon managed there.

Perhaps if defending the feature and technology is so important for Tesla, an adjustment to the marketing mix is in order to remove the upcharge on the more price-sensitive models.

This story is a distraction from the BEV nature of Tesla’s vehicles, perhaps short-lived, but I wonder if the roles of visionary and of leader are unusually at odds here.

Elon even got the location wrong in his Tweet, saying this happened in NY. Per the owner’s email, the incident occurred in Washington DC, and not in the Empire State. 😉

So, your argument is that we should ignore the fact that Tesla Autopilot probably saved someone’s life, because Elon confused “New York Avenue” in Washington, D.C., with “New York City”.

Got it.

Quite the hate-fest going on here.

Can’t help but notice new usernames popping up here to pile on the anti-Tesla FUD.

This simply illustrates how desperate the short-sellers and haters are getting as Tesla gets into position to become not just a major company but perhaps a dominant company in the next decade.

Dear Consumer Reports:

This incident is exactly why it was wrong of you to call upon Tesla Motors to disable Autopilot in its cars.

You’re just spreading more FUD and lies. Stop trying to twist the facts.

Consumer Reports did NOT call on Tesla to disable Autopilot or Automatic Emergency Braking. Consumer Reports did call on Tesla to temporarily disable Autosteer until it updates the program to verify that the driver’s hands are on the wheel at all times. The current version of Autosteer allows drivers to have their hands off the wheel for over two minutes at a time. Below is the exact quote from the Consumer Reports article.

“Consumer Reports Calls for Tesla to Do the Following:”

“- Disable Autosteer until it can be reprogrammed to require drivers to keep their hands on the steering wheel.”

sven, whenever in doubt think:
“the FUDiot does Not actually exist. scroll”

it is simply Pointless to go on – no matter what you post to inform, enlighten, or just (like your detractor) mutter and gripe about, Malfoy, Crabbe and Goyle are GOING to respond in any mal-informed manner that suits their Mood, and many regular readers need only see the ridikulus (intentional misspelling, in case your superior intellect gets ahead of common sense, Malfoy) call-sign to INSTANTLY scroll past the latest multi-paragraph missive. (Cute self-impressed Icon, yep, scroll)

I cannot for the life of me understand why one Respected poster here has joined the FUDdite bandwagon, but Outside of Tesla articles, I’ll continue reading his thoughts, but MCG? Never, Ever again and deeply appreciate each second of my life not wasted on his/their self-serving, chest-thumping mental-masturbatory excrement that constantly escapes from what he/they believes to be a reasonable mind.

sven said:

“You’re just spreading more FUD and lies. Stop trying to twist the facts.”

This thing where you accuse someone else of what you frequently do, but they never do…

Not working for you, dude.

sven continued sputtering: “Consumer Reports did NOT call on Tesla to disable Autopilot or Automatic Emergency Braking.”

Quite aside from your faintly amusing, and absurdly desperate, attempts to paint my positive comments about Tesla as negative “FUD”, sven, that statement is factually incorrect. To quote from the article you yourself linked to: ” ‘Autopilot’ can’t actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time. Tesla should disable automatic steering in its cars until it updates the program to verify that the driver’s hands are on the wheel.”

But hey, sven, nice truth-twisting there in quoting that last part out of context so the reader can’t see that it’s “Autopilot”, and not “AutoSteer”, that Consumer Reports is calling out Tesla to disable. That sort of disinformation, that kind of half-truth, is something you’ve practiced until you’re rather good at it, aren’t you? So, congratulations… I guess.

Now, to be fair to Consumer Reports — despite the fact they’re not being fair to Tesla Motors — CR did later in the same article call upon Tesla to disable AutoSteer, specifically. However, before that, the article refers to Autopilot no less…

I believe your self-important choice of words, “Did you not see my previous response” ignores (to your self-serving benefit) the Very Real Possibility that sven is simply the last regular poster here that does not instinctively scroll past every single thing that you write as they have learned that self-important mega-posters simply take too much time to read, particularly when they are neither informative, nor remotely humorous or in any way enjoyable — there is no benefit in doing so.
Can you let us all know the donation site for “H-help buy P-P-Policeman a Life” TIA.

I actually find this case interesting.

If the driver had braked, as he should have, by reacting in the last seconds, he could still have hit the pedestrian by braking too late or not hard enough. And by braking manually, he would have disengaged the Auto Brake system.

So, the “slow reaction” of the driver is actually a blessing in this case.

Compare that with the LA crash, where the driver thought the system wasn’t reacting, so she stepped on the brakes, but didn’t brake hard enough or was too late. In that case, she “was blamed” for taking over manually…

Now, my question is really can we trust auto-brake 100% of the time? If we can’t, then when do you brake yourself vs. letting the car do it.

In this case, it saved a life. In the LA case, the driver didn’t trust it but ended up in an accident…

The uncertainty is what bothers me.

I am glad it saved a life.

P.S. I hate it when pedestrians cross the road thinking they are above the law and all drivers should yield, even if the drivers can’t see them…

This isn’t D.C. alone.

My car doesn’t have any of Tesla’s features. It does have cruise control which I don’t use because I can get more MPG by driving myself.
I’m not sure I’d like to leave the steering of my car TO the car even if I had that system. I’ve been driving for over thirty years and have had too many electronic things fail in the past.
This AEB system would be a nice feature to have as standard, purely for those few moments when your eye is taken off the road which we ALL have from time to time.
I have had many times a similar system in several cars over the years. Sadly not so ‘passive’ as current systems. It’s called my mother, who is a nervous passenger and tends to shout “WATCH OUT FOR…” and “SLOW DOWN” a lot, normally when the object in question is hundreds of yards in the distance…

Autopilot is a term Tesla applies to four driving features. It would have been AEB that averted the pedestrian collision, and it would be the TACC and Autosteer that contributed to the fatality in Florida. It would be useful for your coverage to make this clear. AEB is offered in other luxury brands, too. Musk didn’t invent it, and he wasn’t the first to introduce it, either. AEB is legit. TACC and Autosteer, branded “Autopilot,” are misleading and can encourage drivers to take risks they should not take.

And before anyone rants about crash statistics, keep in mind that you can’t compare accidents between the roughly 100,000 Teslas (the ones equipped with the sensors that earlier models lack) worldwide and the larger total fleet of vehicles (about 250 million in the U.S. alone). Expensive luxury cars are safer in general, and their drivers tend to be safer drivers. A better comparison is crash statistics of vehicles comparably priced and equipped to the Model S and Model X. I’m not saying Teslas are more or less safe than comparably priced and equipped cars, I’m just saying the crash stats the company uses are a misleading apples-to-oranges comparison.