NHTSA Probe Into Fatal Tesla Crash Expands With New Demands


Scene Of May 7th Tesla Model S Fatality (via ABC News/Bobby Vankavelaar)


Model S


The fatal Tesla Model S crash has been on the NHTSA’s radar since Tesla first informed the agency of the incident way back on May 16, just nine days after the crash, yet it took the NHTSA quite some time (and thousands of international headline news stories) before it opened a formal investigation into the matter.

Now, the NHTSA is asking for more information from Tesla.

Specifically, as Automotive News reports:

“U.S. highway safety regulators have demanded that Tesla Motors Inc. hand over detailed information about the design, operation and testing of its Autopilot technology following a May 7 fatal crash in which the system was in use.”

“The agency’s nine-page letter dated July 8 was made public Tuesday. It requires the Palo Alto-based automaker to file responses in the coming weeks.”

It seems that the NHTSA is actually seeking any and all information related to Autopilot, including design parameters, results from Tesla’s independent testing, etc. Automotive News adds:

“The agency is seeking details of all design changes and updates to Autopilot since it went into use last year, and information on whether Tesla plans updates in the next four months.”

“It also wants records of how many times the system told drivers to put their hands on the wheel and how often that led to the car automatically reducing vehicle power.”

“Other questions include what Autopilot does when cameras and sensors aren’t working properly, how the system was tested, and how it filters out ‘false positive events and interventions.’”

So, basically, the NHTSA is seeking all Autopilot-related data and information and expects Tesla to provide it openly, which we assume Tesla will do once it has collected all the essential information.

Automotive News says that Tesla wouldn’t comment on the NHTSA letter, which is typical during an open investigation.

NHTSA spokesman Bryan Thomas did offer up a comment, saying that the agency “has not made any determination about the presence or absence of a defect in the subject vehicles.”

Source: Automotive News

Categories: Crashed EVs, Tesla



76 Comments on "NHTSA Probe Into Fatal Tesla Crash Expands With New Demands"


Who wants to take bets that by the end of the year, the feature known as “Autopilot” won’t be called “Autopilot” anymore?

I don’t see the name as a problem. As I recall, the cruise control was called “Auto Pilot” on a 1958 Imperial. Back then, more people took responsibility for their own stupidity; now, it seems, everyone is looking for a meal ticket.

With cruise control, you still have to steer the car, so a person using cruise control wouldn’t have come to believe that he should be able to read a newspaper.

Next, cell phones will be forbidden because stupid people text and drive. I think watching a movie is way more irresponsible.

Daniel said: “I don’t see the name as a problem. As I recall the cruise control was called ‘Auto Pilot’ on a 1958 Imperial.” There was an urban myth created when cruise control was something new. The story was this: An RV owner took his new RV out onto the highway, set the cruise control, then left the driver’s seat, going to the kitchen area of the RV to make himself a sandwich. (Naturally, a horrible accident shortly followed. This is an urban myth, after all!) That’s not a true story, but it does underscore the reaction some people have to new technology; their anxiety about it. Cruise control doesn’t make it physically possible for the driver to become a passenger; to take his hands off the wheel and his eyes off the road for extended periods. We’ve seen plenty of videos of “drivers” of Tesla cars with AutoSteer doing precisely that. If that early cruise control named “Auto Pilot” had actually enabled people to do the stupid stuff we’re seeing (and reading first-hand reports of) Tesla drivers doing, then perhaps — probably — there would have been a public outcry about it at that time. I’m a big advocate for…

Bro, it’s the right name for the right features. Changing the name won’t stop stupid people from turning the knob to 11.

Someone (jdw) over on the TeslaMotorsClub forum made a profound comment about dumb people:

“Think about how dumb the average person is and then realize that half the population is even dumber …”


That’s a George Carlin joke. Google it if you want.

I think they should stop using the word automobile, too.

The first paragraph seems to be trying to laud Tesla for hiding the accident for 9 days while attacking the NHTSA for the pace of its investigation.

Interesting prism to read the story through.

Might be a little observer bias.

Remind us again how long it took GM to tell anybody about faulty ignition switches?

Why? Is this a GM article?

Well, years, and ensuing deaths and accidents are GM’s fault. They tried through declaring bankruptcy to vacate their responsibility for these deaths and accidents, though a recent judgement against them will allow individuals to sue. Not good news for GM.

So I see the pertinence of your comment.

Oh and for GM…Like the gambler said, read ’em and weep.

Regarding the ignition switches, whose only ‘fault’ was that, going over a bump with a huge key ring would turn them off, GM apparently got in trouble legally when they tried to address the problem.

Any other automaker, like Ford or Chrysler, would say that only carmaker-supplied keys are to be used with this vehicle, and, since the corporation has no control over how large an oversized keyring is or how much turn-off torque it applies, the customer shall hold them harmless if there are any keys on the keyring not supplied by GM and/or having nothing to do with driving the car.

Any other car company would have said there is no issue here at all.

Yes, it’s the fault of the customer. Although 5 billion dollars in judgments and another possible 10 billion or so, indicate the courts did not find that answer to be the case.

I agree. Editorialization is rampant here.

Tesla knows that, regardless of facts, since they have “friends” in the White House, they are going to be treated with kid gloves. Maybe the NHTSA will invent some dumb new warning label or require an owner to take a short TRAINING COURSE before unlocking Autopilot.

On second thought, requiring a short training course in autopilot might be a great way to prevent autopilot abuse!

I have seen several youtube videos of people who trust autopilot so much they do stupid stuff. Some people text or read internet stuff, or allegedly watch Harry Potter videos (that could kill you, one way or another). 🙂

Jim Whiteheads said:

“Tesla knows that regardless of facts, as they have ‘friends’ in the White House…”

Presumably these “friends” are any staff left over from the Bush Jr. administration, as the DOE loan program was set up under that previous administration.

Or, to put it more succinctly: That appears to be a leak from the hard-right echo chamber, and it’s so far removed from reality that it’s “not even wrong”.

Quoting from a Tesla blog:

The loans are part of the Advanced Technology Vehicle Manufacturing Program, which provides incentives to new and established automakers to build more fuel-efficient vehicles. Created in 2007 and appropriated in September 2008, the $25 billion ATVM aims to reduce America’s dangerous dependence on foreign oil and create “green collar” jobs. The program is entirely unrelated to the stimulus package or the so-called “bailout” funds that General Motors and Chrysler have received.

As a reminder, the Obama administration began in January 2009.

Why are you trying to brings facts into this? (Nice job, BTW).

Good work P2

We keep getting reports on these Autopilot accidents indicating that the accidents were caused because the vehicle operators were not operating the vehicles in accordance with Tesla guidelines. I think those arguments are total hogwash and are just provided by people who don’t want to admit to the current inadequacies of the Tesla Autopilot system. I want autonomous vehicles to succeed as much as the next guy, but we have to admit that the Tesla Autopilot system has some serious problems right now that can’t be justified by Tesla. In every case it was reported that the operator was operating the vehicle under Autopilot in unsafe or illegal conditions, but still Autopilot was allowed to activate. In the fatal accident the operator was travelling well in excess of the speed limit, perhaps even at wreckless driving speeds, but yet Autopilot was allowed to activate. Autopilot should NOT allow the vehicle to be operated illegally and should not be allowed to activate when road conditions do not permit safe Autopilot operation. If an operator wants to speed or operate the vehicle illegally or when road conditions do not permit safe Autopilot operation, then the operator should be required to deactivate Autopilot…

> wreckless driving speeds

If only it were so…

Not paying attention while operating under autopilot is reckless.

25 mph above the posted speed limit is considered reckless driving in some states, so you get a reckless driving ticket instead of a speeding ticket. That’s what I meant with reckless driving speed.


I realize. I was poking fun at the misspelling of reckless as wreckless, which has almost the opposite meaning.

Funny…and some people are feckless.

Plus watching a movie should be a felony

Yes, but so is allowing Autopilot to be enabled in situations when it is known to drive recklessly.

Really? There are a lot of “reckless” things that CAN be done with just about anything. Why does that fact only place the blame on Tesla. As unfortunate as the situation is….Josh Brown would have probably been one of the first people to tell you that what he [allegedly] was doing was strictly against the suggested use of AP. If used in the manner that Tesla states AP is to be used, the system is very safe.

Four Electrics said: “Yes, but so is allowing Autopilot to be enabled in situations when it is known to drive recklessly.” To quote Oscar Wilde: “Everything in moderation, including moderation.” While I question Tesla’s decision to continue to allow AutoPilot/AutoSteer to be used on non-divided roads, where it clearly was not meant to be used, at the same time it would be an impossible goal to ask Tesla to prohibit use of AutoPilot/AutoSteer in all (or even nearly all) cases where it would constitute reckless driving. Here’s an example of why any such attempt will be self-defeating: We’ve already seen reports of people disabling the Model S’s safety features to force the car to drive itself (using AutoSteer) even with nobody sitting in the driver’s seat, despite the fact that there are two safety features to prevent this: The car must detect an adult’s weight in the driver’s seat, and it must detect that the driver’s seat belt has been buckled. Obviously some very determined idiots out there figured out how to defeat both safety devices, and bragged about being able to do so on the Internet. So: Everything in moderation. Yes, Tesla should take reasonable steps to prevent abuse…

Does the Tesla Autopilot work in conjunction with SatNav?

I’m assuming not, as this should be easier to overcome; you would also have the problem of maps needing updating in real time, as roadworks would alter the layouts stored in the SatNav.

No. It is just a glorified cruise control that steers the car within the lane that you are in and changes lanes when you tell it to and the coast is clear. That is it. You cannot put an address in your nav and tell it to drive.

Actually, it does. Autopilot uses GPS to reduce speed on non-divided roads. For the safety of all, though, it should turn itself off completely on those roads.

I disagree where stop & go traffic happens on these roads. Frankly, that is the only “level” I would really miss from automation. Traffic isn’t really a problem, for two reasons. One, the speeds are way down. Two, there is a car in front, where the Mobileye tracking makes the system much more predictable.

I think the greatest, most dangerous, challenges for the system are when a car actually isn’t in front of you, moving the same direction. I believe Tesla gave us the illuminating lane markings (when it sees them) and the vehicle graphic, to let us know the inputs to AP. As one, or the other goes away, manual monitoring becomes more hands on. That much is predictable.

pjwood1 said:

“I think the greatest, most dangerous, challenges for the system are when a car actually isn’t in front of you, moving the same direction.”

Yup. It has been reported that AutoSteer works best at lane-keeping when the car is in heavy traffic, surrounded by other cars. Sometimes the software can’t “see” the lane markings, and it appears to me that in such cases, the software uses the location of other vehicles to figure out where the lanes are.

Another strong indication of just how limited the AutoSteer software is: Tesla cars have a tendency to take an exit ramp off the freeway rather than stay on the highway. Speaking as a programmer, this tells me how very limited the software is at figuring out just where the car is in relation to the entire road.

We’re still at the “crawl before you can walk” stage of autonomous driving software. All too many people posting on the subject seem to think we’re at the “walk before you can run” stage.

The v7.1 Autopilot update put in a new safety restriction that uses GPS data to automatically limit the Model S’s speed on non-divided highways to 5 mph over the speed limit. Tesla could just as easily have put in a safety restriction that completely prohibited Autopilot use when traveling on anything but divided, limited-access roads, but Tesla didn’t do that. Instead, Tesla tacitly approved the use of Autopilot on non-divided highways, choosing to put only a speed restriction on Autopilot when traveling on a non-divided highway. In other words, Tesla thinks it’s OK to use Autopilot on non-divided roads as long as you don’t go more than 5 mph over the speed limit. If Tesla didn’t want you to use Autopilot on non-divided highways, then Tesla could have prohibited or shut off Autopilot every time GPS data says that the car is not on a divided highway.


“Autopilot should NOT allow the vehicle to be operated illegally and should not be allowed to activate when road conditions do not permit safe Autopilot operation.”

So who’s responsible if the driver WAS legally operating in the proper environment but then the road changed to conditions where it is not recommended for AP to operate and the driver fails to disable AP?

I’m not a big Tesla fan but clearly you have to go through enough warnings (also included in the manual but who reads those???) or prompts to make sure the owner should be aware.

No amount of warnings or prompts can protect against the Darwin award.

Autopilot already disables itself spontaneously under some conditions. This situation would be no different.

…..hence why they say to keep your hands on the wheel and stay alert.

As soon as you used the word ‘keep’, you showed your hand, and your attitude. And since it was only the second word, we already know that your observations are actually rants. The connotation of what you say indicates that there are rampant problems, when indeed even the nuance of a problem is just DAYS old, with respect to when we all heard of ‘all those problems’. I am SICK and tired of reading about the horrific speeds that allegedly were being used. FROM THE MANUAL: ‘When using Autosteer on residential roads, a road without a center divider, or a road where access is not limited, Autosteer limits the driving speed. The maximum driving speed is calculated based on the detected speed limit plus 5 mph (10 km/h).’ Since the accident that killed the driver took place on a road with side access, the speed was limited to 5 mph above the speed limit. It is actually quite a cool feature, and probably safer than most drivers in that regard, as the first speed limit sign (which can be missed) slows the vehicle down with a big warning about the limit imposed by autopilot. Those who have not experienced autopilot should…

I’m just commenting on what I read. The first article I read on the fatal accident reported an interview with a woman who stated she was traveling at 85 mph and the Tesla passed her like she was going backwards. The articles I read indicated that Tesla reported the Model S was on Autopilot during the fatal crash.

That road where Joshua Brown killed himself by speeding under a truck has some nasty features:

(1) It’s a divided highway (so Tesla’s speed limiter wouldn’t activate) but it has intersections (very dangerous)

(2) It doesn’t seem to have signs for the speed limit; you’re just supposed to know what the state speed limit is. Tesla’s speed limiter wouldn’t be able to detect the limit anyway.

I assume in your garage you have a hammer.
Is it stupid of the hammer manufacturer to allow you to throw it at someone? I’m sure somewhere in the fine print on your hammer documentation, there is a line that instructs you not to throw it at people…but you have the right to do so anyway. Yet no one would even think about suing the hammer company….weird.

My cruise control “is allowed” to disobey the speed limit, and I could possibly crash with it activated. Should it be outlawed?

Actually, the *driver* disobeys the speed limit when they set their cruise control. Cruise control is just a tool, like AutoPilot, used by people.

If you intend to engage in illegal activities of any kind, then you should not be able to activate the autonomous features of your vehicle while engaged in those activities. I know it’s impossible for older cruise control systems to provide these kinds of limitations, but with smart systems like Autopilot it’s not. As autonomous features become more and more advanced, we will only see the discussion of autonomous features being able to perform illegal activities grow.

I think that eventually governments will get involved and not allow any smart autonomous features that permit traffic violations. Actually, I’m surprised governments haven’t done so already. I think we are going to see a lot of legal battles over autonomous vehicles in the next ten years.

I am surprised, as well…and given the pace of the technological development, and the power of corporate lobbying, I doubt that litigation will extend over the next 10 years…probably within the next 3-6 years.

Automakers are going to want to see laws sooner rather than later so that they don’t feel the need to play guessing games on what will be legal and for what they will be liable once their latest product hits the streets a year or more after development.

BTW the main reason governments are in such strong support of autonomous features on vehicles is to take control away from operators so that the operators no longer perform stupid, dangerous and illegal activities on the highways.

Yep…legislatures and regulatory agencies can’t sit on this forever.

So now Obama’s government goons are taking Tesla’s designs. I doubt they can improve them any, but that’s probably not their intent. Musk already has the best engineers in the world.

lol, exactly.
So now they (NHTSA) all of a sudden have “Auto Pilot” subject matter experts?

No, that’s one of the reasons why they are requesting all of the available information – to see what is going on, how well it was tested, etc.

The mission statement from the NHTSA site says, “NHTSA was established by the Highway Safety Act of 1970 and is dedicated to achieving the highest standards of excellence in motor vehicle and highway safety. It works daily to help prevent crashes and their attendant costs, both human and financial.”

So your and others’ statements implying Obama is somehow directly responsible for the actions of this agency seem a wee bit thin at best.

There is nothing wrong with attaining a deeper understanding of a new safety technology you’re investigating, for the public good. That’s exactly what functional government is for.

“Man dies in Tesla using Auto Pilot”
“NHTSA investigates”
“Thanks Obama”

How did this turn into a “thanks Obama” thing?

If it’s not Obama, then it’s Trump…..lol

For some reason they get thrown under the bus for anything. 😛

Trump has never held a single political office, so the only experience he has at actually causing trouble is bankrupting companies, beauty pageants, and reality television.

And fake universities.


I love it when a real-estate crook calls someone else a crook. Hilarity ensues! 😉

Also, apparently he is, or was, quite in demand with the ladies, as his spokesman Martin Miller (lol) proclaims him to be:

Look, I don’t like Trump and may end up voting for Gary Johnson (I will certainly NOT vote for Hillary), but to be fair he does have some experience taking risks and creating thousands of good paying jobs for the employees of his companies. Just keepin it real.

I predict AP will be disabled till Tesla adds more warning prompts…

“WARNING: You assume all responsibility for the vehicle and anything else it damages. Do you agree?”

“WARNING: Tesla is not responsible for any damages incurred due to driver negligence and understanding this is BETA test software. Do you agree?”

“WARNING: All actions will be recorded to prove your Monkey Azz did not follow all, if any, instructions.”

I don’t know the prompts but I’m pretty sure you are informed enough times that you need to follow the instructions/guidelines.

Maybe this site can get screenshots of all the prompts and post them?

Yeah, because reading all those and agreeing to them all while driving on the highway is safer than just engaging it? You agree once the first time you activate it and that’s it. Who wants to deal with multiple nanny screens every time you get in your car, or worse yet, while you’re driving?

You seem to have experience turning AP on.
Can you do us all a favor and post a screenshot of the prompt, or a video of the process to enable it?

I’m sure everyone would be interested in this…… No?

Bryan Thomas did offer up a comment, saying that the agency:

“has not made any determination about the presence or absence of a defect in the subject vehicles.”

unlike most commenters.

I hope this doesn’t become a turf battle, between NTSB, and NHTSA, at Tesla’s expense.

Referring to NHTSA agents as “goons”. Really open minded of you.

Where there is a fatal accident involving new technology, it is appropriate to investigate.

And if there wasn’t an investigation, you’d probably be back in a few years, yapping about how Elon’s computerized Autopilot goons can assassinate anyone he chooses and nobody did anything about it when the first fatality took place…


In my humble opinion, no AP system is ready for prime time yet. They are putting all drivers, pedestrians, bicyclists, etc. at unnecessary risk.

It’s time automakers came up with a standard for cars to communicate with one another in real time. All control systems use feedback; sophisticated ones use lots of handshaking. Tesla’s was deployed way too early. We’re still a few years out from when this necessary extra layer of safety can be added.

How about starting small, with a cruise control sync from car to car to car, so our cruise control can really control?

It’s the future; incorporate now.

Funny to see the attempts at media spin. The NHTSA did not “demand” anything:

“This letter is to inform you that the Office of Defects Investigation (ODI) of the National Highway Traffic Safety Administration (NHTSA) has opened Preliminary Evaluation PE16-007….and to request information to assist us in our investigation.”

Since this seems like the initial communication, not sure how you term this an “expansion” or interpret this as a “demand”.

The full letter can be found here: http://www-odi.nhtsa.dot.gov/acms/cs/jaxrs/download/doc/UCM533397/INIM-PE16007-64338.pdf

Correct me if I’m wrong, but didn’t Tesla a few months ago offer ALL anonymised Autopilot data to governments for free?

Next the FAA will remove airplane cockpit doors because an idiot in France crashed a plane on purpose. Great, we will be in the dark ages soon. Long live dumb people.

I wondered what the context was for how often NHTSA requests information from carmakers: whether it was unusual to request communications from manufacturers, or if this is business as usual, common for gas cars too.


I gave up trying to put a number on it, when I saw the report of communications with car makers was over 4,000 pages long, with a dozen or dozens of cars listed on each page.

Way too many data points to even attempt to quantify.

NHTSA getting data from carmakers doesn’t seem to be all that rare or unusual, but I’m not going to try to sort through the tens of thousands of data points to make a comparison to gas cars. Just too many ICE car data points to quantify.

This is part of a new, emerging technology – it would be reasonable to say that requests this extensive are common when the possibility exists that new tech in a vehicle has gone awry.

Unfortunately we live in an era when government thinks it knows best for everyone and enacts laws to protect stupid people from themselves… autopilot may in fact be at risk, justifiably or not

Stupidity isn’t the issue – it is business accountability. When you offer a new tech, you need to be clear to people.

As I see it, the question here is not whether people are dumb, but whether Tesla Motors had 1) sufficiently tested AP prior to release, and 2) provided sufficient safeguards and/or warnings to the user.

An example is warning labels – when a hair curler’s operating warnings read “do not operate while sleeping”, it is a gov’t mandate that the company provide the warning.

However, it is NOT illegal to use the hair curler while sleeping. Nobody can arrest you for that.

Furthermore, the company is not liable for any damages or loss of life caused by operating the hair curler while sleeping, as long as the product functioned as designed (no short circuits) and as long as the warning label was there.

Government is not trying to save people from themselves – that’s impossible – but rather, to make sure that people are sufficiently aware of what is in the products they are buying, and what the capabilities and hazards are of the products they use.

After that, it’s your problem.

Love all these comments from people who not only don’t have a Tesla with AP, but also don’t know the current state of TACC technology in other manufacturers’ cars.

There is no reason to have car-to-car communications to make a simple (relative to full autonomous) function work reliably.

Thinking that TACC is a true stepping stone to full autonomy is kind of a stretch by Tesla. Nobody else is going that direction.