NTSB “Unhappy” With Tesla’s Release Of Fatal Model X Crash Info

APR 2 2018 BY ERIC LOVEDAY

The National Transportation Safety Board is none too pleased by Tesla’s decision to release fatal Model X crash info ahead of its full investigation.

NTSB spokesman Chris O’Neil released this statement on the matter yesterday:

“At this time the NTSB needs the assistance of Tesla to decode the data the vehicle recorded. In each of our investigations involving a Tesla vehicle, Tesla has been extremely cooperative on assisting with the vehicle data. However, the NTSB is unhappy with the release of investigative information by Tesla.” 

The NTSB simply wants to be sure of its full findings before any information goes public, but Tesla apparently sees the situation as requiring an immediate response.

Related – Update 5: Video From Immediate Aftermath Of Tesla Model X Crash

Tesla has jumped the gun like this in the past, but no harm was done. We don’t expect any major issue to present itself this time either.

Tesla CEO Elon Musk issued a Tweet (imagine that) in response.

Here’s what Tesla released (and the NTSB is unhappy over) back on March 30:

An Update on Last Week’s Accident

Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.

The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
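For what it’s worth, the headline numbers in Tesla’s statement hang together arithmetically. Here’s a quick sanity check, a sketch using only the figures quoted above (which are Tesla’s own numbers, not independently verified data):

```python
# Sanity check of the figures in Tesla's statement above. All inputs
# are Tesla's own published numbers, not independently verified data.
us_miles_per_fatality = 86e6      # "one automotive fatality every 86 million miles"
tesla_miles_per_fatality = 320e6  # "one fatality ... every 320 million miles"

ratio = tesla_miles_per_fatality / us_miles_per_fatality
print(f"Relative fatality rate: {ratio:.1f}x")  # ~3.7x, matching the claim

# "900,000 lives saved": scale the 1.25 million worldwide deaths by the
# fraction of fatalities that would be avoided at the lower rate.
worldwide_deaths = 1.25e6
lives_saved = worldwide_deaths * (1 - 1 / ratio)
print(f"Implied lives saved per year: {lives_saved:,.0f}")  # ~914,000

# The "five seconds and 150 meters of unobstructed view" figure also
# implies a speed: 150 m / 5 s = 30 m/s.
speed_mph = (150 / 5) * 2.237     # 30 m/s converted to mph
print(f"Implied speed: {speed_mph:.0f} mph")  # ~67 mph
```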

Source: Washington Post

Categories: Crashed EVs, Tesla

29 Comments on "NTSB “Unhappy” With Tesla’s Release Of Fatal Model X Crash Info"

Tesla for the slam dunk. NTSB best get off the bench.

(⌐■_■) Trollnonymous

And the NTSB are “Subject Matter Experts” on deciphering Tesla’s (or anyone else’s) computer logs?

NTSB might not be SME on Tesla’s logs, but Tesla was out of turn here, especially if lawsuits are brought against Tesla. Tesla’s statement was worded very carefully. They told us his hands weren’t on the wheel potentially hard enough to be detected, yet autopilot had not disengaged. What was autopilot seeing to make it think it was okay to proceed? Was there glare in one or more cameras? Were road markings visible, etc.? Context and conditions matter just as much as Tesla’s simple statements of what the driver was doing. What was autopilot doing?

“They told us his hands weren’t on the wheel potentially hard enough to be detected, yet autopilot had not disengaged”

What article are you reading?
“In the ***moments*** before the collision,…. Autopilot was engaged …
The driver had received several visual and one audible hands-on warning ***earlier in the drive***”

Nothing in Tesla’s release states that the driver’s hands were not on the wheel at the time of the crash, and indeed, Autopilot engaged only “moments before the crash” would never have generated a hands-on-wheel warning, nor would it have brought the car to a stop (like in the case of the drunk driver napping on the bridge in SF).

Not taking any blame away from the driver – or Tesla. People are too quick to form conclusions based on minimal information and media hype. There is not enough information available to the public at this time to reach a non-speculative position on cause or blame. This phenomenon applies to EV fires, AV crashes, police shootings, and many other “hot” media topics.

The NTSB takes the time to collect and analyze ALL the available data before issuing statements and findings. They want Tesla to do the same and not try to spin the story.

Say what?!?! Is English a second language for you? It’s pretty clear: Tesla’s statement that “…the driver’s hands were not detected on the wheel for six seconds prior to the collision” is Tesla saying “The evidence we have indicates the driver was not touching the wheel at the time of the accident, nor at any time for (at least) six seconds prior to that.”

There are lawyer-like press releases from Tesla which need to be carefully parsed, but this ain’t one of them! In this part of its public statement, Tesla appears to be reporting what information it has as clearly as possible, and carefully not overstating what the data indicates, as any good engineer or scientist would do.

Now, one might suggest that other parts of Tesla’s statement, such as noting that earlier in the same drive the driver ignored repeated visual warnings about remaining in control of the vehicle, were cherry-picked by Tesla to suggest that the driver wasn’t as attentive to driving as he ought to have been. But the actual statement “the driver’s hands were not detected on the wheel for six seconds prior to the collision” seems as clear as one could get based on…

The NTSB is trying to keep Tesla from swaying the public by putting out detailed information that cannot be validated as 100% accurate or truthful.

A person died while using Autopilot, and the NTSB is trying to figure out how to regulate a world where the manufacturer can potentially tamper with the ‘black box’ data and mislead authorities.

Expect greater regulation coming from the government on these matters.

Expect governmental regulating agencies to have less and less impact on public perception, as the 24-hour news cycle dominates the spread of information in the modern world. People accused of crimes are tried, convicted, sentenced, and punished by the news media within 24 hours after a crime was committed, even though forensic examination of the evidence may take weeks, and the actual trial may happen only months or even years later.

Investigative agencies, such as the FAA conducting investigations of airliner accidents, have similarly found themselves being pushed to produce the results of their investigations far sooner than they have in the past. As I recall, the FAA has within the past few years been pushed into releasing preliminary results, without waiting for all the evidence and facts to be thoroughly examined before issuing a public statement about its conclusions.

Similarly, the story here was already out of control, judging by all the wild speculation seen in comments here on IEVs, and we can be sure that less EV-friendly social media forums had far more negative comments posted on them. Tesla had to release the info it has as soon as possible, to have any control at all over what was…

From watching the video of another Tesla, it looks like AP simply followed the lane marking for the wrong lane. If that is true, the AP just drove the car into the barrier.
As has been fully discussed elsewhere, having the driver take over does not work. Drivers get bored and quickly ignore the task of driving.

Poor dude most likely tried to correct at the last minute and ran out of time.

His brother claimed he had “trouble there 7 or 10 times before” which is odd on a couple levels. If it doesn’t work there two or three times you would think he would disengage it in that area going forward. But the brother also claimed that he took it to a Tesla shop for repair and may have thought it to be fixed, there and everywhere else.
Or the brother could be on a fishing trip, thinking Tesla has deep pockets.
But another Tesla apparently had trouble with that spot a couple days ago as well. I think AP isn’t ready for the big time. People think that it is more capable than it really is. So far.

My guess is that he did nothing. The car simply ran into the barrier. He probably wasn’t paying attention, and thus didn’t know where he was or that he needed to take over.
From watching the video, it appears that the car followed the solid white line on the left as a lane. Since we know AP ignores solid, immobile objects, it simply drove into the barrier. It didn’t turn into it, but simply didn’t turn away from it. I expect they have the logs, and the actual actions may come out if it goes to trial.

While my Bolt has lane detection, it does not work well and I do not trust it. It also beeps and tells you to put your hands on the wheel within a second or two if you let go. IMHO the technology is not there yet.

“Poor dude most likely tried to correct at the last minute and ran out of time.”

No, Tesla’s data shows the driver’s hands were not on the wheel at the time of the accident, nor for at least 6 seconds prior to it.

What Tesla is saying is that Autopilot + AutoSteer were in control of the car when the accident occurred. We can be sure that if Tesla had any evidence to the contrary, any evidence that would suggest in any way that the driver might have been in control of the car, rather than AutoSteer, then they would certainly have included that in this press release.

Exactly. Hands should be forced to be on the wheel at very short intervals to keep the driver’s attention on the driving (like other manufacturers are doing).
Tesla’s system is level 2 pretending to be level 3. This kills people by, e.g., driving them under a truck or into a concrete wall, while at the same time being marketed as soon to be level 4 or 5…
The additional safety provided by the “autopilot” driver-assist features, e.g. emergency braking, would not be affected by forcing the driver to keep his hands on the wheel, still achieving the lower number of fatalities cited in Tesla’s statement.
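To make that concrete, here is a minimal sketch of the kind of hands-on-wheel watchdog the commenter describes. The torque threshold, timeout, and escalation steps are illustrative assumptions, not any manufacturer’s actual values.

```python
import time

# Minimal sketch of a hands-on-wheel watchdog. All numbers here are
# illustrative assumptions, not any manufacturer's actual parameters.
HANDS_ON_TIMEOUT_S = 10.0   # hypothetical: escalate if no torque for 10 s
TORQUE_THRESHOLD_NM = 0.1   # hypothetical: minimum torque counted as "hands on"

class HandsOnWatchdog:
    def __init__(self, timeout_s: float = HANDS_ON_TIMEOUT_S):
        self.timeout_s = timeout_s
        self.last_hands_on = time.monotonic()

    def report_steering_torque(self, torque_nm: float) -> None:
        # Treat any measurable steering torque as evidence of hands on the wheel.
        if abs(torque_nm) > TORQUE_THRESHOLD_NM:
            self.last_hands_on = time.monotonic()

    def check(self) -> str:
        # Escalate the longer the wheel goes untouched.
        elapsed = time.monotonic() - self.last_hands_on
        if elapsed < self.timeout_s:
            return "ok"
        if elapsed < 2 * self.timeout_s:
            return "visual_warning"
        if elapsed < 3 * self.timeout_s:
            return "audible_warning"
        return "disengage_and_slow"  # controlled stop if the driver never responds
```

The policy argument in this thread is essentially about how small `timeout_s` should be, and how aggressively the escalation should proceed.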

Josh Brown was known for sleeping at the wheel and was recorded by a fellow motorist doing just that.

He was responsible for his own shortcomings, Hans!

Yes and no. It was irresponsible of him, yes, but also of the manufacturer to enable people to do this. This could easily have been prevented by a stricter mechanism to check that the driver is still in control, e.g. forcing the driver to put his hands on the wheel, just like the other manufacturers do…

“…still achieving the lower number of fatalities cited in Tesla’s statement.”

I find it very odd that you understand that truth, you understand that use of Autopilot + AutoSteer makes Tesla cars safer, yet you are still advocating making AutoSteer useless by requiring the driver’s hands to be on the wheel at all times.

Do you not understand that if a driver had to keep his hands on the wheel at all times, almost no one would ever be motivated to use AutoSteer? Do you not understand that the advantage of that system (in addition to improving safety, which is not obvious to the driver) is that it allows the driver to leave his hands off the wheel for most of the drive… an advantage which is obvious to the driver?

The issue here is that AutoSteer allows the driver to relax so much that he takes his attention off the road. Forcing the driver to leave his hands on the steering wheel at all times would not alter that problem at all. It would only mean that his arms would get tired in addition to his attention wandering, and also would likely get him rather…

I bet that in the self-driving development division at Volkswagen’s secret research bunker, no CEO or board member was giving the non-existent order to equip Volkswagen’s data loggers with impact-activated self-destruct mechanisms to prevent future investigations by government agencies. In fact, there is not even a secret research bunker or self-driving development division at Volkswagen, and the accusation that Volkswagen even has a CEO is an outrageous lie itself, made up by enemies of the ICE car-tel!

I think you are in the wrong forum Mr Off Topic.

Some Guy, Some Topic

LOL

Caltrans is the one to be most unhappy with.

Caltrans is a complete joke.

No matter how you slice it, unless you have driven a self-driving car you are forming opinions about something you know almost nothing about. It’s like discussing a book you have never read…

The simple hard facts are: (1) per Tesla’s figures, you are 3.7 times less likely to be in a fatal accident in a Tesla with Autopilot than in the average car; (2) some insurance companies are offering as much as 40% reductions in premiums for driving a Tesla. Hmmmm… I wonder why? :-))

Too many people are far too willing to make perfection the enemy of progress…

Driving cars is a very dangerous business. Here in South Florida people tailgate at high speed as if it were their god-given right, and murder far too many people on our highways. If all those people let the car drive (even just speed control & braking), the car would keep a safe distance, stop if the car in front of them stopped, and we would save a lot of lives.

Caltrans is hugely liable.

There is zero excuse for not replacing the safety device on the day the damage happens, or at least setting up barrels to mark it while it is not in place.

I found both the Uber-Tempe and Tesla-California crashes instructive about the limits of dynamic cruise control on both our 2014 BMW i3-REx and 2017 Prius Prime. The root cause is the relatively short range of the sensors monitoring the area in front of our cars.

When approaching stopped cars outside of the range, our cars do not know the relative closing speed. In these cases, the safe way is to manually brake early just as we would if there were no dynamic cruise control.

In traffic, best practice is to set the dynamic cruise control speed faster than the lead car so it remains in range. Doing this, our cars will safely come to a complete stop even at a traffic light and easily resume speed. In effect, dynamic cruise control is an electronic link to the lead vehicle. Coming back into range does not give enough time to resume following speed and maintain a safe distance.

These fatal accidents and the associated videos give a clue as to how dynamic cruise control can be used safely and efficiently.
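To put rough numbers on the sensor-range point above, here is a small sketch of the stopping-distance arithmetic. The reaction time, deceleration, and sensor ranges are illustrative assumptions, not measured values for any of the cars mentioned.

```python
# Can a car detect a stopped object and brake in time? A rough check:
# distance covered during the reaction delay, plus braking distance
# v^2 / (2a), must fit inside the sensor's detection range.
def can_stop_in_time(sensor_range_m: float,
                     closing_speed_ms: float,
                     reaction_time_s: float = 0.5,  # assumed system latency
                     max_decel_ms2: float = 6.0     # assumed firm braking
                     ) -> bool:
    stopping_distance = (closing_speed_ms * reaction_time_s
                         + closing_speed_ms ** 2 / (2 * max_decel_ms2))
    return stopping_distance <= sensor_range_m

# Closing on a stopped car at ~31 m/s (about 70 mph):
print(can_stop_in_time(sensor_range_m=150, closing_speed_ms=31))  # True
print(can_stop_in_time(sensor_range_m=60, closing_speed_ms=31))   # False
```

This is why the commenter’s practice of keeping the lead car within range matters: a tracked lead car closes slowly, while a stopped car first detected at the edge of a short sensor range closes at full speed.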

The low-resolution Doppler radar detectors used by various automakers for their automatic emergency braking and their dynamic cruise control are not intended to detect stationary objects. They are intended to detect only moving objects.

So if your DCC does not detect a car stopped in the road in front of you, that’s no surprise at all.

Similarly, it shouldn’t be a surprise that this Tesla didn’t stop for the barrier at the end of that concrete dividing wall. That’s a fixed object, and Tesla cars have to rely on camera images and very unreliable optical object-recognition software to “see” such obstacles; that approach generates so many false positives that it can’t be relied on.
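For illustration, here is a minimal sketch of the stationary-target filtering described above. The threshold and sign convention are illustrative assumptions about how such filters typically work, not details of any specific radar stack.

```python
# Sketch of stationary-target filtering in a radar-based cruise system.
# For a target directly ahead, closing speed = ego speed - target speed,
# so a stationary object (barrier, sign, overpass) closes at exactly the
# ego speed and looks just like roadside clutter.
STATIONARY_THRESHOLD_MS = 2.0  # assumed: targets slower than this are dropped

def is_trackable_target(ego_speed_ms: float, closing_speed_ms: float) -> bool:
    target_speed = ego_speed_ms - closing_speed_ms
    return abs(target_speed) > STATIONARY_THRESHOLD_MS

# Lead car doing 25 m/s while we do 30 m/s: closes at 5 m/s -> tracked.
print(is_trackable_target(30.0, 5.0))   # True
# Concrete barrier: closes at the full 30 m/s -> looks stationary, filtered.
print(is_trackable_target(30.0, 30.0))  # False
```

Filtering those returns avoids phantom braking for overpasses, signs, and guardrails, at the cost of not braking for genuinely stopped obstacles, which is exactly the trade-off at issue in this crash.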

Did Elon just tweet an NTSB-uuuurrrrnn?

So long as Tesla is releasing only Tesla’s data, and not the NTSB’s data, I don’t see what the NTSB has to complain about.

Tesla is operating by the standards of the 21st century, where information moves at the speed of a 24-hour news cycle. If Tesla had sat on the info until the NTSB had concluded its investigation, that would have been long after the cries of “Tesla is hiding something!” were all over the internet. In fact, as it was, we already saw that insinuation in a few posts from the serial Tesla bashers who infest IEVs comment threads.

It’s entirely proper for the NTSB to take its time and conduct a thorough and full investigation. It’s also proper for Tesla, in the modern era, to release bad news as soon as it can. This is doubly true from the perspective of other recent bad news for Tesla. Far better, from a public relations standpoint, to release all the bad news at once rather than to have it come out in dribs and drabs over weeks or months.

Translation: We want to be viewed as the only legitimate collector and distributor of data, because we are the government. Unfortunately, since we are a massive, bloated bureaucracy, we can’t possibly compete with the instant information release of an agile technology company that voluntarily releases information without being funded through extortion. Therefore, we will condemn the release of information THAT ISN’T EVEN OURS so that we can desperately try to appear relevant in an age that is increasingly questioning our legitimacy and worth.

Is there any written or unwritten convention or rule that companies being investigated by the NTSB can’t publish data first?

Usually companies don’t release information while investigations are pending, because they worry about liability claims from the public, but I don’t think it really affects the NTSB’s work in this case. Their findings will stand if they do the investigation correctly, regardless of what Tesla PR says.

I googled it a bit, and it seems that sometimes other agencies (like the FAA) disagree with NTSB findings, or a jury at trial may disagree with NTSB findings.