NTSB Preliminary Report On Fatal Autopilot Crash: Model S Travelling At 74 MPH

JUL 26 2016 BY JAY COLE 48

Scene Of May 7th Tesla Model S Fatality (via ABC News/Bobby Vankavelaar)

After a high-profile fatal crash involving a Tesla Model S operating in Autopilot mode occurred near Williston, Florida in May, the NTSB announced earlier this month that it would open an investigation into the incident.

Today the National Transportation Safety Board issued its preliminary report, finding that the Model S in question was travelling at 74 mph (in a 65 mph zone) at the time of the accident.  The report also identified the specific vehicles involved: a 53-foot semitrailer in combination with a 2014 Freightliner Cascadia truck tractor, and a 2015 Tesla Model S.

Model S (via ABC News/Bobby Vankavelaar)

More of the specific details can be found in the NTSB statement below, as today’s preliminary report is mostly intended to ‘start the ball rolling’, and does not come to any specific conclusions (or recommendations) by the Safety Board on the accident.

Those findings will come later, as the government agency goes over the on-scene data collected by five of its investigators in May, as well as the vehicle logs from Tesla and the tractor-trailer.

Here is the initial findings statement from the NTSB today:

NTSB Issues Preliminary Report for Williston, Florida, Highway Crash
—7/26/2016
The National Transportation Safety Board issued Tuesday its preliminary report for the investigation of a fatal May 7, 2016, highway crash on US Highway 27A, near Williston, Florida.

The preliminary report does not contain any analysis of data and does not state probable cause for the crash.

The preliminary report details the collision involving a 53-foot semitrailer in combination with a 2014 Freightliner Cascadia truck tractor and a 2015 Tesla Model S. The report states that according to system performance data downloaded from the car, the indicated vehicle speed was 74 mph just prior to impact, and the posted speed limit was 65 mph.

The car’s system performance data also revealed the driver was using the advanced driver assistance features Traffic-Aware Cruise Control and Autosteer lane keeping assistance. The car was also equipped with automatic emergency braking that is designed to automatically apply the brakes to reduce the severity of or assist in avoiding frontal collisions.

A team of five NTSB investigators traveled to Williston to conduct the on-scene phase of the investigation. The team used three-dimensional laser scanning technology to document the crash location, the damaged trailer and the damaged car. NTSB investigators continue to collect and analyze performance data from the car’s multiple electronic systems. This data along with other information collected during the on-scene phase of the investigation will be used to evaluate the crash events.

All aspects of the crash remain under investigation. While no timeline has been established, final reports are generally published 12 months after the release of a preliminary report.

Categories: Crashed EVs, Tesla

48 Comments on "NTSB Preliminary Report On Fatal Autopilot Crash: Model S Travelling At 74 MPH"

The idea that this is a ‘special’ case is just stupid; it’s only getting media ‘attention’ because it’s a Tesla.

This is simply a driver speeding and not paying attention, crashing into a tractor-trailer that was making an illegal/unsafe left turn across traffic.

Accident caused by the tractor trailer driver. If the Model S driver was paying attention, he could have avoided it. But he wasn’t, and he didn’t.

Whether or not the driver of the Model S was using an elaborate cruise control/AutoPilot really does not matter.

It matters if autosteer was the **reason** behind the driver not paying attention. It’s a human factors issue. And it does merit special attention from the NTSB as it will help frame the deployment of driver assist technologies such as AEB as well as autonomous features such as autosteer.

That’s not a valid reason. Auto steer is not full autonomy. It CLEARLY needs driver attention and assistance.

Agree the driver is ultimately responsible

I think what doggyworld is trying to say is that Autopilot may not fully take “human factors” into account. Such as placing too much trust in the system as familiarity grows. Or that its driver alertness tests and feedback requirements allow a driver to be less alert than would otherwise be the case.

Tesla understands human factors better than most, though if this is a guide, possibly not fully. That there is more to do has been suggested by the reasoning behind the beta label, which has been described as being more about reducing an owner’s level of complacency.

The 74 mph whilst Autopilot was engaged is problematic, whilst some margin (+5% or so) might be tolerated.

Tesla has done an over-the-air update to all Model S’s with autopilot to limit the speed to just 5 mph over the limit in conditions like this.

This may actually be too low, as multiple studies have proven over and over that vehicles traveling at or below the speed limit are a hazard on roads that are commonly driven by vehicles at 5-9+ MPH over the limit.

Allowing 9 MPH over in a 65 MPH+ divided highway at the DRIVER’S DISCRETION is not at all unreasonable or in itself unsafe.

If you believe that traffic should drive at the speed limit, that would be an unrealistic expectation anywhere in the world.

Nix said:

“Tesla has done an over-the-air update to all Model S’s with autopilot to limit the speed to just 5 mph over the limit in conditions like this.”

Unfortunately, not in conditions like this. The updated limit of no more than 5 MPH over the speed limit applies only to driving with AutoSteer enabled on non-divided roads. The accident in question occurred while the Model S was driving on a divided road, but one with cross-traffic intersections, and one such intersection is exactly where the crash happened. As I understand it, the tractor-trailer was entering from a side road.
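
To make the divided/non-divided distinction concrete, here is a minimal sketch of the cap described in this thread. The function, constants, and 90 mph ceiling are assumptions pieced together from these comments, not Tesla’s actual code:

```python
# Hypothetical sketch of the post-update AutoSteer speed cap as described
# in this thread; names and the 90 mph ceiling are assumptions, not
# Tesla's actual implementation.

AUTOSTEER_MAX_MPH = 90      # reported absolute ceiling
UNDIVIDED_MARGIN_MPH = 5    # post-update cap on non-divided roads

def max_autosteer_speed(posted_limit_mph: float, divided_road: bool) -> float:
    """Maximum speed AutoSteer will hold under the behavior described above."""
    if divided_road:
        # Divided roads (like US 27A here) get no limit-based cap,
        # which is why 74 mph in a 65 zone was permitted.
        return AUTOSTEER_MAX_MPH
    return min(posted_limit_mph + UNDIVIDED_MARGIN_MPH, AUTOSTEER_MAX_MPH)

print(max_autosteer_speed(65, divided_road=True))   # 90
print(max_autosteer_speed(65, divided_road=False))  # 70
```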

The 85th percentile is probably 69-70 on this road.

74? That speed will begin to associate you with the dangerous-driver regular-infringement cohort.

The difference in response time between those two speeds at the distance reported is less than a second.

The legal issue of responsibility is not the point. The point is how people tend to behave with a system that does most of the work for them. Ultimately it comes down to psychology.

By that same argument we should also reject anti-lock brakes and traction control. These sorts of driving aids may lure the driver into a false sense of security and thus lead him/her to adopt riskier driving.

Dumb, isn’t it?

The only valid reason to reject Autopilot is if it results in more accidents or fatalities on certain types of roads than without Autopilot.

However, I don’t think anyone is making that argument, at least not yet. The commenter is merely speculating about ways to make Autopilot better. If lying to users about whether they bought anti-lock brakes were even possible, and it was a net improvement in safety, it should be considered.

I just recently read a comment at InsideEVs where someone stated that the Mercedes C-Class, per miles traveled, was 20 times safer (not sure about the exact number) than the Tesla on Autopilot. I think it was 0.07 or 0.06 deaths per (4?) million miles. You can search for death rates per model in the NHTSA statistics yourself (or however that agency is named).

Tesla at the moment compares Autopilot miles against average-car death rates. Since the average car is far less safe to begin with, the comparison is unfair.

You can view statistics here: http://www.iihs.org/iihs/topics/driver-death-rates Note that these are driver-only statistics, and death rates have improved a lot in recent decades, especially for bigger luxury cars and SUVs. They aren’t comparable to the average 11-year-old clunker death rate that includes passengers, motorcyclists and pedestrians, which Musk likes to cite when he says somebody needs to “do the bloody math”. Obviously he didn’t do it himself.

You won’t find the Tesla Model S in the 2011 model year statistics. But you can convert the “registered vehicle years” that IIHS uses to miles with some approximation, and figure out the Model S’s total cumulative miles (2 billion miles was announced this year, but I don’t know if that is worldwide or US only): http://cleantechnica.com/2016/04/12/tesla-fleet-racking-miles-without-autopilot/ Then you can compare various models, and the Model S, with 5 or so driver fatalities in the US, isn’t even close to the top safest cars, or even average luxury cars or SUVs. The top ones have no fatalities at all. Even the regular 2011 BMW 328 has a several-times-better driver fatality rate, as it had more miles or registration years. The Model S is more like an average econobox from 2011, even though it is much heavier and has larger crumple zones.…
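
For what it’s worth, the conversion that comment describes is straightforward arithmetic. A hedged sketch, where the annual-mileage figure and example inputs are placeholder assumptions rather than verified statistics:

```python
# Sketch of the "registered vehicle years" -> per-mile conversion described
# above. All numbers are placeholder assumptions, not verified statistics.

AVG_MILES_PER_VEHICLE_YEAR = 12_000  # assumed US average annual mileage

def deaths_per_billion_miles(deaths: int, vehicle_years: float) -> float:
    miles = vehicle_years * AVG_MILES_PER_VEHICLE_YEAR
    return deaths / miles * 1e9

# Example with made-up inputs: 5 driver deaths over 150,000 vehicle-years
print(round(deaths_per_billion_miles(5, 150_000), 2))  # ~2.78
```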

Put a steel spike where your airbag should be. Let’s see how safely one drives then.

Human Risk compensation is a factor in all vehicle systems and road designs.

ABS is an interesting case: it seems too many drivers are not applying the brakes hard enough, so now we have brake assist and automatic emergency braking.

ABS and traction control stay in the background and only step in during extreme situations when loss of control is imminent. AEB and lane departure systems also stay in the background until needed.

Autosteer has control during **normal** driving. That’s a far cry from these other systems. As such it operates under different rules.

Carmakers are forced to design systems that are nearly foolproof. Foolproofing is why I can’t shift my minivan into gear unless I press the brake. Autosteer is far from foolproof. It’s more like fool-encouraging.

This article has some insights:
http://trueviralnews.com/?p=281752

Meh. So does cruise control. Yet nobody is trying to ban cruise control, even though there absolutely have been accidents where cruise control was on when the collision happened.

Your whole argument comes down to “it’s new, people are stupid, so let’s beat it with a stick”.

Instead of beating things with a stick because they are new, take the time to actually learn how to use it.

Because frankly, having cars at all is just a lazy person’s way of not walking everywhere they go, and cars are just taking people’s attention away from walking. So if you are going to make such absurd reductionist arguments, you are stuck safely walking everywhere so that you don’t kill anybody with your shoes.

Do you walk everywhere?

The difference, to me, is that cruise control maintains a fixed setting. On that note, so does an airplane’s autopilot (well, on the basic airplanes anyway that private pilots would normally use).

Conversely, Tesla’s autopilot is reacting to the environment. It is steering and braking and adjusting speed in response to items it identifies. That is a different level of automation not commensurate with simple cruise control.
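
In control terms the distinction is simple: classic cruise control holds a fixed setpoint, while a traffic-aware system feeds what its sensors see back into the speed command. A toy contrast, purely illustrative:

```python
# Toy contrast between a fixed-setpoint cruise control and an
# environment-reactive one like TACC. Purely illustrative.

from typing import Optional

def classic_cruise(set_speed_mph: float) -> float:
    # Holds whatever the driver set, regardless of surroundings.
    return set_speed_mph

def traffic_aware_cruise(set_speed_mph: float,
                         lead_car_speed_mph: Optional[float]) -> float:
    # Slows to match a detected lead vehicle; otherwise holds the setpoint.
    if lead_car_speed_mph is not None and lead_car_speed_mph < set_speed_mph:
        return lead_car_speed_mph
    return set_speed_mph

print(classic_cruise(70))              # 70, always
print(traffic_aware_cruise(70, 55))    # 55, reacting to traffic
print(traffic_aware_cruise(70, None))  # 70, road clear
```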

Stimpacker said: “By that same argument we should also reject anti-lock brakes and traction control. These sorts of driving aids may lure the driver into a false sense of security and thus lead him/her to adopt riskier driving.”

Not really the same, in my opinion. Anti-lock brakes actually help fight against the natural human reflex, when the driver sees an impending accident, to jam on the brake pedal hard and not let up, rather than rapidly pumping the brakes, which is what you should do if you don’t have anti-lock brakes.

Contrariwise, as many have pointed out, AutoSteer tends to lull all too many people, perhaps the average driver, into thinking they don’t need to pay attention to the road. I’m not saying that makes Tesla liable. But there seems to be a lot of denial here, including denial on the part of Tesla Motors, about the reality of what the average person is going to tend to do under those circumstances.

Tesla Motors has pointed out that in an airplane, an autopilot isn’t intended to let the pilot “set it and forget it”. The pilot is required to remain alert and, to coin a phrase, “keep watching the skies”.…

That would not have changed this accident.

It matters because the DVD player with Harry Potter playing was the reason, not the autopilot.

Why are you repeating a lie, MDEV?

At the time emergency services arrived, a DVD player was still playing the Harry Potter movie. Look at the reports from the witness in previous posts here.

Well said

Please read about the accident details; some people have made very detailed analyses. Visibility was around 1200 feet because of a hill; you can check it on maps online. At 74 mph that is 11 seconds. A loaded semi may take some 20 seconds to reach 15 mph, depending on tractor power. Most likely the truck driver didn’t have any chance to see the Tesla behind the hill when he started turning, until he was across the road. Yet 11 seconds is more than plenty of time to notice a truck across the road and reduce speed for anybody driving himself, however distracted he or she may be.

You may put blame on the Tesla driver as he is diseased and less likely to defend himself, but just read all the chorus of Tesla fanboys posting everywhere about how great Autopilot is and how it allows them to relax on a long road trip, not pay attention as it is almost autonomous driving, drive without hands, sleep or play Pokemon and so on. Musk himself reposted a hands-free driving video from the diseased person. Now you may claim that some legalese checkbox invalidates all this advertising nonsense about an adaptive cruise control being autonomous driving. No, it doesn’t work this way in…
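
The sight-line timing in that comment checks out. A quick sketch, using the commenter’s assumed 1,200-foot visibility distance (their estimate, not an NTSB figure):

```python
# Check of the sight-line timing above, using the commenter's assumed
# 1,200-foot visibility distance.

FEET_PER_MILE, SECONDS_PER_HOUR = 5280, 3600

def seconds_to_cover(distance_ft: float, speed_mph: float) -> float:
    feet_per_second = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR
    return distance_ft / feet_per_second

print(round(seconds_to_cover(1200, 74), 1))  # ~11.1 s at 74 mph
print(round(seconds_to_cover(1200, 65), 1))  # ~12.6 s at 65 mph
```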

zzzzzzzzzz said:

“…diseased …diseased…”

Really? I see no indication the driver was ill. To quote Inigo Montoya from “The Princess Bride”: “I do not think that word means what you think it means.”

I think the word you’re flailing around for, in your faintly amusing, overlong, and rather unsuccessful attempt to bash Tesla here, is “deceased”. Since you’re having trouble with that word, perhaps you should stick to words with fewer syllables, such as “dead”.

Your calculation says that it would have taken the Tesla 12 seconds at 65 mph to close that distance. Also too short.

Going 74 mph in a 65 mph zone is hardly unusual. But, autopilot shouldn’t allow that.

Since version 7.1, Tesla’s Autopilot allows unrestricted use (90 mph maximum speed) on limited-access divided highways, and limits Autopilot to 5 mph over the speed limit on all other roads.

Autopilot should have limited the speed to 5 mph over the speed limit, since this road did NOT meet the criteria set out by Tesla regarding unrestricted use of Autopilot. This road was NOT a limited-access divided highway on which Tesla recommends Autopilot be used. It was a semi-divided road with cross streets that have no traffic lights, residential and commercial driveways leading directly onto the roadway, and turning cross traffic from the opposite direction.

sven said:

“Tesla’s Autopilot allows unrestricted use (90 mph maximum speed) on limited-access divided highways, and limits Autopilot to 5 mph over the speed limit on all other roads.”

Twice wrong, sven.

1. The restriction is on AutoSteer, not Autopilot.

2. The restriction allows AutoSteer to be used without speed restriction* on all divided roads, including the one where this accident occurred. The “no more than 5 MPH over the speed limit” restriction applies only to non-divided roads.

One can certainly argue that the restriction should apply to all but limited-access freeways; with that limitation, perhaps this accident would have been avoided. Contrariwise, perhaps if Tesla limited unrestricted use to only limited-access freeways, other accidents might have happened which AutoSteer has prevented.

Unfortunately, we don’t have the data to know whether you’re safer to use AutoSteer under these conditions, or not to use it.

*You say there’s a limit of 90 MPH when using AutoSteer even on limited access freeways? Well, I’ve seen a lot of argument over that point on the Tesla Motors Club forum. Maybe, and maybe not.

Here are some pics of the Model S post-crash and the tractor-trailer post-crash, with the impact marks of the collision.

Remarkably, the headrests on the front seats look intact, suggesting that the driver might have survived the crash had he ducked.

You can’t duck if you’re not looking at the road, or actively engaged in driving.

Or if the driver suffered a medical condition and was incapacitated at the time of the collision.

It also looks like the driver, who was traveling in the right-hand lane at the time of impact, could have swerved around the back of the trailer to avoid it, since there was plenty of blacktop at that intersection between the north- and southbound lanes.

Satellite pic:
https://www.google.ca/maps/place/BP/@29.4107608,-82.5400338,108m/data=!3m1!1e3!4m5!3m4!1s0x88e89296270649b3:0x9a07ea5cf961aac3!8m2!3d29.41037!4d-82.540018

Hey Jay, just a note: “Travelling” should be “Traveling”

Actually, I guess it depends on your target audience…
http://writingexplained.org/travelling-or-traveling-difference

I learned something new today! 😉

Please learn to speak the Queen’s English correctly; travelling is correct.

“…according to system performance data downloaded from the car, the indicated vehicle speed was 74 mph just prior to impact…”

And this, gentle readers, is a perfect example of why we should not regard eyewitness reports as reliable, and why we should wait for more solid information, when available, before we make up our minds.

As I previously pointed out in comments on the same subject: The woman who reported a car whizzing past her may well have been correct to say that car was doing at least 85 MPH. But did she correctly identify the wrecked Model S as that speeding car? My guess is “No”.

Still, 9 MPH over the speed limit is not what I’d call safe driving, and may well be a significant contributing factor to the accident.

Do the math on what 9mph in a 65mph zone changes.

Very, very little.
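
Here is that math sketched out, under generic constant-deceleration assumptions; the 0.7 g braking figure is an illustrative dry-pavement value, not crash-specific data:

```python
# Illustrative arithmetic on 74 mph vs 65 mph. The 0.7 g deceleration is
# a generic dry-pavement assumption, not data from this crash.

MPH_TO_FPS = 5280 / 3600   # mph -> feet per second
G_FT_S2 = 32.174           # gravity, ft/s^2
DECEL_G = 0.7              # assumed braking deceleration

for mph in (65, 74):
    v = mph * MPH_TO_FPS
    stop_ft = v**2 / (2 * DECEL_G * G_FT_S2)  # ideal stopping distance
    print(f"{mph} mph: ~{stop_ft:.0f} ft to stop")  # ~202 ft vs ~262 ft

# Kinetic energy scales with v^2: (74/65)**2 ~= 1.30, i.e. ~30% more energy.
```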

This is an expected failure mode of radar cruise control: it cannot handle crossing traffic. It is definitely a driver-fault problem, but speeding had little to do with the accident. The driver was in conditions where autonomous driving does not yet work.

Perhaps better use of mapping software, the camera, and GPS can help prevent future accidents like this, but the driver is ultimately responsible. It may also take a better sensor package, with laser sensors such as Google uses in its self-driving experiment.

Interesting that the car was going 74, exactly 9 mph over the speed limit. The police in my area (Maryland DC suburbs) won’t pull you over unless you are 10+ mph over. Has a 9 mph threshold been programmed into Autopilot? One of my concerns about autonomous driving is that the cars would be programmed to go only the speed limit, which hardly anyone does. They’re going to be the slowest cars on the road.

AutoSteer (part of the AutoPilot suite) limits you to 5 mph over on non-divided highways. This was a divided highway, so the driver could go up to 89-90 mph.

In the ‘driver assistance’ tab in the car, you can set an offset of +/- X relative to the speed limit. For example, mine is set to +5, so when I enable Traffic-Aware (adaptive) Cruise Control it sets my speed to 70 mph on a 65 mph road (based on the last sign read).
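
A minimal sketch of the offset behavior that commenter describes (the setting and logic are as reported in the comment, not verified against Tesla’s software):

```python
# Hypothetical sketch of the driver-set TACC offset described above; the
# behavior is as the commenter reports, not verified.

def tacc_set_speed(last_read_limit_mph: float, driver_offset_mph: float) -> float:
    """TACC initializes to the last speed-limit sign read, plus the offset
    configured in the 'driver assistance' tab."""
    return last_read_limit_mph + driver_offset_mph

print(tacc_set_speed(65, +5))  # 70 mph, matching the commenter's example
```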

As long as they’re polite, I don’t care how fast they go. Google cars yield.

Wow. A whole 9 miles over the speed limit. Clearly, he was a reckless individual with wanton disregard for public safety, like 80% of drivers out there.

For the new folks coming to this article, please note:

*every*time* you/driver enable AutoSteer/Pilot, the car reminds you to keep your hands on the wheel. *every*time*. It is your free-will choice to ignore that, like your choice to use your phone when you are driving.

When you/driver enable AutoSteer/Pilot for the first time (during orientation), you/driver have to agree to keep your hands on the wheel and accept that you/driver are responsible. See this screen capture from my Tesla. The text is clear.

A wall-of-text disclaimer and a dead man’s switch aren’t enough. People skip walls of text and press yes without reading, just like they skip the wall-of-text end-user licence agreements when installing software on their computers and just press “next”.

The current hands-on-wheel detection scheme isn’t even a good dead man’s switch, as the driver could faint and fall onto the wheel and the car wouldn’t know the difference.

Just “keeping your hands on the wheel” (a dead man’s switch) does not mean the driver is paying attention; it just checks that the person is alive and remains at the wheel: you can easily leave one hand on the wheel and close your eyes.

Not only that, but the current implementation allows the driver a ridiculous amount of time without even holding the wheel before sending the first warning.

The current implementation of the driver-involvement check in Tesla’s cars is just useless.
Essentially, the only thing that currently ensures the driver is paying attention in a Tesla is the driver’s mistrust of the machine.
As soon as drivers get comfortable with it (as was the case with Joshua Brown), all bets are off.
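
To make that critique concrete, here is a toy sketch (generic, emphatically not Tesla’s implementation) of why a torque-only hands-on-wheel check is a weak proxy for attention:

```python
# Toy illustration of why a hands-on-wheel check is a weak attention proxy:
# it only measures wheel torque, so a resting hand (or an unconscious
# driver slumped on the wheel) passes. Not Tesla's actual implementation.

from dataclasses import dataclass

@dataclass
class DriverState:
    wheel_torque_nm: float  # torque sensed at the steering wheel
    eyes_on_road: bool      # what we would actually want to know

def hands_on_wheel_check(s: DriverState, threshold_nm: float = 0.5) -> bool:
    return s.wheel_torque_nm >= threshold_nm  # all the sensor can see

attentive = DriverState(wheel_torque_nm=1.0, eyes_on_road=True)
eyes_closed = DriverState(wheel_torque_nm=1.0, eyes_on_road=False)

print(hands_on_wheel_check(attentive))    # True
print(hands_on_wheel_check(eyes_closed))  # True -- same reading, no attention
```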

So you want it to be so aggressive and annoying that a driver would simply not use it. I see your point: if it is not used, then it cannot cause any problems.

You want it to track people’s eyes to make sure they are scanning the road every few seconds; you want it to detect the slightest release of pressure on the steering wheel, flash red on the heads-up display, and set off an alarm. Got it.

People using phones today are a much bigger concern, and there are ZERO warnings to check that they are paying attention, even with their hands on the wheel.

The driver is responsible to maintain control of the car. It is that simple.

GM’s upcoming Super Cruise sounds much more advanced than Tesla’s current Autopilot:

http://www.freep.com/story/money/cars/general-motors/2016/07/22/self-driving-super-cruise/87444308/

Eye and retina movement-detection software that can tell whether or not the driver is paying attention to the road (or if they are dozing off), plus activation of Super Cruise geo-fenced and limited to highways pre-mapped by LIDAR.

And on a related note, more douchebaggery by a Tesla owner playing Pokemon Go while Autopilot was activated (*facepalm*):
https://www.youtube.com/watch?v=h4rXYLR6OtQ

There are dead people because they were playing Pokemon while driving; unfortunately for them, they were driving other cars. Tesla is not idiot-proof, but if I have to share the road with these morons, I wish they drove a Tesla.