Tesla Autopilot Fatality: Report From The Scene, Witness Account, Official Police Report – Video

JUL 1 2016 BY JAY COLE


Yesterday brought news of a preliminary NHTSA (U.S. National Highway Traffic Safety Administration) investigation into a Model S traffic fatality that occurred while Autopilot was engaged; full details and Tesla's statement on the crash can be found here.

Scene Of May 7th Tesla Model S Fatality (via ABC Action News/Bobby Vankavelaar)

Now ABC News is reporting from the scene, detailing the accident as well as showing footage from the aftermath.

Update (July 2nd): Police diagram of the accident and official report details added below

The report also restates some information that came to light after the fact and is not found in the original police report, such as that the Model S driver may have been watching a movie at the time of the accident, plus some speed data.

It also twists some facts in the case the way only a local news crew can: "Right now, questions are being asked like: 'How did a Tesla on Autopilot slam into a tractor-trailer? Why didn't the built-in safety system stop it?…And do these cars threaten others on the road?'"

Pretty sensationalistic to say the least.  Note how ABC says questions are being asked…but not necessarily by them.

Update (Thursday, July 7th):  Florida Highway Patrol said Thursday that both a computer (laptop) and a DVD player were confirmed to be in the vehicle, but neither was found running after the crash.  Investigators on the scene could not determine whether the driver was operating either of the two at the time of the accident.

As we know, the Tesla did not set off the accident, nor was its operation at the time the cause…so we'd suggest just watching the report to better understand the scene and to hear the witness account – including a hearsay report of the Model S traveling over 85 mph just before the accident.

Tesla Crash Police Diagram – Via LA Times

Full crash report documents can also be found here.

LA Times, hat tip to sven!

Categories: Tesla


92 Comments on "Tesla Autopilot Fatality: Report From The Scene, Witness Account, Official Police Report – Video"


“the Model S driver may have been watching a movie at the time of the accident.”

Apparently the movie was Harry Potter. Stupid Muggle.

So if the car didn't have auto pilot, would this have happened? Probably not, as the driver wouldn't have been watching a movie.

I can see the slogan now. Teslas don't kill people, people kill people.

This is definitely a sad event. I genuinely question whether people are smart enough to use "auto pilot" as it is meant to be used, and not as some have used it. Since it appears they aren't, perhaps it should be shelved until it can actually be safely autonomous, which I bet is coming quickly.

We have video of idiots falling asleep at the wheel, hopping into the back seat, etc. No person would do that with a car that doesn't have "auto pilot".

Stupid people using things stupidly shouldn't be an excuse to restrict access for the many more people who use them responsibly, none of whom are being forced to use it.

99.9% of Teslas on autopilot that day hurt no one, and it can be demonstrated that drivers who are aided by an AutoPilot-equipped Tesla are at the very least safer than the national average across all cars.

Yes, it should, because it affects other people. You are not alone on the road.

Yep, this is key.

While the principle is very good, Tesla boasts better-than-average safety with auto pilot on.

The question, then, is why doesn't everybody have it 😉

Then both cars would be able to acknowledge each other's presence and routes, and react without the drivers' conscious effort.

We should also take all guns away, as they affect other people.

We should also take all vehicles off the road, as they affect others.

Agreed: ban the technology after assault weapons get banned.

No you shouldn't, because it's the driver's responsibility at all times to be in control. Taking autopilot away from responsible drivers because irresponsible ones use it? That's the kind of broad conclusion only Americans jump to.

this accident shows some of the unintended dangers of the auto pilot feature. the driver of the tesla apparently got "jayne mansfield"-ed, so to speak, but the car kept on driving. somehow it got off the road and onto someone's property. then the "lane" that the auto pilot followed apparently became the space between trees, until it crashed head-on into a power pole. that suggests that the auto pilot didn't recognize the pole and slow the car down to a stop. this accident raises all kinds of questions about how well this system actually works, and about how good an idea it is to perform unregulated "beta" testing on public roads.

i guarantee that the tesla legal department is working overtime right about now because they are looking at some potentially significant legal liability.

The car got its top ripped off by driving at high speed under the tractor trailer, which killed the driver and pretty certainly disabled auto pilot. After that, it was inertia that had it rolling off the road and hitting other things, not auto pilot.

obviously someone is posting comments here without having watched the video report that accompanied this article.

just out of curiosity, if disabling auto pilot requires action by the driver, but the driver is disabled, how is auto pilot disengaged?

DJ said: "I genuinely question whether people are smart enough to use 'auto pilot' as it is meant to be used, and not as some have used it. Since it appears they aren't, perhaps it should be shelved until it can actually be safely autonomous, which I bet is coming quickly."

I don't merely question whether some people are smart enough to use motor vehicles properly. Some of them have shown by their actions that they aren't. Does that mean we should ban motor vehicles? No, it means we should demand that people use them responsibly.

People texting on cell phones while driving doesn't mean we should ban cell phones. It only means we should demand that people operating motor vehicles drive them responsibly… which means not using the effing cell phone while driving! Duh.

Similarly, people taking their attention off the road when using Tesla AutoSteer doesn't mean we should ban AutoSteer. It means we should demand that people use it responsibly.

And personally, I would definitely not "bet" that fully autonomous Tesla cars are coming "quickly". That's at least a few years off, and probably at least a decade. Speaking as a computer programmer, developing a… Read more »

What can you say about the thousands of people killed because of driving and texting? Should we ban cellphones? The world has irresponsible people and others not so smart, but fear of advancing technology cannot stop it. We humans tried that before; it was called the Dark Ages, when the not-so-smart people ruled the world for centuries. Does anybody know how many potential accidents autopilot has prevented?

we shouldn’t ban cell phones, or guns, or cars with auto driving features… I know exactly what we should do, we should ban idiots! 🙂

I understand that this car was traveling at a very high rate of speed while the driver watched a video…nothing is foolproof. I would never put full trust in a new technology, especially when lives are at stake. Tesla clearly warns of this; there will be glitches. The driver is always held responsible. The driver should be vigilant and ready to take control at all times…I guess we people put too much trust in new technology and become too complacent. The truck driver didn't do anyone any favors by doing what he did either. I believe the truck driver will be found at fault, and someone has paid for that with their life…

Your “understanding” of DVD use and speed is built on speculation and hearsay. Although we may never know whether the guy was watching a movie, we will know the exact speed. Wait before making judgments.

i thought the same thing that you did. how many idiotic tesla “prank” videos do you see being posted on insideevs? and there are probably even more videos that are at least as idiotic that don’t get picked up on insideevs.

that is not to say that every tesla driver is stupid; but some really stupid people seem to be attracted to buying tesla cars and then stupid people do stupid things.

these "self driving" features are being released without regulation. the lack of regulation poses a hazard to the general public. regulations aren't only to prevent stupid people from doing stupid things; they are also to prevent other people from getting harmed when those stupid people do stupid things. in this case, fortunately, the tesla hit a large truck, and the truck driver wasn't physically harmed, although i am sure that his truck sustained damage.

How odd that you seem to be suggesting that the truck driver was the potential victim here.

Reportedly at least, the truck driver was charged with failing to yield right of way as a result of this accident. Assuming that’s true, then the truck making an illegal left turn across the path of the Model S was the actual cause of the accident. The fact that Tesla AutoPilot failed to prevent the accident in no way alters the fact that the truck driver is legally responsible for the accident.

I find it truly bizarre that people are jumping to the conclusion that if a car has any self-driving features, then it should be able to prevent all accidents. That’s like suggesting that as soon as Ford started making the Model T, that every car on the road should have been expected to be able to drive at 75+ MPH, all day long for weeks and weeks, without even once breaking down.

No, we have a long way to go before we have fully capable self-driving cars which can be expected to handle every condition encountered on the road. It’s still very early days in developing that technology.

ok, here's how it works…when you are operating a motor vehicle, YOU are the one who is responsible for the operation of the vehicle. what we have here is a guy, driving at a speed that was reportedly in excess of 85 mph, and the guy is watching a video. i don't know what you call that, but i call it STUPID.

this is why self driving features have to be regulated. you can't have just anybody acting as a beta tester for this kind of stuff. some of these tesla drivers have more dollars than they have sense. youtube videos uploaded by tesla drivers clearly underscore the need to restrict *who* is using this feature at the present time: videos of guys "falling asleep" while "driving"; videos of guys with phony seat covers to make it look like there is nobody in the car.

some of you people posting here think that these pranks are “cool”. not trying to be too “adult” here, but personally, i think that these are “jackass” stunts.

There is a witness saying the Tesla passed her (him?) at high speed before the accident.
Something for the investigators to look at.

Again, the speed at the time of the collision will be confirmed by Tesla’s data. There is no need to speculate on what another driver claims to have seen sometime prior to the accident.

So the crashed Model S was doing in excess of 85 mph? (Passed another car that was doing 85) …..

That would explain some of it.

Speculation as to the speed of the Tesla is unnecessary since Tesla has provided that information to the investigators. Until that data is released, there is little use in speculating as to the speed.

From the video: Reporter: "Vankavelaar says witnesses stopped to talk to investigators when they saw the wrecked car in his yard." Vankavelaar: "She was passed and she said she was doing 85. And when this car just passed her she was like wow, you know, I wonder how fast that car is going."

Apparently, the Model S was going a good clip faster than 85 mph before the crash. The Tesla data logs will show the exact speed. If the Tesla was going in excess of 90 mph, then I can see how the semi truck driver didn't see the Tesla coming or misjudged whether he could make it across the intersection in time.

The wide grass median makes this a difficult turn for a semi truck because of the sightlines from the cab. Since the semi truck driver is perpendicular to oncoming traffic when checking for it, he had to look through his passenger door window about 8 feet away, with sideview mirrors partially blocking his view, and through the right-hand corner of the front windshield, with the A-pillar partially blocking his view. https://goo.gl/maps/Mqgai6sVMxL2

There also should have been a blinking… Read more »

Agreed, sven. Personally, I will not buy another vehicle that does not have autonomous drive hardware. However, one should not forget that Tesla Motors is at best running an advanced Level 2. Full autonomous drive does not show up until Level 4, as outlined here by NHTSA: http://www.nhtsa.gov/About+NHTSA/Press+Releases/U.S.+Department+of+Transportation+Releases+Policy+on+Automated+Vehicle+Development

I applaud all who are advancing autonomous drive by racking up miles at Level 2. Choosing to drive 90 mph and watch a movie on Level 2 autonomous drive is beyond careless.

Still a huge fan of this technology; hopefully this is a wake-up call to those abusing Level 2.

If the car was traveling at an abnormally high speed, that would definitely be a contributing factor in the accident, as the truck driver may have expected the car to be traveling at a normal speed, giving him time to cross the road.

And should the auto-pilot even allow the car to be self-driving at *illegal* speeds? AFAIK, 85+ mph is not normally a legal speed most places in the US. Kinda seems like a legit question.

Then again, if the driver chose that speed, then that was his responsibility. *He* ignored the speed limit and was acting dangerously, not the car.

It appears there is limited visibility of oncoming traffic at that intersection. There is a small hill crest not far from the intersection where the semi was making the left turn. The Google Maps camera is elevated like the view from the semi truck’s cab, and from that raised vantage point it appears you can’t really see approaching cars on the other side of the hill crest.

https://www.google.ca/maps/@29.4107983,-82.539466,3a,75y,259.33h,67.02t/data=!3m6!1e1!3m4!1sEtzrzjyU6DZuMK2aIY1EFQ!2e0!7i13312!8i6656

If the Tesla was traveling much faster than the truck driver could anticipate, then it caused the accident, not just contributed to it.
It is nearly impossible to estimate the speed of a car that comes at you head-on. The truck driver had the right to assume a normal (or nearly normal) speed for the conditions (this is not the same speed as might be allowed by the traffic code; it could be faster or slower).
Tesla should have designed the system for the stupid user, like some other car makers. It is Tesla's obligation to do that, but who would tell that to the billionaire? I would, but I don't work for him.

that’s fine, you’re a gadgeteer. being fascinated with the latest gadget is ok when you are in a laboratory. but this self driving beta test experimentation is being done in the public and the rest of us are being put at risk by the fascination of a few enthusiasts with the latest toys.

i DO NOT applaud the unregulated beta testing of the self driving feature. in this instance, the driver got killed, but the self driving feature didn’t know it, so the car kept on driving.

at this point in time, the use of self driving MUST be regulated. it can’t be made available to *anybody* who plunks down a few dollars because there are a lot of tesla drivers who apparently have more money than sense.

Tesla Autopilot disengages if you are going faster than 88 MPH, so it either wasn’t on Autopilot, or it wasn’t going faster than 88 MPH.

In Sweden Tesla Autopilot can drive at 150km/h (about 93 mph).

Phil Trubey said:

“Tesla Autopilot disengages if you are going faster than 88 MPH…”

Are you speaking from experience? If not, perhaps you have misunderstood what you’ve read.

I see comments on the Tesla Motors Club forum that the ACC (Active Cruise Control) blind spot warning may be turned off at 88 MPH, but nothing indicating either AutoPilot or AutoSteer has any maximum speed on a divided highway; plus, this accident occurred on a divided highway.

Tesla has recently upgraded AutoPilot to limit speed to no more than 5 MPH above the posted speed limit on undivided roads… which again doesn’t apply in this case. Even if it did, we would need to know if the Tesla owner in question had upgraded his car’s software to the latest release. The car owner can choose to upgrade, or not.
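[Ed. note: That reported cap is simple enough to express in code. Here is a minimal hypothetical sketch, purely illustrative and not Tesla's actual implementation; the function name and parameters are invented.]

```python
from typing import Optional

# Hypothetical sketch of the speed-cap policy described above (invented
# code, not Tesla's implementation): on undivided roads, the reported
# behavior caps AutoSteer at the posted limit plus 5 mph; no such cap
# was reported for divided highways at the time.
def autosteer_speed_cap(posted_limit_mph: float,
                        divided_highway: bool) -> Optional[float]:
    if divided_highway:
        return None  # no cap reported for divided highways
    return posted_limit_mph + 5.0

# Example: a 55 mph undivided road would give a 60 mph cap.
assert autosteer_speed_cap(55, divided_highway=False) == 60.0
assert autosteer_speed_cap(65, divided_highway=True) is None
```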

On my X, at least, the setting won’t go above 90 mph.

I’m not interested in the imagined capabilities of an imaginary Tesla car owned by a perpetual Tesla basher.

88? Really? This was not the place to have a movie reference.

I think the autopilot should not operate over 65 mph.

Are you also advocating that the cruise control in all vehicles should be limited to 65 mph or is just special treatment for Tesla?

Dude – There is a big difference between cruise control and Auto pilot. Spec stated autopilot should be governed, not cruise control.

Well of course we don’t know just how fast the Model S was going at the time of the accident, and eyewitness reports are notoriously unreliable.

However, the speed at the time would certainly be a factor in Autopilot’s ability to detect a vehicle on an intersecting path, and react in time. The faster the car is going, the shorter will be the time that the car can react. Software runs at lightning speed, but mechanical systems such as servo motors and automobile brakes… do not.
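[Ed. note: To put rough numbers on that reaction-time point, here is a back-of-the-envelope sketch; the 1.5-second reaction time and 0.8 g braking deceleration are generic textbook-style assumptions, not figures from this crash.]

```python
# Back-of-the-envelope stopping-distance estimate.
# Assumptions (illustrative only, not from the crash report):
#   reaction time ~1.5 s, braking deceleration ~0.8 g.
REACTION_TIME_S = 1.5
DECEL_MS2 = 0.8 * 9.81  # ~7.85 m/s^2

def stopping_distance_m(speed_mph: float) -> float:
    v = speed_mph * 0.44704            # mph -> m/s
    reaction = v * REACTION_TIME_S     # distance covered before braking starts
    braking = v**2 / (2 * DECEL_MS2)   # from v^2 = 2*a*d
    return reaction + braking

for mph in (65, 85, 90):
    print(f"{mph} mph: ~{stopping_distance_m(mph):.0f} m to stop")
# Prints roughly: 65 mph ~97 m, 85 mph ~149 m, 90 mph ~163 m.
```

Under those assumptions, a car at 85 mph covers about 57 m during the reaction window alone, before the brakes do anything at all.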

Pmi-Pyu said:
“eyewitness reports are notoriously unreliable”

Meh. In this situation, the eyewitness report seems more reliable than eyewitness reports in general, since it was so easy for her to verify how fast she was going by looking down at the car's speedometer. The first thing I do when a car blows by me on the highway at an excessive speed is look down at my speedometer to see how fast I'm going, then estimate how much faster the speeding car was going. However, the most reliable evidence will be the Tesla data logs, which will tell investigators the speed the Model S was traveling at impact and prior to the accident.

sven said:

“In this situation, the eyewitness report seems more reliable than eyewitness reports in general, since it was so easy for her to verify how fast she was going by looking down at the car’s speedometer.”

Certainly.

What is far less likely to be so reliable is the eyewitness’s ability to identify just which car it was that passed her at such a high speed. Most of us regularly looking at InsideEVs photos may be able to recognize a Tesla Model S instantly and reliably. But the average driver on the road? Probably not so much.

BTW — sven, I appreciate your contributions to this story, and your thoughtful comments in this discussion. It’s a nice throwback to the “old” sven who used to post here.

Many lorry drivers assume you’ll give way to them because THEY are bigger. Occasionally it backfires…

What is the speed limit for this road?

In January, Tesla updated the autopilot driving systems in Model S sedans to put new limits on hands-free operation, which has been both praised for its innovation and criticised for having been launched too early.

The function will now be restricted on residential roads or roads without a centre divider, meaning the car cannot drive faster than the speed limit maximum plus 8 kilometres per hour.

A 65 mph speed limit is way too high for a section with cross traffic. The state is partially responsible for the tragedy.

Every highway that’s not a fully limited-access highway is going to occasionally have cross traffic. Are you really suggesting that only fully limited-access highways should be allowed to have a speed limit of 65 or faster?

If so… then you must never have lived in any rural area, or any area where it takes many hours to drive between major cities.

Maximum +10, but that's only on restricted roads… Most roads don't have a limit (other than the 150 km/h / 93 mph).

I blame Voldemort.

Accidents are going to happen regardless of the advancement of the technology, until everyone has similar technologies in their cars to avoid them. The deceased in this incident paid the price for his personal overconfidence. Luckily he didn't take anybody else with him.

The details don't matter. The mainstream media is very anti-Tesla (maybe coz they don't advertise). Like ABC, the Wall Street Journal is slamming Tesla really badly, saying crap like it's not regulated. Also slamming the VW Dieselgate payout, calling it "$2 billion to Tesla".

There’ll be no end to this, while countless others die everyday in self-induced driver accidents.

> “The mainstream media is very anti-Tesla…”

Tell that to this stooge: http://www.foxbusiness.com/features/2016/06/30/why-media-loves-tesla-and-hates-apple.html

Meanwhile, elsewhere on Fox.

Did you read the link you sent? “masters of hyperbole like Elon Musk”

The mainstream media IS very anti-Tesla. NY Times, LA Times, FOX, ABC and the king of them all, WSJ.

WSJ also denies climate change – “cannot separate signal from noise”.

+1000
Oil corporate influence is omnipresent in the mainstream media.

What do you mean? Does ANY sane person even QUESTION whether Elon is a master of hyperbole?

I am a HUGE fan of Musk, but even I can see that he is indeed a master of, and a willing user of, hyperbole.

Mainstream media is crap in a number of ways. Being too skeptical, towards Tesla or anyone else, isn’t one of the problems.

Hyperbole is only hyperbole if, in the end, it turns out to be far from the truth. The truth, however, is that even when Musk seems at the time to be saying things that are impossible or hard to believe, they come true in the end. "A start-up is going to build the world's best car? What hyperbole!" But it happened.

I've noticed that one of the tactics professional disinformers use to gain credibility is to profess their faith in the very thing they are demolishing.

Terawatt said:

“I am a HUGE fan of Musk…”

Your posts seem to indicate otherwise. Indicate otherwise quite strongly.

Just my 3 thoughts regarding this accident:

1. This wouldn't have happened in Europe, where it's illegal for large trucks not to have protective bars on the side and back of the trailer, so a car can never get underneath. Not sure why this doesn't work the same in the US? Truck lobby? I wonder how many people die like this each year.

2. I don't like how Elon works with statistics. He's always talking averages for the whole population, but that includes all the idiots driving recklessly. What I would like to know is how many deadly accidents are caused by 35-45 year old college-educated soccer moms driving a car that's up to 5 years old. That's the statistic that should be mentioned when making comparisons with autopilot. Plus, Elon doesn't mention all the instances when autopilot had to be switched off by the driver to avoid an accident.

3. It's so sad to see this guy die. He should have taken the hint: although autopilot saved him from that truck a few weeks ago, similar unexpected things do happen on the road, and autopilot is simply not ready for all of them yet.
Boris said: "I don't like how Elon works with statistics. He's always talking averages for the whole population, but that includes all the idiots driving recklessly."

It's not only fair to include "idiots driving recklessly", it would be highly improper (and logically fallacious) to exclude that very important segment of the driving population. Actuarial tables, mortality rates, and monthly insurance rates all must take into account the small segment of the population which drives recklessly on a regular basis.

Now, where Elon should be criticized is for comparing the general accident rate under all conditions to the accident rate when using AutoSteer, since AutoSteer can't be engaged under more difficult driving conditions. AutoSteer was meant to be used only on divided roadways, where the accident rate is lower. However, it's very clear from many videos and comments posted to Internet forums (such as the Tesla Motors Club) that many Model S and X drivers are using AutoSteer on non-divided roads, and in fact many are foolishly using it everywhere it will allow itself to be engaged. So the question of just which accident rates should properly be included, and which excluded, would be a thorny problem for a statistician. "What… Read more »
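[Ed. note: To make that statistician's problem concrete, here is a toy calculation with invented numbers; it shows how a system used only on safer divided highways can "beat the national average" while still underperforming humans on the same class of road.]

```python
# Invented numbers, purely to illustrate the confounding problem.
# Suppose human drivers crash at these rates (per million miles):
human_divided = 0.5      # divided highways are inherently safer
human_undivided = 2.0    # undivided roads are riskier

# If human-driven miles split 50/50 across road types, the blended
# "national average" rate is:
human_average = 0.5 * human_divided + 0.5 * human_undivided  # 1.25

# A hypothetical AutoSteer rate, measured only on divided highways:
autosteer_divided = 0.9

print(autosteer_divided < human_average)   # True: "beats the average"
print(autosteer_divided < human_divided)   # False: worse on like-for-like roads
```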

Even if the truck had those protective bars, would an impact at 85+ be survivable?

In a Model S you very well might survive. The recent spectacular crash in Germany is just one example. The only other fatal Model S accident of which I am aware was in Vancouver this winter when a Tesla driver hit a dump truck head on.

That road is dangerous. I could see this accident happening with or without AP. Imagine another fully loaded semi instead: would it be able to emergency brake and avoid the collision?

I find it pretty sensational that InsideEVs doesn’t think it pertinent to ask why the automatic emergency braking didn’t kick in. This system is supposed to help avoid accidents whether autopilot is engaged or not.

The fact that the car drove on after crashing is quite disconcerting. While the sensory stuff required to “see” obstacles that one might crash with is very tricky, I would have expected the car to be able to detect the fact that it crashed when it did so!

Of course autopilot needs only to be better than humans in order to be a rational choice. But this does illustrate the kinds of problems that come with autonomous vehicles. If an accident like this happens with a self-driving car (and not some “beta” thing where the driver is still in charge, legally if nothing else), is someone liable? If so, what is the appropriate punishment/damages?

Terawatt said:

“Of course autopilot needs only to be better than humans in order to be a rational choice.”

I find it rather strange that you say this, since the rest of your post completely ignores that fundamental truth.

“I find it pretty sensational that InsideEVs doesn’t think it pertinent to ask why the automatic emergency braking didn’t kick in.”

In general when reading anything, including news stories, you are expected to engage your brain and read critically. You are not expected to act like a sponge, absorbing everything without thinking. Kinda like the fact that you should remain alert and engaged in driving even when using Tesla AutoSteer.

Of course I can’t speak for the InsideEVs staff, but perhaps they think that anyone reading articles on the subject should be asking that question without needing to have it spoon-fed to them.

I’m not quite 100% buying Elon’s explanation that the Tesla’s Autopilot emergency braking system didn’t engage because the high ride height of the trailer made the Autopilot think the trailer with empty space underneath it was an overhead road sign.

Before impact but within scanning distance, the Model S's camera and radar should have detected the lower-riding semi tractor as it crossed the lane in which the Model S was driving, and warned the driver and/or applied automatic emergency braking. The rear tires/wheels on the double rear axle crossed the Model S's path right before impact, yet the Model S's camera and radar didn't detect them at all?

Trailers often have spare tires hanging below the middle of the trailer, unless the spares were already used and placed on the truck or trailer. The crash investigation should determine if the trailer had spare tires hanging below the trailer and how close to the point of impact. If it had spare tires at or very close to the point of impact, it would undercut the explanation of why the Tesla Autopilot didn’t detect the trailer.

Too fast, too much sunlight for an old Model without lidar, too dumb ass muggle driver, distracted and careless truck driver. What more do you need?

+1 – Yes, moving that fast over the limit, and relying solely on AP, was a fatal mistake by the driver.

I am surprised that emergency braking didn't kick in at the very end (not that it would have mattered at that point, given the speed); I believe that is triggered by the "close in" sonar sensors. But the longer-distance primary camera apparently either didn't see the white truck edges (contrast and edge detection) or decided it was an overhead sign, based on its height off the road.

The EU version of "safety bars" under the truck would absolutely NOT stop a car as heavy as a Model S moving that fast (lots of momentum). But they may have been enough for the primary camera to see the trailer as an obstacle across the road…

Only Tesla likely has those details (and the NHTSA will have that data shortly).

Not to disagree with you here, sven, but I think there’s a much more immediate, more relevant question:

Why didn’t the Model S’s software detect that the trailer on that tractor-trailer rig was a moving object? Tesla’s response about contrast between a trailer painted white with a “brightly lit sky” suggests to me that the object recognition software had trouble recognizing the trailer as an object separate from the sky. And that’s why all autonomous cars should be using scanning lidar to “see” other vehicles on the road, rather than trying to rely on stereo camera images!

cameras don't work as well as the human eye, especially in high contrast situations. if you've ever tried to take a photograph in high contrast conditions and seen the resulting photograph, that might have been the kind of image detected by the auto pilot sensors.
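[Ed. note: As a toy illustration of that contrast problem (all luminance values are invented, and real perception stacks are far more sophisticated than a single gradient threshold): a simple edge detector can miss a white trailer against a bright sky while easily finding a dark one.]

```python
import numpy as np

# Hypothetical 1-D luminance profiles (0-255) across the boundary where
# sky meets trailer. Values are invented purely for illustration.
sky_to_white_trailer = np.array([230, 229, 228, 215, 212, 210], dtype=float)
sky_to_dark_trailer  = np.array([230, 229, 228, 110,  70,  60], dtype=float)

def max_edge_strength(profile: np.ndarray) -> float:
    # Crude edge detector: the largest brightness jump between neighbors.
    return float(np.abs(np.diff(profile)).max())

EDGE_THRESHOLD = 30  # arbitrary: jumps below this are treated as noise
for name, profile in [("white trailer vs bright sky", sky_to_white_trailer),
                      ("dark trailer vs bright sky", sky_to_dark_trailer)]:
    strength = max_edge_strength(profile)
    verdict = "edge found" if strength > EDGE_THRESHOLD else "edge missed"
    print(f"{name}: max jump {strength:.0f} -> {verdict}")
# white trailer: max jump 13  -> edge missed
# dark trailer:  max jump 118 -> edge found
```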

The Model S is heavy. If the link to the sensors was severed, at this speed, it just went on its momentum. Heck! The whole top of the car was missing! Only a miracle would have kept the autopilot systems on!

the auto pilot control is not in the windshield.

The statement describes the accident as having been “caused by a semi tractor-trailer which crossed a divided highway and caused the fatal collision with Josh’s Tesla.”
No citations have been issued, but the initial accident report from the FHP indicates the truck driver “failed to yield right-of-way.”

http://uk.reuters.com/article/idUKKCN0ZH4X3

That, and like I said earlier, the dangerous nature of a highway with a 65 mph limit and crossings like that. Sadly, having right of way is not a guarantee of survival, especially if you're not paying attention to the road.

When I took Drivers’ Education, some decades ago in Kansas public schools, this was called being “dead right”.

Yep, known as something like that around here too.

if the truck driver made a left turn in front of the tesla driver, it seems odd that the accident report would merely state a failure to yield right of way. the report also indicated that the tesla driver had made no action contributing to the accident. it's apparently a preliminary report, though.

my understanding is that the tesla driver may have come over a hill while the truck driver was in the turn. that, combined with uncertainty about how fast the tesla was driving, might have led to the "failure to yield right of way" statement on the accident report.

You can limit highway speeds to 50mph. Would that upset big oil? Upset the trucking lobby? The suburbanites subsidized by cheap oil to commute? Oh well, too bad.

Well, this may be the nugget of truth from the urban legend about the RV on cruise control…

http://www.snopes.com/autos/techno/cruise.asp

Nope. That urban myth far predates any car having anything similar to AutoSteer. That urban myth is a typical expression of people’s fear of advancing technology.

A few weeks before this fatality Volvo criticized Tesla for offering Level 2 autonomy but marketing it as Level 3 autonomy. The difference is that Level 3 autonomy takes care of safety critical situations. IOW Tesla has semi-autonomous technology which it presents as being autonomous.

The government needs to more clearly define the different levels. If the car is like the Tesla vehicles and can’t handle safety critical situations, then the driver should not be able to hand off the driving functions to the car. Because once you allow the driver to take themselves out of the driving loop, say by allowing them to take their hands off the steering wheel and do things like check email, there is no possibility the driver will be able to handle a crisis.

More stringent requirements about distinguishing when and what the car can do, and limitations regarding the release of half-baked software, would also be desirable.

Sad it took at least one death to prompt an investigation.

+1 – Totally agree. I've read somewhere that Elon considers the Model S to be an L2.5 sort of system, and with incremental software improvements it's moving ever closer to an L3 system.

I find the apparent disagreements between MobileEye and Tesla much more interesting: whose software is "rated" for what situation? And given the newer V8 of the Tesla software that's starting to roll out, they have added "highway ramp exit" features. So this really is a pretty fast-moving space in terms of what the AP can and cannot do. The difference (good or bad is left as an exercise for the reader) is that Tesla seems willing to put "known beta" software out in the wild…

https://techcrunch.com/2016/07/01/mobileye-tesla/

This is simply two drivers violating basic traffic laws:

1. The driver not paying attention to the road, watching a movie while he should have been driving the car. Whether he was using an elaborate cruise control (AutoPilot) or not, the driver is still fully responsible…ALWAYS!

2. The truck driver making an illegal left turn. Left turns across opposing traffic with no traffic light or stop sign are ONLY legal when safe to do so. The trucker turned in front of opposing traffic when it was not safe, expecting the car to stop for him…and it didn't.

Even though the driver of the car was not paying attention to the road, the truck driver actually caused the accident by illegally blocking opposing traffic. Illegally watching a movie while driving only "delayed" the driver's ability to respond to the illegal action of the truck. Blaming Tesla is like blaming the county for not having a stop sign or turn signal light there, or blaming the sun for shining on the side of the truck.

There are more accidents with drivers with zero technology in the car. According to IIHS, in 2011 there were over 2,000 accidents where cars ran under tractor trailers to some… Read more »

^^ Best post on the issue of fault.

In states that have Contributory Negligence or Comparative Negligence, the Tesla driver would share negligence. But the primary negligence would still be with the driver of the truck.

When crossing over a lane of traffic, you must yield even to speeding vehicles.

And while the Tesla driver failed to stop in time to avoid the accident because he apparently wasn’t paying attention, it is still the duty of the truck driver NOT to violate traffic laws, putting other drivers into a situation where hard braking is the only chance to avoid the accident.

Exactly my impression; it looks like an unfortunate combination of factors, though it would be interesting to read the final report.

Is it possible the Tesla's "beam" probed the space in front of it that was underneath the trailer, and therefore didn't "see" the truck at all?
There was a story on here recently of a Tesla that drove (parked?) into the overhanging load on a truck's trailer because it "looked" underneath at the back of the trailer and didn't "see" the overhanging load.

Based upon this information, it sounds like the system thought it was a vehicle stopped in the fast lane, and tried to go around it in the slow lane. The driver of the truck reported that the Tesla made a lane change. If the driver wasn't paying attention, then it must have been the Tesla that changed lanes. Coming over the nearby hill, the system may have misread the collision it needed to avoid: instead of reacting to a vehicle crossing the road, it misidentified it as a stopped vehicle in the lane, and changed lanes to avoid the accident. The last question seems to be what the Tesla's speed really was. "Baressi, an independent owner-operator, was hauling a half-load of blueberries when the 18-wheeler he was driving made a left turn, attempting to cross the eastbound lanes of U.S. Highway 27 Alternate near Williston, Florida. Baressi told Reuters on Friday that he had waited to allow another car to go by, then was making the turn when he first saw the Tesla. 'I saw him just cresting the hill so I gave it the gas,' said Baressi, who said the Tesla was… Read more »

Normally, Tesla AutoSteer won’t cause the car to change lanes. Tesla does have what it calls “automated lane change”, but the driver has to initiate that by pressing a button on one of the steering wheel column stalks.

Will AutoSteer self-initiate a lane change in an attempt to avoid an accident? That’s not clear to me.

From the beginning it was obvious Tesla is misleading customers with the name "autopilot". I personally think the system is better described as "lane assist"; yes, it is the best lane assist system on the market, but it is by no means an autopilot. I am a Tesla owner and an experienced electrical and computer engineer who has designed autopilot systems for aircraft, in addition to having experience with automobile technology. I understand why Tesla wants to call it "autopilot", but they should recognize they still have steps to complete before they can reach autopilot status, and should take a little more responsibility in properly describing those steps, in addition to more appropriately naming the technology to prevent misleading those who are not technically savvy (which is most of the population).

The accident was caused by the irresponsible driver traveling at high velocity in a Tesla car.
Contributing to it was Tesla's incompletely designed and deceptively marketed "autopilot" system.
There might be more contributing factors, including that the driver was already "absent" (i.e. dead due to other causes) before the car struck the trailer.