Tesla Model S Fatality In AutoPilot Mode Opens NHTSA Investigation (Updates)

JUN 30 2016 BY JAY COLE

Tesla Model S Goes Cross Country On Autopilot – Image Credit: Alex Roy

A fatal accident involving a Tesla Model S while in Autopilot mode occurred on May 7th in Williston, Florida.

Earlier Tesla Model S Meets Trailer Incident During Summon – (via KSL)

The circumstances of the crash have caused the NHTSA – the U.S. National Highway Traffic Safety Administration – to announce today that it was opening a preliminary investigation into the ~25,000 Tesla Model S sedans equipped with the function, stating that the incident “calls for an examination of the design and performance of any driving aids in use at the time of the crash.”

The investigation could lead to a recall, or to the withdrawal of the Autopilot system entirely, if the agency finds the vehicles are unsafe while operating in Autopilot mode.

NHTSA reports that the collision occurred when a tractor-trailer made a left turn in front of the Tesla at an intersection.

Update:  ABC News has filed a report, with some live videos from just after the accident, and a witness account of the incident.

Update 2:  The Associated Press reports that the driver of the truck involved in the accident says the Model S driver (Joshua Brown – 40, a Navy SEAL) may have been watching a movie at the time of the accident (although this information was not listed in the police report), stating:

Frank Baressi, 62, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was “playing Harry Potter on the TV screen” at the time of the crash and driving so quickly that “he went so fast through my trailer I didn’t see him.”

 “It was still playing when he died and snapped a telephone pole a quarter mile down the road,” Baressi told The Associated Press in an interview from his home in Palm Harbor, Florida. He acknowledged he couldn’t see the movie, only heard it.
Update 3 (Thursday, July 7th):  Florida Highway Patrol said Thursday that both a computer (laptop) and a DVD player were confirmed to be in the vehicle, but neither was found running after the crash.  Investigators on the scene could not determine whether the driver was operating either of the two at the time of the accident.

Since the announcement, Tesla has issued a blog post noting that this is the first fatality in the more than 130 million miles Autopilot has been in operation, and giving its own account of the accident:

What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.

The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

Tesla Crash Police Diagram – Via LA Times

Full police crash report documents can also be found here (LA Times).

We note that the fatality in this case, 40-year-old Ohio resident Joshua Brown (obit), also appears to be the same person who reported in April that the Tesla Autopilot system potentially saved his life (YouTube video of earlier Autopilot crash avoidance incident below):


Below: Tesla’s statement on the accident and investigation

A Tragic Loss

Autopilot in control of a Model S

We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.

The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends.

Hat tip to Lanny H!

Categories: Tesla

166 Comments on "Tesla Model S Fatality In AutoPilot Mode Opens NHTSA Investigation (Updates)"

Was just about to send a tip on this one. This is a bit worrying. Remember people, the Tesla is NOT a fully autonomous car. Be safe out there.

“Remember people, the Tesla is NOT a fully autonomous car. Be safe out there.”

Yes, but it is certainly treated as one.

The problem is how you determine when to intervene. If you do it too late, it causes an accident; if you trust it too much and act too late, you can still get into an accident…

In this case, it caused an accident that killed a driver.

Woaaa woaaa woaaa!!!! stop the bus ModernMarvelFan. How do you say that Autopilot caused the death? It hasn’t even been investigated, and from what has been presented the driver never hit the brakes and most likely never saw the truck cutting over. And what about the driver of the truck?
I don’t own a Tesla, not thinking of owning one and not a fan boy or such. I do however follow the technology and thus this website so don’t accuse me of such (just in case you were going to 🙂
I think there is more to be discovered before anyone pronounces the autopilot dead on arrival, right?

I remain convinced that an “autopilot” system that requires driver oversight is more dangerous than no system at all. Driver assistance is good up to a point. For example, a system that can slam on the brakes if an object is detected in the car’s path is great for safety. So is a system that can steer a car back into a lane if the driver begins to change lanes into an occupied lane. That sort of intervention enhances safety because it never lulls the driver into a false sense of security. The driver remains in full control of the car 99.9% of the time and realizes the system cannot save him from every mistake. It is merely an aid, an enhancement of the driver’s own abilities. Once a driving system begins to actually drive for the driver, it instantly becomes a safety hazard unless the system is perfect, or at least as good a driver as the average alert human. The driver’s attention wanders, he begins to text or even sleep, and so forth. Usually the system works, so he grows more confident in the system and pays even less attention. At this point he either doesn’t see hazards,… Read more »

But the technology *is* already safer than a human driver, if Tesla’s own statements are to be believed. Cases like these will keep happening, even if they become very rare, even when autopilot systems are 99% safer than humans. If you really have to eliminate the risk of accident, take the train (and even then…). Or you might just want to stay home.

It may or may not be safer than a human driver driving an *average* vehicle, but I suspect Teslas *without* Autopilot are actually safer than those with Autopilot.

Overall, Teslas have been driven over a billion miles with only one death under normal conditions [1]. That implies to me that Autopilot is 1000 / 130 = 7.7 times more deadly.

Only Tesla has the statistics to know for sure. If the NHTSA agrees, we could well see them disable the feature until they can show it is as safe as a non-Autopilot Tesla.

[1] http://electrek.co/2015/12/22/man-dies-tesla-model-s-crash-dump-truck-first-death/
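
As a rough back-of-the-envelope check of the ratio claimed above, here is a minimal sketch (in Python) using only the figures quoted in this thread and in Tesla’s statement. The mileage numbers are the commenters’ and Tesla’s, not independently verified, and the comparison ignores the differing driving conditions raised further down.

```python
# Rough check of the "7.7x" claim above, using only figures quoted in this
# thread: ~1 death per ~1 billion non-Autopilot Tesla miles, 1 death in
# ~130 million Autopilot miles, and Tesla's quoted US fleet average of
# one fatality per 94 million miles. These inputs are not verified data.

NON_AUTOPILOT_TESLA_MILES = 1_000_000_000   # "over a billion miles", 1 death
AUTOPILOT_MILES = 130_000_000               # "over 130 million miles", 1 death
US_FLEET_MILES_PER_FATALITY = 94_000_000    # figure from Tesla's statement

rate_non_ap = 1 / NON_AUTOPILOT_TESLA_MILES   # fatalities per mile
rate_ap = 1 / AUTOPILOT_MILES
rate_us = 1 / US_FLEET_MILES_PER_FATALITY

print(f"Implied Autopilot vs. non-Autopilot Tesla ratio: {rate_ap / rate_non_ap:.1f}x")
print(f"Implied Autopilot vs. US fleet average ratio:    {rate_ap / rate_us:.2f}x")
# Both ratios rest on a single fatality in each bucket, so the statistical
# uncertainty is enormous (see the sample-size discussion further down).
```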

Or maybe you misinformers just jump on this topic to bash Tesla. You, sven, tftf, MMF … wouldn’t hit one key of your keyboards if it was another brand’s autopilot.

OTOH, without the video and the circumstances, nobody here can tell if this accident was avoidable altogether, autopilot or not.

Your assumption is disgusting, especially since a person died in this accident.

I just tweeted about the FCAU incident (shifter issue, deadly accident involving a Hollywood actor) and I mentioned GM’s and Takata’s deadly accidents in the past in my SA articles (articles on Apple potentially entering the car industry and the related perils).

You probably didn’t read any of these articles or tweets – but keep making stuff up.

Yeah! I’m sure your sensitive heart is bleeding…
😉

Awaiting an apology, RexxSee.

lol!

What did I expect from Tesla boosters?

It’s incorrect to look at the particulars of this accident without considering all of the potential accidents that did *not* occur because of good driving, either on the part of a human, or on Autopilot. Therefore, total miles driven is relevant, and you can, very roughly, compare the safety of Teslas with Autopilot to those without, even though there is a margin of error in calculating both expected values.

I own a Tesla and have praised Autopilot on many occasions. I try to make points that others have not made. If you view my comments as negative, that’s because of your own biases.

“You, sven, tftf, MMF … wouldn’t hit one key of your keyboards if it was another brand’s autopilot.”

I am going to call BS on this. Apparently you are a Tesla fan who also has a very short memory…

I have been critical of all claims of an “auto pilot” system from the start. In fact, I agreed only with Volvo’s assessment of the situation and its four levels of automation. And many of you Tesla fanboys were grilling the Volvo engineer’s comments about these exact concerns of giving a “false sense of security”.

Lastly, if you don’t have anything factual to say, maybe you should go back to your Tesla fan boi clubs and cry about this for a few more days…

Frankly, the Model S is already a fantastic car by being the first long-range EV, so there is really no need for over-the-top extras like autonomous driving. It is just a bridge too far that can do more harm than good. If you want to drive, get your driving license.

That’s just Tesla’s way of saying your average driver is an idiot. If you’re a sensible driver you are ALWAYS watching the road, even with autopilot on.

Tesla’s statement manipulates readers with statistics.

The fatality rate for autopilot miles is lower than that of other cars for ALL of their miles.

That is not a valid comparison. Autopilot isn’t used in all driving conditions. Instead it is used in the least challenging driving conditions – the very conditions under which fatal accidents are less likely!

A valid comparison would compare Autopilot fatalities to only the car fatalities for the exact driving conditions under which Tesla’s autopilot is used. Not easy to do, is it?

The one source for good data? Tesla. They have the computerized records to make exactly that comparison for exactly the same car models, with and without Autopilot. They undoubtedly have made that statistical analysis. It’s interesting that they chose to release another facile analysis instead.

I totally agree. Tesla says you should have your hands on the wheel, even in Autopilot. Then what’s the point of Autopilot? I think the name itself, and its association with the autopilot on an aircraft, is leading people to assume it’s capable of more than it can deliver.

The idea of it is to relieve some of the stress. Ever noticed how, when you’re a passenger you’re far more aware of hazards than the actual driver is? It’s the same principle. The car handles the basic yet mentally draining tasks like keeping in lane and slowing down when traffic in front slows down, freeing the mind up to focus entirely on one’s surroundings.

If you’re a passenger you’re more aware of hazards???

Maybe a backseat driver THINKS she is more aware of hazards. In my experience it is the driver who is attentive to driving. If the passenger is more attentive, then it is time to pull over and switch drivers so the less attentive one can get some sleep!

My Jaguar has adaptive cruise control; it brakes and accelerates as needed, but it does not fully stop the car. So is this half automation? Should it be removed? Accidents unfortunately may still happen, but the lessons learned will improve autonomous driving. Remember how we got to the safest era in aviation: by learning from accidents.

Adaptive cruise control doesn’t steer the car for you or give you the feeling that you don’t need to pay attention to the road. Not even close.

ModernMarvelFan said:

“…it caused an accident that killed a driver.”

Did you actually read the article you’re commenting on? Because there is absolutely nothing whatsoever in the article that would lead any reasonable person to come to that conclusion.

There will come a time, sooner or later, when we can point to a fatal accident involving a self-driving car, and say “This probably wouldn’t have happened with a human driver.”

This is not that time. It’s not even remotely close to being that time.

The question here isn’t whether or not Tesla Autopilot caused the accident. The question is why Autopilot didn’t prevent the accident.

Apparently, Tesla’s explanation is that the car didn’t “see” the truck due to low contrast of the white trailer against the sky… just like the human driver didn’t (they suppose).

My question is, why is the car’s vision (apparently) limited to the same visible wavelengths as human eyes? Why doesn’t it have sensors that might have detected the truck in other ways… heat, ultraviolet, or radar to sense the physical presence of the truck?

Yes, we should know exactly why the radar didn’t see the trailer, but first of all, why didn’t the truck driver see the car? He should have seen it, given the location of the accident: https://goo.gl/maps/Mqgai6sVMxL2
Unfortunately, here in Portugal, I have already seen some truck drivers (not the majority of them, but some of them) cross in front of cars only because they are “bigger” and they assume that cars “must” stop to let them pass! Stupid human behaviour will be the most difficult thing for automatic systems to prevent. After all, more than 80% of fatal car accidents are caused by bad human behaviour and not by mechanical issues.

Daytime running lights make an approaching car in this scenario much more visible to the turning truck driver, and are a good reason why they should be mandatory equipment on all new cars. There is no mention of whether the Model S in this accident was an updated Model S, which didn’t have daytime running lights as of two weeks ago and was awaiting a software update/fix to remedy the situation.

http://insideevs.com/tesla-delivering-updated-model-s-without-daytime-running-lights/

Oops, I meant to respond to the Anti Lord Kelvin comment further down.

Good question.

I doubt the driver missed the truck because it was white. More likely he was lulled into complacency by the Autopilot, and thus wasn’t really attentive to the road hazards. He could have been snoozing, texting, or just plain vegetating. Basically what a passenger does on a long trip!

Reminds me of the arguments against seat belts, of the one-in-a-million chance that you would have been better off if you weren’t wearing one.

Indeed. Though it does not change the technical investigation of autopilot function, let’s not forget it appears the semi-truck attempted a dangerous left turn into oncoming traffic (unless the Tesla driver was speeding 20~30 mph above the limit).

“The question here isn’t whether or not Tesla Autopilot caused the accident. The question is why Autopilot didn’t prevent the accident.”

When a system causes the driver to let go, assuming it will prevent the accident when it doesn’t, then it has “effectively” caused the complacency that resulted in the accident.

According to Tesla, this is the first fatality in over 130 million miles of Autopilot operation. Interestingly, the fatality rate for automobiles in the U.S. is one in every 100 million miles [1], meaning that, at least when it comes to fatalities, you could claim that autopilot and non-autopilot have nearly identical safety records.

It’s not strictly apples-to-apples, of course, since the Model S is generally much safer than the average American car, so there should be even fewer deaths given equal driving skill.

[1] http://www.iihs.org/iihs/topics/t/general-statistics/fatalityfacts/state-by-state-overview

Four Electrics wrote:

“…you could claim that autopilot and non-autopilot have nearly identical safety records.”

You could also claim that the moon is made of green cheese.

A sample size of one (1) is not sufficient for meaningful statistical analysis.

It’s not a sample size of one. It’s a sample size of 130 million miles.

All of this is actually rather complimentary to Autopilot. To match the human fatality rate so soon, even in the world’s safest car, is rather good.

Yes, but. Do you understand how statistics work and what a statistical sample is?
So you have a sample based on billions of miles traveled vs. a single incident in 130 million miles traveled.

Chances for fatalities are abundant. Let’s say that one potentially fatal event exists, and must be avoided, for every one hundred miles that a car is driven on Autopilot. This could be a lane change, a car that cuts you off, braking in heavy traffic around a blind turn, avoiding overpass columns, an oncoming car making a left turn, etc. that the Autopilot must handle correctly. If it performs correctly, there is no fatality. If it doesn’t, there is.

Each one of these events is, in a statistical sense, an independent trial. In the Autopilot case, we have 1.3 million trials, because it has driven 130 million miles. According to the law of large numbers [1], the measured result will not diverge significantly from the expected value. The sheer number of trials here is impressive, and statistically sound, which is why Tesla quotes the number in their own blog post.

[1] https://en.wikipedia.org/wiki/Law_of_large_numbers
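
For what it is worth, an exact confidence interval can be attached to a rate estimated from a single observed event. A minimal sketch (standard-library Python, assuming the fatality count over a fixed number of miles is roughly Poisson, which is itself a simplification) shows just how wide that interval is:

```python
import math

# Exact 95% Poisson confidence interval for the expected number of
# fatalities, given that exactly k = 1 fatality was observed in
# 130 million Autopilot miles. (Poisson model assumed; a simplification.)

MILES_OBSERVED = 130_000_000
K_OBSERVED = 1
ALPHA = 0.05

# Lower bound: smallest mean mu with P(X >= 1 | mu) = alpha/2
#   P(X >= 1) = 1 - exp(-mu)  =>  mu_lo = -ln(1 - alpha/2)
mu_lo = -math.log(1 - ALPHA / 2)

# Upper bound: mean mu with P(X <= 1 | mu) = alpha/2
#   P(X <= 1) = exp(-mu) * (1 + mu); solve by bisection
def p_le_1(mu):
    return math.exp(-mu) * (1 + mu)

lo, hi = 1.0, 20.0
while hi - lo > 1e-9:
    mid = (lo + hi) / 2
    if p_le_1(mid) > ALPHA / 2:
        lo = mid
    else:
        hi = mid
mu_hi = (lo + hi) / 2

# Convert to fatalities per 100 million miles for comparison with the
# "about 1 per 100 million miles" US figure quoted in this thread.
scale = 100_000_000 / MILES_OBSERVED
print(f"Point estimate: {K_OBSERVED * scale:.2f} per 100M miles")
print(f"95% CI:         {mu_lo * scale:.3f} to {mu_hi * scale:.2f} per 100M miles")
```

On these assumptions the 95% interval runs from roughly 0.02 to about 4.3 fatalities per 100 million miles, which is wide enough to be consistent with both “better than average” and “worse than average”, and that is essentially the point the two sides of this exchange are circling.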

Miles driven on autopilot are not the same as miles driven manually, so the numbers are not really comparable for statistical purposes. Drivers will not engage an autopilot in certain situations (when they know it would not work, would be unsafe, etc.). So autopilot is used in, let’s say, 50% of total driving scenarios (I would expect those to be less risky than the remaining 50%), but humans drive in 100% of scenarios (obviously). The comparison made by Tesla is not entirely correct. To properly compare human miles vs. autopilot miles, total miles driven should be compared under similar conditions (for instance, highway driving only).

What the 1 death in 130 million miles suggests is that the event is not common and that it occurs at a similar frequency to regular road accidents. Its significance depends on what statement you are trying to test. If the statement you are trying to justify is “autopilot will result in a death every 100 miles driven”, then this data pretty much shows that statement to be false. If the statement you are trying to justify is “autopilot is safer than regular driving, which results in a fatality every 100 million miles”, then you require further data. Statistics aren’t typically used in a situation like this. Normally you determine why the accident happened. I assume this is a fairly simple case of looking at the systems on the Tesla and determining the detection limit. The implication from what Tesla has written is that neither driver nor car could see the trailer; if that is the case, then it may be necessary to alter the road or the laws around trailers to make them more visible. Rather tragically, if this situation had occurred in the UK, the trailer would have had to have a barrier between the wheel arches. That… Read more »
There will always be difficult lose-lose situations where a human will make a different decision than software. For instance, you arrive at the top of a hill with two lanes and no room on the sides, with a car coming from the opposite direction; at that very moment a dog runs across the street. What will the car do? Will it drive over the dog, or will it change lanes and go into a frontal collision with the other car? A human would likely drive over the dog but take the frontal collision if the dog is replaced by a child. Instinctively we analyze and take into account different parameters, like a dog’s life being worth less than a child’s life. We also decide that the child is certain to die if we run over it, while we may have a chance to survive a frontal collision. Of course, that may not be the case, and we may end up killing ourselves to save the child and killing the people in the other car, including children, as well. But at that very moment it is a human’s attribute to be at the controls and take the… Read more »

Four Electrics

“It’s not a sample size of one. It’s a sample size of 130 million miles.”

No, it’s a sample size of one (1).

I know that statistics isn’t the easiest subject to understand, Four Electrics, but the point here shouldn’t be that hard to grasp. If there had been just one more accident in those 130 million miles, that would be a 100% increase in the rate based on just one (1) accident.

And that’s why just one, or even just 25 or 50, isn’t a statistically meaningful sample size.

Remember the media buzz over just three (3) battery fires in plug-in EVs, a very few years ago? Again, not a statistically meaningful sample size, yet there were a lot of comments at the time using the false premise that those three were sufficient to estimate the rate of car fires in plug-in EVs.

So, if I survey 130 people and ask them if they like your posts, and 1 person says yes, and the rest say no, I have a sample size of 1?

Yes, you certainly understand statistics.

This. Just because there is only one accident does not mean that there has been only one sample. In fact, I would guess there are almost a million samples, almost all of which were negative, given the somewhat arbitrarily picked ratio of one avoidable fatality for every hundred miles. Let me give you another example: let’s assume you have a biased coin that comes up heads a million times more than it comes up tails. You flip that coin 999,999 times and it only comes up heads. Does that imply a sample size of zero? Now you flip it one more time, and it comes up tails. Does that imply a sample size of one? No, it implies a sample size of one million. Even if you don’t know exactly how the coin is biased, the fact that you flipped it a million times and it only came up tails once gives you a lot of confidence to say what the approximate bias of the coin is, and what the margin of error is. For a million trials, the margin of error is quite small, by the Law of Large Numbers [1]. So, again, coming full circle, the fact that… Read more »

Similarly, for the Tesla automobile fires, you must also count the number of accidents that *didn’t* result in a fire, both among the general population and in Teslas, as samples. In that case, Teslas were 10x more likely to result in a fire *given a collision*. Now that the titanium shield has been added, that rate has, as expected, dropped.
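
A quick Monte Carlo of the coin thought experiment above; a sketch assuming the stated one-in-a-million bias, written in plain Python (so it takes a few seconds to run):

```python
import random

# Monte Carlo of the biased-coin thought experiment described above:
# tails comes up with probability one in a million, and the coin is
# flipped one million times per experiment.
P_TAILS = 1e-6
FLIPS_PER_EXPERIMENT = 1_000_000
EXPERIMENTS = 10  # kept small; flipping coins in pure Python is slow

random.seed(42)
for i in range(EXPERIMENTS):
    tails = sum(1 for _ in range(FLIPS_PER_EXPERIMENT) if random.random() < P_TAILS)
    estimate = tails / FLIPS_PER_EXPERIMENT
    print(f"Experiment {i + 1}: {tails} tails -> estimated bias {estimate:.1e}")

# Typical runs give 0, 1, 2 or 3 tails per million flips: the order of
# magnitude of the bias is pinned down, but the point estimate can still
# be off by a factor of a few, which is the crux of the disagreement in
# this thread about what one fatality in 130 million miles can tell us.
```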

You should compare it to the particular state, in this case FL, which has a fatality rate of 1.57 (per 100 million miles). There is great variation between states: almost 3 times the fatality rate from lowest to highest.
Also, PUPM has a point about the small size of the statistical sample.

Bad comparison. Autopilot isn’t used in all driving conditions. So you’re comparing fatalities in all driving conditions among the general car fleet to Autopilot fatalities under the least challenging driving conditions.

Condolences to the family. Tragic series of events.

If I were to guess, the same flaw that allowed a previous Model S in “Summon” mode to drive under a bigass trailer (second pic) is the cause.

Trollnonymous said: “If I were to guess, the same flaw that allowed a previous Model S in “Summon” mode to drive under a bigass trailer (second pic) is the cause.” Good point. In the previous case, it appears (at least to me) that the Model S was parked too close to the trailer for its sensors to “see” the projecting objects; the objects were outside the forward-facing radar’s cone of detection. In the case reported above, common sense suggests the tractor-trailer would have been outside the angle of detection of the forward-pointed radar until it was too late to brake to avoid the accident. As has been noted in previous analyses of Tesla AutoSteer, the ultrasonic detectors which point to the sides and rear of a Model S have much too short a range, and thus a reaction time much too short, to properly react to a vehicle approaching at high speed. Google self-driving cars use a scanning, rotating lidar emitter mounted on top of the car. This is exactly what is needed, to “see” in all directions, which is what is needed for a fully autonomous vehicle. Tesla’s series of fixed-mount sensors… Well, let’s just say that Tesla will… Read more »

Yea – LIDAR gives a fuller “point cloud” around the car than just front-facing cameras – but it has its limits as well…

https://www.theguardian.com/technology/2015/sep/07/hackers-trick-self-driving-cars-lidar-sensor

I know Elon didn’t agree with the Google approach (cost or complexity, I believe), and MobileEye is going with cameras as the primary sensor.

My personal sense is that for L4 autonomy, you’re ultimately going to need both sensor platforms to deal with bad weather (snow, driving rain) that can partially blind one or the other.

Also – talk about bad timing… Considering the accident happened in May, this announcement can’t be the best of timing for BMW, Intel and MobileEye:

http://www.reuters.com/article/us-bmw-mobileye-intel-idUSKCN0ZG0IJ

“Remember people, the Tesla is NOT a fully autonomous car.”

All too correct.

That’s why I think Tesla shouldn’t have hyped its driver assist features by labeling the group “AutoPilot”. This set of features is a long way indeed from being a fully functional autopilot for a car.

And it really disturbs me to see all the comments asking why the Model S that ran off in a parking lot and into the wall of a store didn’t engage automatic braking. These comments indicate there are far too many people who apparently think the Model S should already be acting like a fully autonomous car, overriding driver operation of the controls.

+1. Autopilot is not a good name for this feature, at this time.

That is ridiculous. What does the name have to do with anything? I can call an elephant a lion, I can write on the side of the elephant “here is a lion,” that don’t make no elephant no lion….
Seriously, we as a society refuse to accept responsibility for our actions or inactions. Not my fault, not my problem. And I’ll prove it by blaming someone, something else and I’ll legitimize it by suing you cause you got money and I deserve some of that.
Anyone hearing what I’m saying?

A product’s name can imply that the product has certain functions or will perform a certain way. An airplane manufacturer could also put large backpacks on its planes and call them “parachutes,” but in an emergency when you really needed to use a parachute, you’d be very disappointed and upset to find out that although the large backpack is named a “Parachute,” it doesn’t function/perform as a parachute for its intended purpose. EVGuy said: “Seriously, we as a society refuse to accept responsibility for our actions or inactions. Not my fault, not my problem. And I’ll prove it by blaming someone, something else and I’ll legitimize it by suing you cause you got money and I deserve some of that.” Determining fault and assigning blame is not always black and white, and oftentimes neither party is 100% at fault. The common law recognized this and came up with comparative negligence and contributory negligence, assigning a percentage of fault (0% to 100%) to each party. Companies, just like people, owe a “duty of care” not to harm anyone. Breaching this duty of care not to harm others is the tort called Negligence. Companies who make defective products that harm others… Read more »

Obviously, Sven, you have never run or been an owner of a small company that has to pay a king’s ransom for insurance due to frivolous lawsuits brought by people who have a sense of entitlement that doesn’t in fact exist. FORTUNATELY I’ve never lost to a frivolous lawsuit, although I’ve been named in a few.
My point is and was we as individuals need to own our problems.
IN THIS CASE Tesla has been very clear about its system: it’s not autonomous, and anyone who buys a Tesla thinking otherwise should consider another car.
On this we are clearly worlds apart. The name did not kill the driver, the system didn’t kill the driver. The driver(s) failed to recognize the hazard and the result was tragic.
Had Tesla called it super cruise would the expectation be that it somehow has super powers?

Doug said:
“. . . the system didn’t kill the driver. The driver(s) failed to recognize the hazard . . .”

Tesla’s collision avoidance system didn’t activate emergency/automatic braking, because it failed to detect the broad side of a huge tractor trailer at a 90 degree angle to the roadway, blocking both eastbound lanes of traffic on the highway. At this point, it’s premature to absolve Tesla of any blame/fault with regards to this accident. Like I originally said, “Determining fault and assigning blame is not always black and white, and oftentimes neither party is 100% at fault.” Both individuals and companies in our society oftentimes refuse to accept responsibility for their actions or inactions.

+1

Individuals often refuse to accept responsibility (usually the ones who refuse are the same ones who lecture others on “personal responsibility”).

Businesses almost never accept responsibility, and instead hire lawyers to analyse the legal liabilities. A settlement is made if the liabilities are deemed too great for the company to risk. Even then, responsibility is often dodged by forcing the victim to sign an NDA in exchange for cash payment, aka, hush money.

LOl! Your parachute analogy cracked me up 🙂

As I pointed out in the Model X lemon law article, Tesla doesn’t mention that its Autopilot is beta when selling the car. It only mentions that in some legal disclaimers and the owner’s manual. Tesla allowed Phil Lebeau to shoot hands-free videos during the Autopilot launch and show them on CNBC, while Tesla employees sat in the same car watching. The entire Autopilot thing has been a hugely irresponsible act.

Yep, right amidst the deluge of other legal disclaimers, including the one about always engaging the parking brake when parked. How many people use the parking brake in cars with automatics? Probably a lot more do in places like San Francisco, but on flatter terrain? Not many.

What’s wrong with calling it improved cruise control? After all, that is what it is. OK, it may be an understatement, but that would lead people to lower their expectations as well and not consider it a self-driving car.

Autopilot is safe as long as you don’t forget to stand by and watch. We saw that with the Airbus A330 on the Rio-Paris flight that went down because a simple pitot tube freezing caused a sudden autopilot switch-off that caught the nearly sleeping pilots off guard, unable to respond in time.

My heart goes out to the family.

I’ve always counted on Tesla to step up to challenges in a way that other companies don’t. They must treat this as an opportunity to learn and improve, so that nothing like it can ever happen again. Make the autopilot system recognize white against the sky somehow, and put in sensors for higher objects. Whatever it takes.

Having an infrared camera could probably help. But they are very expensive.

That’s an advantage that these autonomous systems can have . . . they can incorporate many sensors that sense things that we cannot see/hear. (infrared, radar, cameras in all directions, etc.)

“Having an infrared camera could probably help. But they are very expensive.”

Is it still expensive on a $100K car?

Many German cars and Cadillacs have had it since the ’90s… Yes, an IR camera for night vision.

Cameras produce a flat image. That’s not what autonomous cars need.

Self-driving cars need radar (or lidar). As a reminder, RADAR means RAdio Detection And Ranging. Detection and range to obstacles, especially moving obstacles, are precisely what’s needed. Not flat images from a camera, which have to be interpreted by the software. Thus far at least, computers do a very poor job of recognizing objects from flat pictures. That’s why Google’s self-driving cars use scanning lidar. If Tesla was using that, instead of a radar which detects objects only in a fairly narrow front-facing arc, then perhaps this fatal accident would have been avoided.

Computers aren’t human brains. Humans have excellent visual acuity, and our brains are amazingly good at interpreting what our eyes see. Computer software and camera images… not so much.

“Cameras produce a flat image”

If the (s) at the end of the word “cameras” indicates plural use of the device, then more than one camera can create a 3D image.

Then why doesn’t Google use that system for its self-driving cars?

Because it doesn’t work reliably, that’s why.

If you want details, here is a link to a paper. It looks to be several years old, but I think it’s still a valid summary of the state of the art:

https://www.cs.rochester.edu/~nelson/research/recognition/recognition.html

And to be clear – Google uses BOTH LIDAR & RADAR. They use LIDAR (laser/light-based detection and ranging) for the point cloud out to 100 m, and then bumper-mounted RADAR for speed detection of objects closer in. (Source: http://www.extremetech.com/extreme/189486-how-googles-self-driving-cars-detect-and-avoid-obstacles)

They also need “hyper accurate” GPS to basically get on the right road, to let the vision stuff “stay between the lines”.

Tesla is based on the MobileEye EyeQ3 platform with 8 cameras (and close in RADAR + SONAR) – Source: https://teslamotorsclub.com/tmc/threads/mobileye-and-tesla.56792/

and more details that are Tesla/Audi specific are:
http://wccftech.com/tesla-autopilot-story-in-depth-technology/5/

FYI, the next MobileEye EyeQ4 platform is a major step forward and due to ship in 2018.

“Self-driving cars need radar (or lidar)”

Tell that to a certain Elon Musk who thinks Tesla cars don’t need LIDAR.

http://9to5google.com/2015/10/16/elon-musk-says-that-the-lidar-google-uses-in-its-self-driving-car-doesnt-make-sense-in-a-car-context/

A few more severe accidents and he will likely revisit this decision.

Modern camera sensors are inherently sensitive to near infrared. People prefer the images without the infrared so it is filtered out.

I doubt an infrared camera would help much, though.

You don’t do a “public beta” when lives are on the line. Your system has to work. This is not the kind of software that can only have a minor impact if it malfunctions.

It is safer than humans if you use it to assist driving and keep your attention AND HANDS on the wheel. Too many don’t, I fear.

Quote: “This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.”

Well, if you are the first one, then it would suck to be you…

The fact remains that “uncertainty exists” which makes your job harder to decide when to take over…

Kind of warped logic, to call Autopilot safer than humans and then say “as long as hands are on the wheel”, at the ready, etc. I don’t see that any better than AP sees white.

This is tragic and I fear lots of people haven’t gamed in their heads the scenarios that could kill them. Others can flame me, but I take my eyes off the road for seconds at a time, when in stop & go. Beyond that, we all use judgment in estimating where AI will, or won’t, be there for us.

Do we all estimate that the car has a slightly tougher time with white? Probably not. Intersections? Too soon.

Not trying to minimize a death at all. Incredibly sad at any young age, and even more so when you have a family of your own. A point about “safer”, based on several road trips including a 3,400-mile one where Auto-Steer/Autopilot drove 95% of that (3,000+ miles): AS/AP stayed more centered in the lane than most drivers around me. It was crazy to see how many vehicles crossed the center line or drove on the rumble strips.

Say what you will, we both know the majority of accidents happen due to simple human errors like falling asleep or stomping on the wrong pedal and running into a wall.

Statistics don’t mean anything without context to analyse. If I were to guess, I’d say that Autopilot is usually activated while the car is in a lower risk situation.

Boukman said: “You don’t do a ‘public beta’ when lives are on the line.” It’s hard to think about things objectively, and to consider what is the greatest good for the greatest number, when real individual people are involved. It’s especially hard in a case like this, where a real person has died. But we need to step back and look at the whole picture. Self-driving cars will never, ever prevent 100% of accidents. Let’s not make this the ultimate case of “the perfect driving out the good.” Let us not advocate for banning “beta” versions of partially autonomous cars just because someone has died while driving one. This is the first death, but others will inevitably follow, because riding in a vehicle moving at high speed on roads where other vehicles are moving at high speed in different directions is inherently unsafe. Let us consider whether or not the partially autonomous systems in question have already saved lives. If it’s reasonable to believe they have, then a single death, however tragic that is to the people who knew the deceased, is not a rational reason to ban Tesla from using its driver assist features, nor is… Read more »

I am sorry for their loss, and this makes me angry about the way Tesla markets this driver-assist technology. Autopilot is a very deceptive term because true autonomous driving technology is at least a decade away. It is inevitable that drivers get used to the technology and get distracted no matter how much Tesla warns. Also, I don’t understand how the ‘driver should always stay alert’ idea goes with ‘autopilot reduces workload’. Can’t the same be said for a cruise control?

I like EVs and Tesla, but there is no need to push this technology aggressively in beta mode. They can install sensors and learn from simulations and driver data, and not use actual customers as beta testers. I like Tesla and can’t wait to own one, but I am definitely not going to take the Autopilot option, however tempting it is.

There is nothing tempting about sitting in the driver seat not doing anything while the car drives you.

Sounds like a sick dream

It’s not a sick dream, it’s an inevitable future for the auto industry. Wanna know what I think is ‘sick’? Sitting in a traffic jam. Perfectly good place to let the car handle the stop start crap and steering. Low speeds, so lower risk, just less stressful overall.

MatteM3 said: “I don’t understand how the ‘driver should always stay alert’ idea goes with ‘autopilot reduces workload’. Can’t the same be said for a cruise control?” Yes, and a more able cruise control is exactly how drivers should be treating Tesla’s driver assist features. Unfortunately, as has been made clear in many videos posted to YouTube, all too many drivers are treating AutoSteer (Beta) as if it makes a Model S a fully functional self-driving car, even to the point of turning away from watching the road and doing something else… or even, in the case of a few idiots, deliberately disabling the Model S’s safety features to force the Model S into driving itself when nobody is in the driver’s seat! Now, that’s not to say that Tesla is entirely 100% blameless. By labeling its driver assist features “AutoPilot”, and by continuing to allow AutoSteer (Beta) to operate on roads with two-way traffic — despite Tesla’s clear instructions not to do so — Tesla seems to be opening itself up to partial responsibility. However, let us remember that in a well-ordered society, responsible adults operate under the principle of “I’m responsible for my own actions.” Drivers may want to… Read more »
As a pilot who has flown aircraft with just Elevator Trim, and aircraft with both Elevator Trim and Rudder Trim, I prefer the latter! Also, having flown Mooney 231 Turbo Charged Aircraft, at speeds of 200-plus MPH, at altitudes up to 17,000 feet above Sea Level, both Hand Flying and using Autopilot, I prefer Autopilot! Also, please note, common people put romantic ideas onto things they don’t truly comprehend, and Autopilot is one of those things. The big issue is Tesla uses Aviation titles, but does not require product training to the level Aviation does! First off, an autopilot in an aircraft follows basic assigned tasks: maintain altitude, follow commands from navigation/flight director inputs, etc., but it DOES NOT see conflicts from other aircraft and avoid them. Autopilot is not there to protect you, only there to relieve the flight crew of focusing on basic, mundane tasks, so they are not drained mentally when their skills and experience are called upon. In a Tesla, I have not heard claims of 110% capabilities in road tracking, traffic avoidance, and Total Coverage of ALL End Cases! Unfortunately, many, most, or ALL drivers are not given road tests every two years where they have… Read more »

You said it yourself, autopilot REDUCES workload. Key word: REDUCES. not TAKES AWAY.

Tesla puts cameras for Autopilot in its cars as standard, but can’t simply install dash cams in all its cars, which would be the most beneficial.

+1

It’s sad that someone had to die for Autopilot to go away. I hope NHTSA goes after Tesla. Elon, do it the right way, like Subaru EyeSight.

Would EyeSight stop the car in this case?

I mean, the trailer is probably higher than the car, so the sensors on the bumper will detect an empty space below the trailer. But the roof would be scraped off when the car runs into the side…

Then again, I don’t know if the Subaru’s sensor has higher coverage or not.

Does anyone have the information?

http://s3.caradvice.com.au/wp-content/uploads/2013/09/subaru-eyesight.jpg

It appears the Subaru EyeSight system would probably stop or try to stop the car, since its stereoscopic cameras to the left and right of the rearview mirror “scan” the road up to 87 yards ahead of the car.

“EyeSight employs a pair of forward-facing stereoscopic cameras mounted inside the car on either side of the rear-view mirror, which are connected to the throttle and brakes. The cameras and corresponding software ‘scan’ the road ahead (up to 87 yards ahead of the car) and can initiate a series of collision warnings and avoidance measures if it determines a collision is imminent. . . . The adaptive cruise control can bring the car to a complete stop from speeds of up to 87 mph, but drivers are prompted to accelerate from a stop—the system won’t get the car going again on its own.”

http://blog.caranddriver.com/we-try-out-subarus-eyesight-collision-avoidance-tech/
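
As a rough sanity check on whether 87 yards of camera range is enough to stop from the speeds involved, here is a small worked sketch. The 65 mph figure comes from the speed-limit discussion further down in this thread; the half-second latency and 0.8 g braking are assumptions for illustration, not EyeSight specifications:

```python
# Back-of-the-envelope stopping-distance check. All inputs other than the
# 87-yard detection range quoted above are assumptions for illustration.
YARDS_TO_METERS = 0.9144
MPH_TO_MPS = 0.44704

detection_range_m = 87 * YARDS_TO_METERS      # ~79.6 m claimed camera range
speed_mps = 65 * MPH_TO_MPS                   # 65 mph limit cited in comments below
reaction_s = 0.5                              # assumed system/driver latency
decel_mps2 = 0.8 * 9.81                       # assumed ~0.8 g of braking

stopping_distance = speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

print(f"Detection range:   {detection_range_m:.1f} m")
print(f"Stopping distance: {stopping_distance:.1f} m")
print("Stops in time" if stopping_distance <= detection_range_m else "Does not stop in time")
# Caveat: in a crossing-traffic scenario the trailer enters the car's path
# well after it first becomes visible, so the real margin would be smaller.
```

On those assumed numbers the margin is thin, so “stop or try to stop” seems about right.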

What is the max height the scan covers?

The problem with the autopilot is the auto steering. They need to get rid of that feature, asap.

There was no steering problem with this accident.

It likely would have caused trouble with any of the automatic braking systems on the market now, including the Subaru Eyesight.

You don’t get it. Auto steering disengages you from the driving loop. If the Tesla driver had paid attention, this accident would have been avoided.

First of all, maybe if the truck driver had paid attention to traffic, the car wouldn’t have had to stop! We know it was a sunny day, and from the location we know that the truck driver had plenty of sight of the road to see the Tesla coming and NOT engage in crossing the road!

Daytime running lights make an approaching car in this scenario much more visible to the turning truck driver, and are a good reason why they should be mandatory equipment on all new cars. There is no mention of whether the Model S in this accident was an updated Model S, which didn’t have daytime running lights as of two weeks ago and was awaiting a software update/fix to remedy the situation.

http://insideevs.com/tesla-delivering-updated-model-s-without-daytime-running-lights/

Indeed. Though it does not change the technical investigation of the autopilot function, let’s not forget it appears the semi-truck attempted a dangerous left turn into oncoming traffic (unless the Tesla driver was speeding 20~30 mph above the limit).

What I wrote in another forum:

This seems to be an accident that, if the driver had had his eyes on the road, hands on the wheel, and _brain engaged_, probably could have been avoided. Autopilot will likely be found a contributing if not the primary cause of this accident, depending on the visibility conditions at the time, if a normally alert driver should have seen the trailer. Regardless, it will require modification of the vertical coverage of the autopilot sensors, as well as a possible addition of multi-spectral sensors. Tesla’s account implies there was a visual contrast issue, which shouldn’t be a problem for lidar/radar sensors given adequate vertical as well as horizontal coverage. Here’s an account and comments on Autopilot’s sensors:

https://www.quora.com/What-kind-of-sensors-does-the-Tesla-Model-S-use-for-its-autopilot-auto-steering-features

When I drive on the congested highways around NYC, I’m not only checking what’s happening in front of my car, but also constantly checking what’s happening on either side of my car, and what’s happening behind me.

GRA said: “This seems to be an accident that, if the driver had had his eyes on the road, hands on the wheel, and _brain engaged_, probably could have been avoided.” You jumped rather far to get to that conclusion, didn’t you? Here are a couple of questions that immediately came to my mind; questions which should be answered before we should think we have sufficient facts to reach even a tentative conclusion: 1. What time of day was it? Was it near dawn or dusk? Was the driver partially blinded by glare from a sun near the horizon? 2. Did the driver have a clear line of sight on the truck as it was approaching the highway, or were there trees, buildings, or other large obstacles that blocked his view until it was too late to avoid a collision? No doubt a professional accident investigator would have a long list of additional questions. Your conclusion, GRA, assumes quite a few facts not in evidence. “Autopilot will likely be found a contributing if not the primary cause of this accident…” Sorry, we’re not buying the argument that the driver would have been sufficiently alert to avoid the accident if only,… Read more »

Please do read the follow-on sentences to the first one you quoted, before accusing me of reaching conclusions based on insufficient evidence. To wit, “Autopilot will likely be found a contributing if not the primary cause of this accident, _depending on the visibility conditions at the time, if a normally alert driver should have seen the trailer_.”

Most unfortunate.

Gives new meaning to the notion that Model S owners love their cars “to death”. Personally I fault Tesla for touting how complete autopilot is when it’s clearly not all that bullet proof. That was obvious when the Model S in summon mode ran into the back of a tractor trailer. I could also fault people who believe the Tesla hype but, on balance, it’s hard to blame someone for their beliefs. All the excuses about how rare this accident is just plays into the problem.

Poor DonC. I see his TES* is acting up again.

Take the cure! Just dump your short position in Tesla stock, and you’ll find immediate relief from your compulsion to post fact-free FUD!

*Tesla Envy Syndrome

electric-car-insider.com

Was driver lulled into inattention and complacency? Beware of auto-hazard.

electric-car-insider.com

From CNN report:

“Experts have cautioned since Tesla unveiled autopilot in October that the nature of the system could lead to unsafe situations as drivers may not be ready to safely retake the wheel.

If Tesla’s autopilot determines it can no longer safely drive, a chime and visual alert signals to drivers they should resume operation of the car. A recent Stanford study found that a two-second warning — which exceeds the time Tesla drivers are sure to receive — was not enough time to count on a driver to safely retake control of a vehicle that had been driving autonomously.

Tesla’s cars are built with an auto-braking system, however it is not foolproof and did not activate in this crash.”
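
To put that two-second window into distance terms, a trivial worked example (assuming the 65 mph posted limit mentioned in the comments below; the Tesla’s actual speed in this crash is not known):

```python
# Distance covered during a 2-second takeover warning at an assumed 65 mph.
MPH_TO_MPS = 0.44704
speed_mph = 65          # posted limit cited elsewhere in this thread
warning_s = 2.0         # warning window from the Stanford study quoted above

distance_m = speed_mph * MPH_TO_MPS * warning_s
print(f"At {speed_mph} mph, a car covers about {distance_m:.0f} m "
      f"(~{distance_m * 3.281:.0f} ft) during a {warning_s:.0f}-second warning.")
```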

Thank you for posting this.

That is my problem with the autopilot system.

It isn’t the system alone. It is more a system integration issue which involves driver, car, SW and education.

After the Tesla had its roof sheared off, it continued to travel east on the highway until it crossed the south-side shoulder, smashed two fences and struck a power pole. It appears the autopilot deactivated after striking the trailer and disabling the camera in the rear-view mirror. The Tesla finally stopped about 100 feet south of the highway. “In a separate crash on May 7 at 3:40 p.m. on U.S. 27 near the BP Station west of Williston, a 45-year-old Ohio man was killed when he drove under the trailer of an 18-wheel semi. The top of Joshua Brown’s 2015 Tesla Model S vehicle was torn off by the force of the collision. The truck driver, Frank Baressi, 62, Tampa was not injured in the crash. The FHP said the tractor-trailer was traveling west on US 27A in the left turn lane toward 140th Court. Brown’s car was headed east in the outside lane of U.S. 27A. When the truck made a left turn onto NE 140th Court in front of the car, the car’s roof struck the underside of the trailer as it passed under the trailer. The car continued to travel east on U.S. 27A until it left… Read more »

Long, straight 4-lane road. 65 mph speed limit (so people are probably doing 70)… looks like an interstate, but it’s not. Plenty of roads/driveways entering the road at 90 degrees (unlike an interstate), so in that sense it’s kind of dangerous to start with. It looks like a setup where you might think it’s really safe/boring, but you’d better stay alert.

Sounds like it was good weather and at 3:40 in the afternoon he should have certainly seen a big truck, but somehow didn’t.

In January, Tesla updated the Autopilot driving systems in Model S sedans to put new limits on its hands-free operation, which has been both praised for its innovation and criticised for having been launched too early.

The function will now be restricted on residential roads or roads without a centre divider, meaning the car cannot drive faster than the speed limit maximum plus 8 kilometres per hour.

..add,

On a 2nd Google Maps look – there does appear to be a slight hill/elevation change, which could be a visibility factor.

From the description of the accident it sounds like the semi just pulled out in front of the tesla.

That’s impossible because the semi had almost crossed the highway. The driver wasn’t paying attention.

“That’s impossible because the semi had almost crossed the highway. The driver wasn’t paying attention.”

Or overly trusting “auto pilot”…

That is the problem. Now drivers have to make an additional decision on whether to take over control or allow “auto pilot” to act on its own. That is almost a more difficult decision than just driving manually.

We only have a split second to decide; there should be zero hesitation. Autopilot causes hesitation, therefore it should be banned until it’s fixed.

You are assuming the truck slowed way down or stopped before it crossed the intersection. However, if the truck never stopped and kept moving at around 20+ MPH, the Tesla would have had less than 2 seconds to respond. Everyone keeps assuming things without all the facts. We don’t know how fast the Tesla was driving. We don’t know how fast the truck was driving through the intersection.

The only fact that is not in question is that the truck driver entered the intersection without first clearing it and making sure it was safe.

Indeed. Though it does not change the technical investigation of autopilot function, it appears the semi-truck attempted a dangerous left turn into oncoming traffic (unless the Tesla driver was speeding 20~30 mph above the limit).

This same person admitted a month before to NOT paying attention to the road and relying on the autopilot to keep him from running into another truck. Just curious how Tesla has had 130 million miles driven with autopilot without incident… on the road, in a lab, or on a simulator? Hell, airplanes have autopilot, but that doesn’t mean you see the pilots sitting in the main cabin during a flight! A hundred things can go wrong, and that is why they watch and maintain complete control in case a large white object comes into view. Please stop relying on technology to keep you safe, or to let one’s brain disengage from being present so one can text or play chess while driving!

https://teslamotorsclub.com/tmc/threads/fatal-autopilot-crash-nhtsa-investigating.72791/page-6 in post 104 suggests that it’s quite likely the “driver” was on his laptop at the time of the accident.

But yes, this is exactly the problem w/technology that under certain conditions works most/almost all of the time, which then causes people to do other things and not pay attention. And, there will no doubt be many people who don’t understand the limitations since they won’t RTFM or aren’t interested in learning.

Yep. It’s human nature to trust a system that works 99% of the time. In this guy’s case, Autopilot actually saved his arse once already, so he was understandably even more overconfident.

If you make a system that lets the drivers attention wander, it had better be damn near perfect. To put out a beta of such a system is dangerous and irresponsible.

Tesla said, “As more real-world miles accumulate… the probability of injury will keep decreasing.”

Don’t they mean that the ‘rate of injury’ will keep decreasing? The likelihood that there will be another injury will increase with more miles driven.

“Per a fixed number of miles driven (per 10^x miles, x constant)”, the probability will decrease.

The probability for each individual driver decreases.
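To put that distinction in symbols, here is a minimal sketch; the quantities p_t (per-mile injury probability) and m_t (miles driven in period t) are illustrative notation, not figures Tesla publishes:

```latex
% Illustrative notation only: p_t is the per-mile injury probability and m_t the
% miles driven in period t; neither is a figure Tesla publishes.
\[
  p_t \;\downarrow \quad \text{(the \emph{rate} of injury per mile keeps falling)}
  \qquad\text{while}\qquad
  \mathbb{E}[N] = \sum_t p_t\, m_t \;\uparrow \quad
  \text{(the expected \emph{count} of injuries still grows as miles accumulate).}
\]
```

So both commenters can be right: the per-mile probability drops while the cumulative likelihood of another injury rises with fleet mileage.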

Fatalities are just a learning tool for the Autopilot mothership’s AI.

Unfortunately I have to agree with multiple criticisms of Tesla and their ‘Auto Pilot’ technology here.

1) Don’t call it ‘Auto Pilot’ when it is no such thing: the user instructions call for the driver to stay ready to pilot the car at any moment so it is in no way ‘auto pilot’ in the common understanding of the term.

2) To say that neither the driver nor the car noticed the ‘white truck’ against the ‘bright sky’ is pretty ridiculous. Any driver paying attention WILL notice a tractor trailer crossing the road in front of them.

I can’t believe it’s the same guy that posted on YouTube where the S avoids the crash.

http://www.dailymail.co.uk/news/article-3545488/Tesla-s-autopilot-saves-driver-high-speed-crash-quickly-swerving-way.html

What a tragedy.

If this is the same guy that posted the previous video, it is clear, at least to me, that this person used Autopilot for almost everything. It does seem like a safe technology, but I really don’t think we are fully autonomous yet.

I guess if you use this all the time and just stop paying attention, this might just happen to others. Be careful out there.

So, it shows the “incomplete” system is useless.

Because you have to make a decision to either trust it or not trust it.

When you don’t, it is fine. Or not fine when it is too late to brake, as in the case of that lady in LA who rear-ended the car in front of her: she braked herself, which automatically disengaged the automatic braking system.

When you do, it is mostly fine, until a situation like this one comes up and you pay for it with your life, or the case in the EU where the car ran into a parked car on the road.

I would rather not make that last-second decision and just live with a Level 1 system rather than use an uncertain Level 2 system.

Or I will wait for a full Level 4 system.

Preach it brother!

For one, besides improving Tesla’s detection, I think trailers should have mandated reflectors, like those already required by many states, but something specifically designed to reflect lidar and say “I am here, I am big,” no matter what color it is painted.
So a combination of integrated, complementary safety measures should be the focus as the entire driving fleet worldwide becomes more autonomous.

Also condolences to the family and friends of the deceased.

While it might indeed be a good idea to require trailers on tractor-trailer rigs operating on public roads to have lidar reflectors, it wouldn’t have helped here, because Tesla cars don’t use lidar in their sensor suite.

What I think is needed are sensors on the roof of the car, not just on the bumpers and sides.

The sensors detected the semi but thought it was one of those overhead signs.

Agreed – the primary camera is near the roofline, by the rearview mirror. I’m quite sure the camera “saw” the truck; it’s a fairly straight road, and a truck trailer is rather large. The real question is whether the contrast and the camera’s white levels were such that it couldn’t see the “edges” of the object.

Or did it “detect” the object but, since it was stationary and had space below it, decide it was an “overhead sign”? Given this happened 6 weeks ago, I’m sure Tesla has reviewed the logs and knows the answers to both of those.
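For illustration only, here is a toy sketch of the failure mode being hypothesized in these comments; the thresholds, field names, and the classify_return function are invented for the example and are not Tesla’s actual logic:

```python
# Purely illustrative sketch of the hypothesized failure mode: a return that is
# stationary and has free space beneath it gets treated like an overhead sign
# or bridge, so no braking is triggered. All names and thresholds are made up.
from dataclasses import dataclass

@dataclass
class Detection:
    ground_clearance_m: float   # free space measured under the object's lower edge
    is_moving: bool             # relative to the road, as the sensor sees it

def classify_return(d: Detection, roof_clearance_m: float = 1.6) -> str:
    """Toy classifier: stationary objects with apparent clearance underneath are
    labeled 'overhead_structure' and ignored for braking; everything else brakes."""
    if not d.is_moving and d.ground_clearance_m > 0.7 * roof_clearance_m:
        return "overhead_structure"   # no braking triggered
    return "obstacle"                 # braking / warning path

# A crossing trailer seen broadside: roughly stationary across the lane,
# with ~1.2 m of open space underneath -> misclassified by this toy rule.
print(classify_return(Detection(ground_clearance_m=1.2, is_moving=False)))
```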

That all said, at 3:40 in the afternoon, in good weather, on a straight road, the driver shouldn’t have missed a truck across the road.

Clearly there was some driver error there and “over-trust” in the Autopilot system, as well as “room for improvement” on the Tesla SW side of things.

I’m very sorry for the family.

So far it sounds like the truck is the wrong height and wasn’t visible to the driver or the Autopilot assist. I’d say the truck needs to be redesigned so both could see and avoid it.

So I just checked the Tesla website, and it doesn’t say “beta” for Autopilot. As a consumer, it is not clear that it is a beta feature.

It is specifically the AutoSteer function which is (or was) labeled “Beta” on the car’s onscreen display. Not AutoPilot in general.

http://www.topgear.com/sites/default/files/styles/fit_1960x1102/public/images/news-article/carousel/2015/10/c255dc8aa03219aba456044acb7b02b0/p1080158.jpg

Tesla needs to back off from the PR gimmicks and concentrate on accident avoidance, which is what these cars are really about. People really believe the cars drive themselves, and that’s simply not true and is confusing to the common driver… they don’t know when to trust the car and when to take control… the cars are self-driving only under certain conditions.

That accident reminds me of one I had in the ’80s. Exactly the same scenario, but at a lower speed, 50 mph. My coworker was following me as we had just finished working. A lady, coming from the opposite direction, turned left right between us, clipping the back of my car. I went sideways, hit a patch of ice, and when the tires of my car made contact with the paved road again, my car went up in the air, rolled 3 times, and landed on its side. Apparently, I got out of the car and started explaining what happened, but I don’t remember that part. All I remember is I woke up in the hospital next to my coworker, who explained to me what happened. When the car turning left clipped mine, it stopped, and my coworker rammed her so hard his steering wheel had both sides bent and his right fist rammed the radio and dented it! There was no way we could have avoided that accident, no way! And that was just at 50 mph; imagine 70 mph? As a 20-year bus driver, I’ve seen a lot of accidents, and we were always told there are avoidable accidents and unavoidable…

My sympathies to the family and friends of the man involved in this tragic crash.

I would agree with those calling to rename AutoPilot to something that indicates that it is not to be left unattended. DriveAssist, etc. Make it clear that it’s a tool to assist, not automate, the pilot. Marketing is getting in the way of safety here.

I’m wondering if this statistic of a fatality every 92,000,000 miles is a fair comparison. What I’d like to know is, who’s a better driver: someone that’s sober, buckled up, and paying attention, or AutoPilot? I would imagine it at least doubles to 180,000,000 miles per fatality. Is it fair to compare AutoPilot to drunk, unbuckled, and distracted driver statistics? We’re going to need more miles under DriveAssist to gauge that anyway.

I have criticised Tesla before about treating their customers as guinea pigs, but every time you criticise Tesla on this forum you get called a Tesla hater, troll, short seller, and a shill for big oil!

I don’t think you can compare the fatality rate of autopilot with manual driving. Not only because of the sample size of one but also because the autopilot is only engaged in the easiest of driving situations. Manual driving is “engaged” in all situations. If autopilot were active at all times it would likely have caused a lot more accidents.
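One way to see how little a single data point settles: a rough sketch using the 130-million-mile figure quoted in the article, with the exact Poisson interval for one observed event (the method is standard; applying it to this comparison is purely illustrative):

```python
# Rough sketch: how uncertain is "1 fatality per 130 million miles"?
# Exact Poisson 95% interval for a single observed event (chi-square method).
from scipy.stats import chi2

autopilot_miles = 130e6   # figure quoted by Tesla in the article
fatalities = 1

lower = chi2.ppf(0.025, 2 * fatalities) / 2        # ~0.025 expected events
upper = chi2.ppf(0.975, 2 * (fatalities + 1)) / 2  # ~5.57 expected events

print(f"point estimate: 1 per {autopilot_miles / fatalities:,.0f} miles")
print(f"95% interval:   1 per {autopilot_miles / upper:,.0f} "
      f"to 1 per {autopilot_miles / lower:,.0f} miles")
# The interval spans roughly 1-per-23M to 1-per-5B miles, which is why a single
# event cannot settle whether Autopilot beats an attentive, sober driver.
```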

Every fatality is one too many, therefore my thoughts are with the family and friends.
After all that has been said and argued above, let me add some remarks:
Most US trucks and trailers are technically outdated compared to European vehicles. We had the same fatal incidents here in Europe decades ago, when fast-driving sports cars passed under trucks or trailers. Nowadays, a truck or trailer will not be type-certified unless it has barriers between the wheels and at the rear to prevent passing under from any side.
Next, I wish some other official body would investigate Smith & Wesson (and others) in each of the 30,000 US fatalities per year from gun accidents with the same meticulous rigour. As long as it is easier to purchase a gun than a harmless plastic box called a cooler, this will go on. As I said, every fatality is one too many.

Good point, Dr. Steelgood.
Most trucks outside the USA have the safety features that you talk about.

From the info I read here, I wouldn’t put the A/P issue first… As someone who doesn’t live in the US, I find it quite strange and dangerous that a highway where you drive over 100 km/h can be intersected like that. Plus, if the truck was crossing or entering, surely the truck driver did not have right of way. So in this part of the world, it’s called homicide. I know some will reply that it takes forever for a truck to get onto a highway… that’s why on-ramps exist.

About A/P, I’ve tested it myself and understand that it’s beta and that you need to focus all the time. It makes driving much less tiring, for sure. But I can see how most people will assume they have nothing to worry about when it’s engaged. So my suggestion is to make it a feature you can opt into if you wish, i.e. register for public beta testing. Otherwise, leave most drivers out of it, so they get regular updates but no A/P features until it’s out of beta. I think the real-world feedback and testing of A/P has great value, so an all-or-nothing approach is not very helpful.

I don’t think it is good to associate autonomous driving with electric cars, because they are simply two completely separate things.

For instance, I want an electric car very much, but I do not want a self-driving car.
Self-driving will always bring up the question of what the driver would have done.

I am all for improved safety features like ABS and automatic braking, but that’s it for me.


Actually, each and every vehicle should constantly be sending a signal (maybe a different one from each corner of the vehicle) about its position (possibly relative to other cars and not GPS-dependent), and all other cars around should receive the signal and be able to calculate their relative positions to each other. Similarly, traffic signs should constantly transmit their position and ‘meaning’ to the vehicles around them, so that if they are e.g. covered by snow or dust, or hidden behind leaves, they would still be correctly recognized. This would be a great backup to GPS/visual/radar-based detection systems. Hope an entrepreneurial spirit picks up the idea and develops the standards.
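As a sketch of what such a beacon might carry, with invented field names, units, and a JSON encoding (real vehicle-to-vehicle work, e.g. the SAE J2735 Basic Safety Message, defines its own formats):

```python
# Illustrative only: a toy "here I am" beacon of the kind described above.
# Field names, units, and the encoding are assumptions made for this sketch.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class PositionBeacon:
    vehicle_id: str
    corner: str          # which corner of the vehicle this transmitter sits on
    rel_x_m: float       # position relative to a shared local reference, not GPS
    rel_y_m: float
    speed_mps: float
    heading_deg: float
    timestamp: float

def encode(beacon: PositionBeacon) -> bytes:
    """Serialize the beacon for broadcast; the radio/transport layer is out of scope."""
    return json.dumps(asdict(beacon)).encode("utf-8")

# Example: a trailer corner announcing itself so surrounding cars can place it.
msg = encode(PositionBeacon("trailer-042", "rear-left",
                            rel_x_m=3.1, rel_y_m=-12.4,
                            speed_mps=0.0, heading_deg=90.0,
                            timestamp=time.time()))
print(msg)
```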

Deepest sympathy to the family; sadly, sometimes even a very good man or an intelligent machine cannot predict everything.

Tesla should forthwith change the name of this feature from Autopilot to Driving Assist Mode. As a judge, I would say that the name Autopilot is misleading and suggests to the driver that there is a high degree of autonomy. This name could lead to lawsuits. A milder label could be kept until the introduction of fully autonomous driving.

As a judge (and I’m not asking you to litigate this, but you did weigh in, so…): where is the placement of blame with regard to the driver of the Tesla, and perhaps the truck too? I’m not a Tesla fanboy, not looking to purchase a Tesla ($$$), don’t own stock, nor work for the company, but it seems our legal system is quick to investigate and, in a lot of cases, assess blame either partially or in full on companies. The perception is that you don’t hurt a company with awards to the poor fellow who got hurt; companies are rich. Tesla has been very forward with what this technology can and cannot do. They have stated this IS NOT a fully autonomous system, regardless of its name, and that the driver MUST remain engaged and aware at all times and be prepared to override the system should the need arise. The system even prompts you to do so. A few people on this site are willing to blame the manufacturer, and in particular take issue with the name (Autopilot) and claim or imply that people believe this to be a fully autonomous system…

While the death is unfortunate, and as a statistic it is insignificant, it was only a matter of time.

The name of the system, the way Tesla is advertising it, and the fact that the feature itself has not been regulated until this very moment have been asking for trouble.

Hopefully, all the parties will get off their asses and do something about this status quo.

Why does it need to be “regulated”? One death in 130,000,000 miles is better than human drivers. Why does everyone beg for Big Brother to protect them from THEMSELVES?

People need to take PERSONAL RESPONSIBILITY for their actions. The system is very clearly beta, it’s not enabled by default, and enabling it warns you not to take your hands off the wheel. If you enable a feature that is beta, you are taking a risk.

That’s not to say that we can’t learn lessons here, but a big brother approach is not an acceptable solution, IMHO.

I’ve always felt that a country where you sue the manufacturer of your car when you hit the throttle instead of the brake pedal should not be your first choice for testing such advanced features.

I despise these American ‘highways’ where people are doing speeds up to 80 mph (freeway speeds) and there’s some kind of crossing or joining road. Any road with speeds that high needs to be segregated, with dedicated slip roads being the only means of access.

I still want my Model 3… with or without Autopilot, I don’t care.

If I were the government investigator, I would look at the number of occurrences where the driver had to make a manual evasive corrective action in the 130 million miles of Autopilot driving. That’s the true measure of how good the Autopilot system is. You could directly compare that to the rate under manual-only control where the driver had to make an evasive corrective action. If the rate of corrections is higher for the Autopilot system, then it is not safe. If comparable or better, then it is acceptably safe.
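A back-of-the-envelope sketch of that comparison; the correction counts and mileages below are made-up placeholders, not real data:

```python
# Hypothetical comparison of "evasive corrective actions per million miles"
# under Autopilot vs. manual driving. All numbers below are placeholders.
def corrections_per_million_miles(corrections: int, miles: float) -> float:
    return corrections / (miles / 1e6)

autopilot_rate = corrections_per_million_miles(corrections=2_600, miles=130e6)
manual_rate    = corrections_per_million_miles(corrections=4_000, miles=130e6)

verdict = ("acceptably safe" if autopilot_rate <= manual_rate
           else "not safe by this metric")
print(f"Autopilot: {autopilot_rate:.1f} corrections per million miles")
print(f"Manual:    {manual_rate:.1f} corrections per million miles")
print(f"Verdict under this test: {verdict}")
```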

Not sure if this is confirmed, but there is also the report of watching a movie (from the truck driver involved, so consider the source).

But – http://abcnews.go.com/Technology/wireStory/driving-car-driver-died-crash-florida-40260566

The truck driver claims to have run over to the Tesla after the accident, and a Harry Potter movie was still playing, but he couldn’t see it and only heard it. In the Tesla driver’s previous YouTube video, where he was almost side-swiped by a truck, the Tesla driver was listening to an audio book. Perhaps he was also listening to a Harry Potter audio book during the fatal crash, and not watching an actual Harry Potter movie on the center screen.

The truck driver also claims the Tesla was driving fast.

From the news article:
“Frank Baressi, 62, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was “playing Harry Potter on the TV screen” at the time of the crash and driving so quickly that ‘he went so fast through my trailer I didn’t see him.'”

“‘It was still playing when he died and snapped a telephone pole a quarter mile down the road,’ Baressi told The Associated Press in an interview from his home in Palm Harbor, Florida. He acknowledged he couldn’t see the movie, only heard it.”

I love my cruise control, and I can’t wait for my Model 3 with Autopilot; yet I have almost 300k miles on my car, and I can tell you that driving on the highway while listening to audio books can zone out a driver BIG TIME. Add Autopilot to this, and the chances of an accident go up sky high, IMHO.

Also, if you have ever listened to a HARRY POTTER audio book, they sound just like a movie because of all the character voices and such. Those audio books in particular can SUCK YOU IN!

I expect part of the issue is that in the USA, tractor trailers still do not have the side impact bars that have been standard in the EU for 20+ years. This means an autonomous system has to determine that the ‘bridge’ in front of it is too small to fit under. The lack of these bars kills many drivers and passengers each year. It’s time for the NHTSA to step up and require better safety standards on trailers (as well as copilot software).

Trucking lobby is too strong in the US. It would cost them money to retrofit Semis and it is cheaper to hire an army of lobbyists and buy politicians.

The truck driver killed the oncoming motorist. This probably happens every day somewhere in the USA. Because it’s not a Tesla it doesn’t get reported… but the cause of death is the same – the truck crossing oncoming cars when it shouldn’t have been.

None of this solves the riddle of why the Tesla driver never pressed his brake pedal. He drove headlong into the side of the truck.

Yes, hopefully the NHTSA will extend the safety requirements on trucks to include those side panels, just as they are required in other places in the world. They reduce pollution via reduced fuel usage, as well as helping cars detect the truck.

INSIDEEVS.COM, PLEASE STOP USING THE WORD “INVESTIGATION.” IT IS NOT AN INVESTIGATION. It is a preliminary evaluation… stage 1 on the NHTSA’s scale of escalation. Stage 3 is what they call an investigation. Even Tesla’s blog, WHICH YOU QUOTE, calls it an evaluation. You are a car blog… not USA Today or Omni magazine. Don’t make these generalisations. We look to you for fair reporting. You can still change your own headlines.

“I’m not pregnant, I’m ‘gestating’!”

I think the driver of the truck, who failed to yield to oncoming traffic while making a left turn, is someone they will look at closely.

Also, in the realm of interesting timing: the next major release of Autopilot, v8.0 (with additional ramp exit features), is just starting to roll out.

http://electrek.co/2016/06/30/tesla-8-0-update-new-autopilot-features-ui-refresh-more-model-s-x/

From what I read, the problem was that the truck was stationary and had open space underneath it, and Autopilot basically perceived it as an overhead sign or a bridge. Clearly a design weakness.

But given this was at 3:40 in the afternoon, on a clear day, with good weather and a fairly straight road, the driver should have seen the truck and at least applied the brakes. Clearly, the driver wasn’t paying attention to the environment, was distracted, etc.

So I put this one down to a SW/sensor design weakness (which I’m sure they will improve upon), as well as the driver having too much “blind trust” in the Autopilot.

It’s really saddening anyone should be unfortunate enough to lose their life when driving. 🙁

But looking over this article, it’s suggested this is the same guy who recently claimed Autopilot had already saved his life. However, looking at that video, the camera is, in my view, obscuring his vision (why it is placed on the left edge of his line of sight and not up behind the rear-view mirror, I have no idea). Also, I clearly saw that truck pull into the lane next to him, sit there for a second, and then pull over some more. He didn’t see this; Autopilot did. I wonder how much of a distraction his camera was (if it was still on his windscreen daily) and how alert he was with Autopilot engaged. Anything that allows you to relax when driving is potentially something that stops you being 100% alert and concentrating, and this is a bad thing. Ask any F1 driver: the last thing they want is a quiet, easy race, because it leads to concentration lapses.

It will probably never be released, but we know this car had a dashcam. I want to see it.

Do you really want to see a Tesla driver being beheaded by a steel beam cutting the top of his car?
Those images, if they exist, should not be released, and should be erased after the investigation.

Perhaps Tesla, or any auto manufacturer, could integrate a similar product, such as the Furious S8, an 8-camera dash cam (shown below); then they could have figured out what went wrong (the culprit). There needs to be a way to capture/record the surroundings, both to ensure whatever new technology is safe and to establish whether an incident was caused by human error or by the technology. Condolences to the family, and RIP Joshua!

https://youtu.be/dTTP5SKc1Fk

https://youtu.be/b9K6HmCb3Bg

https://youtu.be/JGkQzWfbW3s

Looks like some “government overreach” is in order to implement side guards on tractor trailers, like the Europeans have, so as to prevent ‘submarining’ like this car did. It could likely save a lot of lives.

Not blaming Tesla here but this is a pretty freak accident:

The roof is sliced off. I don’t think the airbags even deployed. I wonder if the guy was decapitated.

Tesla kind of implies the airbags did not deploy with this statement:

“Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system (read airbags) would likely have prevented serious injury as it has in numerous other similar incidents.”

Looks like it was a clean slice and there wasn’t enough deceleration to make the air bags go off.

Tesla is off the hook on this one IMO.

….but they need to fix their LIDAR to read a little higher