IIHS: Fatal Tesla Crash Proves Partial Automation Is Risky

AUG 8 2018 BY DOMENICK YONEY

Autopilot isn’t perfect.

David Aylor has a job with some pretty interesting perks. Recently, the Insurance Institute for Highway Safety (IIHS) manager of active safety testing has been putting miles on a Tesla Model S in an attempt to gauge the effectiveness of its Autopilot advanced driver-assistance system (ADAS). He’s found that the system has some flaws, though if you’ve been following coverage of the handful of high-profile crashes involving Tesla vehicles operating on Autopilot, you likely already knew that.

In particular, Aylor points to the system’s occasional failure to properly handle road splits. There can be confusion about which lane lines to follow, and if drivers aren’t paying attention, that can lead to a collision. This appears to have been the case in one of the most widely reported incidents: the crash that claimed the life of Walter Huang, which is outlined in a recent report about autonomous vehicles. Huang’s Model X, with no hands detected on the wheel, appears to have steered itself into a highway divider on Highway 101 in Mountain View, California.

Aylor also brings up a similar incident, filmed by a driver in Chicago who was testing for this very situation not long after the Huang crash. In that case, captured in the video embedded below, the car doesn’t seem to know which set of lines to follow, and the driver has to intervene, braking just in front of the gore point.

These incidents, IIHS says, are evidence of the risk that partially automated systems can pose. Although the institute found that Tesla Autopilot can reduce injuries and damage claims, it is not a perfect system, and drivers need to be alert and ready to take control when it runs into trouble. And the problem isn’t limited to Tesla.

The report notes that vehicles from other automakers equipped with Level 2 ADAS have also been involved in crashes. Those occurrences, however, haven’t made headlines like the ones involving the Silicon Valley company. For whatever reason, none of those incidents made it into this particular report either.

The report doesn’t offer much in the way of analysis, but the reason for the danger seems clear. Drivers, used to a system that works perfectly a very high percentage of the time, can be caught off guard when it suddenly runs into difficulty. Tesla has addressed this by making the system give more frequent reminders when it detects that a driver’s hands aren’t on the wheel.

While Tesla owners using Autopilot will always have to pay attention, the system is getting better. As more of the improvements that will eventually become part of its “full self-driving” (FSD) feature are rolled out, it remains imperative, perhaps even more so, that drivers keep their eyes on the road and hands on the wheel. It seems reasonable to assume that as situations requiring driver intervention become rarer, drivers will be less prepared for them.

Source: IIHS

Categories: Crashed EVs, Tesla

109 Comments on "IIHS: Fatal Tesla Crash Proves Partial Automation Is Risky"

BoltEV (was SparkEV)

Partial automation at 55 MPH is risky. Partial automation at 12 MPH (most of Los Angeles freeway speed limit) is very safe. Have the system disengage as soon as speed picks up, and it’d be fine, allowing me to read IEV instead of staring at “moron on board” sticker.

bro1999

Slow and steady wins the (autonomous) race. That’s why GM/Cruise is going with low speed AVs in urban environments first. Gotta learn to walk before you run like Usain BOLT. 😀

BoltEV (was SparkEV)

Urban is harder than 55 MPH on freeway. Have “LA freeway” mode, because that is the easiest; no pedestrians, no bicycles, no nothing except for car in front, yet speed is only 12 MPH.

MDEV

Driving in DC at 12 mph is way dangerous with crazy bikers, for-share electric scooters that don’t follow traffic lights or stop signs, and drivers who are insane too. So I take the freeway at 55 mph.

MxHRH

Don’t forget the stopped FedEx vans on K St and buses that seem to be driven by the blind — they treat lanes like mere suggestions 🤦🏾‍♂️

Doggydogworld

Urban is harder, but mistakes are generally less deadly.

Except to non-GM customers, aka pedestrians and cyclists, of course 🙂

MoMac

Nice try, but it was an UBER self-driving test car (with safety driver) that hit and killed a pedestrian walking a bicycle across the road.

I haven’t heard of GM Cruise hitting pedestrians or cyclists.

Kbm3

Just love unsupported opinions.

Let me try.

Going fast is good in war, but bad in love. You are smart to play tennis backwards.

Gee this is fun.

There’s no accountability and I can say whatever pops in my head.

Mark.ca

Bahahaha!
Good one. It’s true, the speed limit on our freeways is “regulated”.

Texas Leaf

You opted to buy a car without an AV system so I have to assume you will say anything to justify your decision.

BoltEV (was SparkEV)

Which car allows taking your hands off the wheel and getting in the back seat to read IEV? Heck, which car allows hands-off at all (legally)? Don’t confuse your primitive Nissan “drunk driver assist” with what I’m talking about.

MoMac

Cadillac CT6 with Super Cruise is the only car that one can buy for personal use (that I know of) that allows hands-free usage. But eyes have to be watching the road ahead since that is what the system looks for to ensure safety.

BoltEV (was SparkEV)

Is that legal? Regardless, having to keep my eyes on the road defeats the purpose, since I still have to stare at the stupid bumper stickers instead of reading or typing (aka software engineering).

Scott Franco

Disagree. I don’t need a speed limiter, nor artificial engine sounds, nor training wheels. If you need them, you get them. Make speed restriction an option, and YOU can turn it on if you like.

Asak

Have fun beta testing with your life.

Michael Will

Partial automation up to 90 mph is less risky with Tesla’s Autopilot than not using automation at all. I have driven AP1 since 2016 and AP2 since 2018, and it is certainly safer with it than without. It’s a driver-assist feature, so you delegate the micromanaging of distance-keeping and lane-keeping, and when you tell it to by setting the turn signal, it changes lanes for you. You are still fully in charge of what is happening, looking farther ahead and farther back than you could before in order to understand and project what the appropriate action is. I would never buy a car without it again.

Bill Howland

You wouldn’t think it would be the case, but apparently much of the problem is the name ‘AUTOPILOT’ itself.

I’m sure Tesla owners like to think of themselves as “smarter than the average bear” (aka Yogi), but these otherwise intelligent people (smart enough to afford the down payment on one, at least) end up in deadly collisions because they must have thought the system is more of an actual autopilot than it is.

I don’t see much point in any system that requires constant attention and also control of the steering wheel. To me, it is harder to drive than simply doing it the old-fashioned way, since it would be more of a surprise to me if I suddenly HAD to take control without notice. At least there are no additional safety surprises FROM THE CAR when driving yourself. In that sense, it is obvious (to me at least), that such systems are in fact more dangerous than simply driving yourself.

Now systems that jostle you a bit if you are day dreaming or falling asleep – those systems obviously increase safety.

Mark.ca

There are uses for systems like AP. LA traffic is horrible, and AP does very well in it. It’s at high speeds that I would feel uncomfortable using it. Tesla made the mistake of not stepping in sooner to make clear what AP is for. Having the info in the manual is not enough when everyone on the web and TV talks about it as if it were a real self-driving system. Keep the name, just make sure people understand it’s not to be taken literally.

Asak

I think Tesla is speaking out of both sides of its mouth and fully intends the public to believe its partial self-driving is more capable than it is. Even current Teslas are being touted as eventually capable of full autonomy. The feature is overpromoted, with what effectively amounts to an asterisk in the manual.

MDEV

If you think that a name is a reason not to follow what the car’s manual advises you to do, you are too impaired to drive.

Bill Howland

I’m not impaired. But the people who spent, what, $5,000 for this system and died: I guess you are calling them impaired.

Robert Weekley

The average cost to learn to fly, for a beginner’s private pilot’s licence allowing you to take a passenger, can exceed $10,000 today!

Autopilot training would likely not be included! For that you would need advanced training, like an instrument rating or a commercial pilot’s licence!

The number of such trained and current pilots in the USA is likely less than the number of Teslas sold there! Hence, I conclude that most Tesla buyers are not qualified to have a personal understanding of what an aircraft autopilot’s limitations are; they may be good at making money, but life-and-death decision making is not their daily habit.

So, Bill, “impaired” might be a strong choice of words, but since their understanding is limited, it might not be wrong.

Bill Howland

That’s a Red Herring.

No one is suggesting the $5000 option allows you to become SULLY.

Asak

Who the hell even reads the manual? Seriously?

Bill Howland

That is just Stupid. Or perhaps people were smarter in 1958. They also had “Electric Eyes”, but no one in 1958 would be dumb enough to give an ‘electric eye’ an EYE CHART TEST, since it was obvious the only thing it could do was sense a beam of light being broken to open an automatic door – such as the safety on a modern day garage door opener that re-opens the door should the ‘electric eye’ ‘see’ its beam of light being crossed by an obstruction or small child.

The Chrysler cruise control thing, and other gadgets such as the Autronic Eye and whatever else they had, usually just dimmed the headlights for an oncoming car.

This is far different with Tesla buyers, who think the car has “super advanced technology” (not that much of an advance, in reality): when ‘plain-speaking’ Tesla says it has an “Autopilot” and charges a HUGE amount of CA$H for it, some buyers really think that is what they are buying.

The fact that I would NOT have made that mistake does not take away from the fact that people died believing it.

antrik

Nah, the known cases of people dying on Autopilot seem to be documented as people actually being well aware that it’s not self-driving — but becoming too complacent in spite of that. Different naming wouldn’t have helped with that.

BTW, calling cruise control “autopilot” is actually an apt comparison — actual autopilot on an aircraft doesn’t really do much more. (Not even as much as traffic-aware cruise control; much less autosteer.) Of course that’s not very helpful, since the vast majority of the population doesn’t know that…

Bill Howland

That’s total nonsense. If people didn’t believe in the system, they wouldn’t take their eyes off the road.
People get used to driving ‘non-AV’ cars all the time, and only the idiot kids texting are the ones lately who get complacent, but then they’re too dumb to realize how much damage such idiocy can cause, like the dumb woman who plowed into the back of my Bolt EV without even touching the brakes back in early June, while texting on her phone.

We’ll see when she gets the new insurance premium, if she’s still allowed to drive, or, more to the point, whether she can still afford to drive anything.

antrik

Tesla claims that according to their statistics, despite the problem with people becoming too complacent, Autopilot still improves safety on the whole.

Asak

Tesla claims a lot of things. Like any other company you should take their claims with a serious grain of salt.

Roy_H

Autopilot is still a work in progress, and Elon keeps telling us it will be perfected six months from now. So far they have done a remarkable job of being able to follow the car in front, read speed limit and stop signs, and warn against or prevent turning into the adjacent lane if there is a car there.

They have not solved the problem of detecting stationary objects in the road ahead. Teslas have run full speed into a parked fire truck, a lamp post, semi-trailers, and a cement road divider, with no attempt to slow down at all before the collision. Tesla has emphasized that the driver must remain alert and able to take over in an instant. I think the best way to ensure the driver is alert is to not let him use Autopilot.

Tesla uses an array of cameras, sonar, and front radar to detect objects around the car. Based on the above accidents, it appears there is heavy reliance on the radar and little or no reliance on the cameras to determine distance to objects ahead. Radar works best when its signal can be reflected back by a metal surface. It does not work well on humans, wood, plastic, etc. It does…
Loboc

All radar-controlled cruise control systems PURPOSELY ignore stationary objects. This is to avoid the many false positives (i.e., panic stops) when the system detects something like a wall on a curved piece of road or a beer can in the street. IOW, it’s not that the car cannot detect a stationary object; it’s the software ignoring it.
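
As a rough illustration of the behavior Loboc describes (this is not any manufacturer’s real code; the thresholds, names, and structure are invented for the sketch), a radar return whose estimated over-the-ground speed is near zero can simply be dropped from the cruise-control target list once the car itself is above a highway-ish speed:

```python
# Illustrative sketch only: how a radar-based ACC might filter out stationary
# returns at highway speed. Not any manufacturer's actual logic; all names and
# thresholds here are invented.

from dataclasses import dataclass

@dataclass
class RadarTarget:
    range_m: float             # distance to the detected object, in meters
    relative_speed_mps: float  # target speed relative to our car (negative = closing)

def ground_speed(target: RadarTarget, ego_speed_mps: float) -> float:
    """Estimate the target's own speed over the ground.

    Radar measures speed relative to our car, so a stationary object shows up
    with a relative speed of roughly -ego_speed.
    """
    return ego_speed_mps + target.relative_speed_mps

def acc_candidates(targets, ego_speed_mps,
                   stationary_cutoff_mps=1.0,
                   highway_threshold_mps=20.0):
    """Return the targets this hypothetical ACC is allowed to react to.

    Below highway_threshold_mps everything is considered; above it,
    near-stationary returns are dropped to suppress false positives from
    overhead signs, parked cars, and roadside clutter -- the behavior
    described above.
    """
    if ego_speed_mps < highway_threshold_mps:
        return list(targets)
    return [t for t in targets
            if abs(ground_speed(t, ego_speed_mps)) > stationary_cutoff_mps]

# At 30 m/s (~67 mph), a parked truck (relative speed -30 m/s) is filtered out,
# while a car ahead doing 25 m/s (relative speed -5 m/s) is kept.
targets = [RadarTarget(80.0, -30.0), RadarTarget(60.0, -5.0)]
print(acc_candidates(targets, ego_speed_mps=30.0))
```

The point of the sketch is simply that the limitation is a deliberate software trade-off against false positives, not an inability of the sensor to see the object.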

BoltEV (was SparkEV)

Are you sure about this? If so, Tesla is using pretty crappy (aka, basic) radar. I expected better.

Pushmi-Pullyu

This is precisely the problem: That people expect Tesla’s cars to be better than other cars at detecting stationary obstacles when moving at highway speed.

They’re not. They’re also not worse.

BoltEV (was SparkEV)

There are better radars with voxel resolution approaching 1 cm. I’m surprised everyone’s still on basic stuff. If I weren’t always stuck in damn traffic, maybe I could produce something useful in this regard.

antrik

AIUI the problem is not resolution — it’s that radar provides a one-dimensional view. Having a more precise picture of how far the object spans horizontally, doesn’t really help in determining whether it’s a real obstacle.

BoltEV (was SparkEV)

You’re talking about voxel resolution that’s essentially infinitely large (i.e., basic, what I call crappy). It’s hard to believe everyone’s still using this 1930s technology.

nix

The problem is in having the compute power to process higher resolution. Tesla just announced a new hardware chip for this task with a 10X improvement over previous hardware.

Scott Franco

BS. I tried it. I chose a vacant street with one parked car and a 35-mile-an-hour limit, and aimed at the parked car, the idea being that if the car didn’t slow down, I could easily pull left and avoid it.

Result: the M3 slowed and stopped.

Please stop repeating this rumor.

Loboc

That’s automatic emergency braking, not radar cruise control. AEB only works at lower speeds (like 35 mph or less; it depends on the manufacturer). ACC/TACC works at highway speeds and only detects cars moving in the same direction. Both ignore stationary objects at higher speeds.

Scott Franco

Wrong again. This was with EAP engaged. I am not STUPID; I know the difference.

Pushmi-Pullyu

It’s not a rumor. Cars with ABS can and do detect stationary obstacles when moving at low speed. But the detection is shut off at higher speed — at highway speed — because of the problem with too many false positives.

Autonomous and semi-autonomous driving is more complex than you realize.

See: “Why emergency braking systems sometimes hit parked cars and lane dividers”

https://arstechnica.com/cars/2018/06/why-emergency-braking-systems-sometimes-hit-parked-cars-and-lane-dividers/

antrik

You are mixing up ABS and AEB, again…

Rafael Sabatini

ABS is “anti lock braking system”.

I’m not sure what that is in Russian, so it may be a language translation error, Pushy-Pugme.

antrik

Actually, I believe the *real* meaning is the German “Anti-Blockier-System” (it’s a German invention) — the English one is just a backronym as far as I can tell 🙂

Asak

Exactly why these systems are not ready for prime time.

Pushmi-Pullyu

“They have not solved the problem of detecting stationary objects in the road ahead. Teslas have run full speed into a parked fire truck, lamp post, semi-trailers, and cement road divider.”

Reality check: No semi-autonomous driving system in any mass produced car even attempts to detect stationary obstacles when driving at highway speed. It’s merely that the mass media ignores accidents involving semi-autonomous systems in cars hitting stationary obstacles, if those cars are not Teslae.

See: “Why emergency braking systems sometimes hit parked cars and lane dividers”

https://arstechnica.com/cars/2018/06/why-emergency-braking-systems-sometimes-hit-parked-cars-and-lane-dividers/

antrik

If binocular vision was so crucial, people lacking it would never be able to obtain a driver’s license…

I can tell you from experience that it does seem to be somewhat of a handicap while playing volleyball for example, but otherwise is rarely even noticeable in everyday life.

Kdawg

Looks like roads need to be painted better, or Autopilot needs to disengage when it’s confused or can’t detect both lines.

Mark.ca

Disengaging is what other systems I tested do when the lines are not well defined. Do you have AP on yours?

Kdawg

I didn’t get Autopilot. If I drove more on the expressways every day, I might have.

Loboc

The nVidia chip set can already drive on a dirt road. Lines are not needed for FSD. Better software is needed.

Musk still claims that FSD is an incremental upgrade from AP. This is not easily done without a full system redesign. Thus GM and others have two separate design efforts: full autonomy and driver assist. It may not be possible to improve driver assist to be fully autonomous. It is probably possible, although overkill, to detune FSD to driver assist.

Scott Franco

Tesla and Mobileye got a divorce. We saw a demo after they joined Intel. Very impressive. Tesla needed to make up lost ground.

Windbourne

Mobileye is a rule-based system. As such, it will never fully handle the edge cases.

Dav8or

The roads don’t need anything. Autonomous driving tech just needs to be better.

Pushmi-Pullyu

For the most part, yes. Autonomous cars need a fully developed SLAM* system, and that apparently hasn’t been developed yet. Even Waymo/Google’s fleet of fully self-driving cars is limited to 25 MPH or less.

It’s not merely that Tesla’s cars need better sensors — lidar or high-res radar — it’s that, like all semi-autonomous cars, they need much better and more sophisticated software, capable of handling real-time detailed scanning of the environment 360° around the car, at both low speeds and freeway speeds.

Looks to me like Elon is being very, very over-optimistic about Tesla being close to that. They’re not months away, they’re years away… just like everybody else.

Where roads will need modification is in construction zones and other areas of similar confusion. They’ll probably come up with some sort of marker, either fixed or portable (like construction cones) which will properly direct self-driving cars to the safe path.

*SLAM stands for Simultaneous Localization And Mapping technology, a process whereby a robot or a device can create a 3D map of its surroundings, and orient itself properly within this map in real time.
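
For readers unfamiliar with the term, here is a deliberately tiny, one-dimensional sketch of the SLAM idea (illustrative only; real automotive SLAM works in 3D, tracks many features, and handles loop closure): a single Kalman filter jointly estimates the vehicle’s drifting position and the position of one landmark it repeatedly measures its distance to.

```python
# Minimal 1-D SLAM toy: a Kalman filter over [robot position, landmark position].
# Purely illustrative of "simultaneous localization and mapping"; not a real
# automotive implementation.

import numpy as np

rng = np.random.default_rng(2)

# True (unknown to the filter) world: robot starts at 0, landmark sits at 10.
true_robot, true_landmark = 0.0, 10.0

# Filter state and covariance: [robot_x, landmark_x].
x = np.array([0.0, 0.0])        # landmark position initially unknown
P = np.diag([0.01, 1000.0])     # huge uncertainty on the landmark

F = np.eye(2)                   # positions persist between steps
H = np.array([[-1.0, 1.0]])     # measurement model: z = landmark_x - robot_x
Q = np.diag([0.05, 0.0])        # motion noise only affects the robot
R = np.array([[0.25]])          # range-measurement noise

for _ in range(20):
    # Motion step: the robot commands a move of +1.0 (with odometry noise).
    u = 1.0
    true_robot += u + rng.normal(0, 0.05 ** 0.5)
    x = x + np.array([u, 0.0])
    P = F @ P @ F.T + Q

    # Measurement step: noisy range to the landmark.
    z = (true_landmark - true_robot) + rng.normal(0, 0.25 ** 0.5)
    y = z - (H @ x)                      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P

print(f"estimated robot x    ~ {x[0]:.2f} (true {true_robot:.2f})")
print(f"estimated landmark x ~ {x[1]:.2f} (true {true_landmark:.2f})")
```

The key property the toy shows is the “simultaneous” part: the vehicle’s pose and the map are estimated together, and the map can never become more certain than the vehicle’s own position.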

Kdawg

I dunno, I see humans get confused by lack of lines on roads. Must be a reason they keep painting them.

Scott Franco

“Looks like roads need to be painted better, or Autopilot needs to disengage when it’s confused or can’t detect both lines.”

Yes, plus we need to cover everything in foam like when you are childproofing a house.

Kdawg

So roads don’t need lines? Guess we wasted a lot of money in that area. Should get rid of those stop signs too, since everyone knows you should just stop right?

Asak

The issue is that the real world is imperfect. A self-driving system has to be able to handle less-than-ideal conditions because it’s going to encounter them. A human can handle a stretch of road with missing or ambiguous lines. If a car can’t do that, it is a step down in ability from a human driver.

MDEV

Agreed, local governments and federal routes have the responsibility of painting the roads with the right markings.

Texas Leaf

My Leaf with ProPilot always gets confused by off-ramps. That’s why I prefer not to drive in the rightmost lane if there is more than one lane. If there is an off-ramp and I’m in the right lane, I always pull the wheel a little to the left until I get past the off-ramp.

Tummy

There is a particular exit near our home with very confusing lines due to construction over the last few years. I’ve seen many people end up in the median. The lines appear to lead directly into the crash guard in the median, like the photo, but you are supposed to stay right to exit the highway correctly. I think some of the responsibility is on the government to correctly maintain the painted lines. Even humans can get confused by lanes that just disappear or are not well marked.

Scott Franco

As frequently suggested, Tesla should name this “driver assist”, not autopilot. In any case the definition is:

a person who operates the flying controls of an aircraft.
synonyms: airman/airwoman, flyer;

Last time I checked, Teslas don’t fly. A distinction without a difference? No; generally airplanes don’t navigate around obstacles (just a few bits of granite carefully placed inside clouds).

The Tesla assist program is relatively conservative and does not attempt or pretend to “do it all”. Using the system for about five minutes should convince any reasonably intelligent person that the system operates correctly when supervised, not when given all trust and faith. Persons believing the latter have an appointment with Professor Charles Darwin.

(⌐■_■) Trollnonymous

The Nissan Leaf doesn’t fly either with its “Pro-Pilot”……….lol

Loboc

My ELR has pretty good ACC, but it still gets confused easily, like when a car enters or exits the lane in front of me. On some occasions in stop-and-go traffic, it will completely lose the ‘lock’ on, or detection of, the car in front of me. So, when you resume, it speeds up like nothing is there!

It only has assisted lane detection, not active lane control, so I don’t need to worry about the car actually crashing into anything. I’ve got that part on me.

You only have to use the system a few weeks to know about the most likely corner cases where the system can’t figure out what to do. These limitations are fully covered in the manual if the driver cares to read and understand it.

Tesla’s system appears to instill more confidence with the driver and thus has more significant issues when the driver is not paying attention.

Scott Franco

Sure, and the point of the article is that the “more confidence” factor is going to just get worse. But the logic of “we oughta get this stopped if it is not perfect” is, to me, the issue. Based on that, we should outlaw cruise control. Stupid people are always going to have issues with tech.

The same thing is going on with airplanes right now: “there is too much tech on airplanes, and pilots are getting distracted.” No, there are pilots who get on top of the tech and train, and there are pilots who are woefully behind it. Even the fact that glass cockpit airplanes are outrageously expensive does not help. Brand new pilots rent them, and in fact some training centers are pushing the idea that they are safer. Darwin always has, and apparently always will, weed out budding Kennedys from the pilot population, much to the shame of the loss of supermodels and organs, since airplane crash victims make notoriously bad organ donors.

Nix

Seriously, at this point with all the coverage that Autopilot has gotten, if people still haven’t figured out what the name means, I don’t think changing the name is going to matter.

If anybody disagrees, please post your name in response to the post and say “I don’t understand that Tesla AutoPilot is a driver assist and not fully autonomous driving”

Windbourne

The ‘drivers’ of tugboats, U-boats, and ships are called pilots.
Calling this AP is not a big deal, and it is silly that people make such a stink over it.

Bill Howland

I guess the fact that people have died because of it is of no importance to you.

antrik

If you know of a case where someone has actually died because of the naming, feel free to share it…

Bill Howland

It’s perfectly obvious: they wouldn’t take their eyes off the road otherwise.

antrik

Yeah, sure. They take their eyes off the road because it’s called “Autopilot”. If it wasn’t called “Autopilot”, they wouldn’t take their eyes off the road. OBVIOUSLY.

nix

What about the people who have died because they DIDN’T have autopilot who would have been saved by autopilot? Are they of “no importance to you”?

Your logic is the logic of people who don’t wear seatbelts because, in some small number of corner cases, a small number of people died because of seatbelts in accidents that were otherwise likely survivable. All because they can’t do the math and figure out that the number of cases where seatbelts save lives greatly outweighs the rare instances where seatbelts contributed to a fatality.

Bill Howland

I’m still waiting for you and your buddy Pushi to tell me where I have ’embarrassed myself but I don’t see it’ on the supercharger issue.

Bill Howland

The “S” and “X” have atrocious safety records compared to other manufacturers’ EVs. If everyone owned expensive 2-year-old Teslas, there would be far more fires and explosions than there are with 2-year-old gasoline-powered cars. So talking about Tesla and safety is rather unbelievable to me lately.

The Roadster was a pretty safe vehicle. Perhaps the ‘3’ will also be, due to a different battery, but time will tell.

Nix

The accurate comparison would be to other similarly powered/performing vehicles, with similar drivers. For example, the guy who stole a Tesla and was doing 100+ mph on a city street while being chased by police when he wrecked and started a fire. That is not typical, yet it is included in the Tesla stats.

This is the difference between the safety of the VEHICLE and the safety of the DRIVER. You are attempting to conflate the two issues and to make it sound as if the VEHICLE has a safety problem where it does not.

This is the same difference between abstinence preventing pregnancy (99+% effective), and couples who say they use abstinence as a METHOD of preventing pregnancy (only 65% effective statistically). Yes, it is the fault of pregnant couples when they rely upon abstinence as their only birth control method, and they get pregnant. Same for drivers who push up accident statistics because of their unsafe driving.

Bill Howland

More blather and deflection of irrelevancies by someone who just likes to hear himself talk, except when asked a pointed question that you, the Superdope, ignore along with pushy Pushi, on silly accusations both you guys have made. Then you both go totally silent.

Scott Franco

Little off topic, but today I had a funny one when on AP in the fast lane. I did an auto lane change right to get out of the way of a car that was tailgating me, then auto lane change back behind him/her. The car got about a foot left, then suddenly changed its mind and went right again. I disconnected it. Never saw what spooked the AP, the left lane was completely clear.

Jeff

I’m not sure how this is “autopilot”. If the driver entered a destination, it should know whether to go left or right at the split.

Pushmi-Pullyu

Correct; it’s exaggeration and hyperbole to call Tesla’s semi-automated lane-following system “Autopilot”.

But then, other companies have:

Nissan/Infiniti ProPilot
Audi Traffic Jam Pilot
Mercedes Drive Pilot
Volvo Pilot Assist

Seems to be a trend, and it’s also a trend to single out Tesla as doing something “bad” or “wrong” or at least “strange” when it is following industry practices. That’s the effect of all the Tesla smear campaigns from short-sellers and Big Oil shills.

Actually, I like the term “Pilot Assist”. Unlike “Autopilot”, it doesn’t suggest a “set it and forget it” system to the average person. The average person does not understand that airplane pilots are trained to keep alert, and “keep watching the skies” even when an autopilot is turned on.

antrik

That’s not what an autopilot does at all. It literally just holds a preset heading and altitude.

Unfortunately, few people realise that — which is what makes the Autopilot naming confusing to the masses…

CCIE

There are aircraft autopilot systems that can navigate and even land aircraft. Things have come a long way since the ’70s wing levelers.

antrik

Are there? Perhaps; but that doesn’t seem to be standard equipment, going by a commercial airliner simulation I have seen not that long ago…

(I would also think they’d use different terminology, to avoid confusion with the original meaning… But that’s just a guess.)

DL

Like I said before (and I am a machine learning person by trade), the biggest problem is not the degree of accuracy of the model, but the fact that humans are trained by the system not to pay attention. No amount of warning or conscious will is going to fix that. Such is the nature of humans, as demonstrated by numerous studies. Show a human heads 100 times (let alone 10,000 times), and he/she will have 100% belief that the 101st try is also going to be heads (even if his/her life literally depends on it, as it does here). But it doesn’t particularly matter whether the system is capable of doing the right thing 10,000 times or 1,000,000 times in a row; there’s only 1 life at stake.

Which means auto-steering has to be either fairly bad (more than 1 mistake in 100 experiments), so it doesn’t train humans into trusting it, or super good (reliably at least an order of magnitude better than an undistracted, fully focused, able-bodied driver; on par with or perhaps better than aviation safety per mile). The worst of the situation is about where it is right now. The comparison Musk makes to human average accident rates also does…
TwoVolts

“The worst of the situation is about where it is right now”

DL, I completely agree with your analysis. I also am not interested in EAP/FSD for the reasons you outline. When FSD is ready and can achieve the promised order-of-magnitude reduction in serious accidents/fatalities, I will be on board.

DL

A bit of a continuation. I finally got to look through this often-cited NHTSA report just now. A few quotes from it:

“An unreasonable risk due to owner abuse that is reasonably foreseeable (i.e., ordinary abuse) may constitute a safety-related defect.”

“Driver misuse in the context of semi-autonomous vehicles is an emerging issue and the agency intends to continue its evaluation and monitoring of this topic.”

“While drivers have a responsibility to read the owner’s manual and comply with all manufacturer instructions and warnings, the reality is that drivers do not always do so. Manufacturers therefore have a responsibility to design with the inattentive driver in mind.”

So what does it tell me? (1) Driver distraction (for whatever reason, including due to building automation trust) is known, and at least some of it is considered to be foreseeable. (2) It is a defect if a foreseeable risk is not adequately planned for. The manufacturer cannot hide behind “the manual told you so.” In other words, if one can successfully argue that the system trains drivers to lose attention regardless of manuals etc., one can claim it as a defect. For example, this study https://reader.elsevier.com/reader/sd/AA205C1FD8A74B3C5931DE089624F7A160F23C90A633C829971379F711CF6DD09DA791AB521CF7CDEDEB75E191583E16 shows that “promoted trust” (e.g., the “AP” name could…
nix

(1) Driver distraction is a well-known safety problem with ALL cars. See the spate of crashing-while-texting accidents. Autopilot actually SAVES the driver and other people on the road when the well-documented and growing problem of distracted driving would otherwise end in an accident, especially a potentially deadly one with airbag deployments. The rare corner case is the exception that proves the rule.

(2) It is entirely foreseeable that drivers will rely upon airbags too much, and not wear seatbelts. (This is a well known phenomenon with a subset of drivers.) Car makers are not responsible for drivers failing to use their seatbelt.

(3) “A safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted. Accordingly, this investigation is closed.”

This is boilerplate language that NHTSA uses to close cases when they didn’t find any problem. You are misconstruing that statement if you think that somehow nullifies the report’s 40% conclusion. Variants on that language appear over and over, for example: https://static.nhtsa.gov/odi/inv/2010/INCLA-PE10020-9640P.pdf

But hey, thanks for finally skimming the report.

DL

You miss the point about significance. 40% without confidence bounds tells me nothing. If I have 7 fatalities one year and 5 fatalities in another, that looks like a sizable decrease. However, there’s still at least a 29% chance that the rate actually increased rather than decreased (let alone decreased by 40% or more), given enough sampling.

And in the same hypothetical example, the chance that the rate decreased by less than 40% (or increased) is, cumulatively, about 61%. That makes the opposite more likely than the face-value claim of “decreased by 40% (or more)”.

But they were set up to prove the opposite (which is why they stated they were not looking to prove enhancements). They were looking for significance of a defect. In the hypothetical example above, they would of course have found there is a 29% chance of a defect. I don’t know their standards w.r.t. defects, but normally it would have to be at least 80% (95% in less crucial investigations). So that’s what they are saying: the collected evidence is inconclusive w.r.t. defect existence (cf. “conclusively no defect”).
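
For what it’s worth, DL’s rough figures can be sanity-checked with a few lines of simulation. This sketch is purely illustrative: the 7-vs-5 counts are DL’s hypothetical, and the flat priors on the underlying rates (and therefore the exact probabilities printed) are assumptions of the example, not anything taken from the NHTSA data.

```python
# Illustrative Monte Carlo: how inconclusive a nominal drop between two small
# counts can be. Counts are hypothetical; the flat priors are an assumption
# of this sketch, not part of the actual NHTSA analysis.

import numpy as np

rng = np.random.default_rng(0)
before, after = 7, 5          # hypothetical yearly fatality counts
n = 1_000_000

# With a flat prior, the posterior for a Poisson rate given count k is Gamma(k+1, 1).
rate_before = rng.gamma(before + 1, 1.0, n)
rate_after = rng.gamma(after + 1, 1.0, n)

p_increase = np.mean(rate_after >= rate_before)
p_not_40 = np.mean(rate_after >= 0.6 * rate_before)  # decreased by <40%, or increased

print(f"P(true rate actually increased)           ~ {p_increase:.2f}")
print(f"P(decrease smaller than 40%, or increase) ~ {p_not_40:.2f}")
```

Under these assumptions the printed probabilities land in roughly the same ballpark as DL’s ~29% and ~61% figures, which is the underlying point: counts this small cannot support a precise headline percentage on their own.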

Pushmi-Pullyu

Sadly, this does not in any way address the issue of relative safety. The correct question should never be “Will this autonomous (or semi-autonomous) driving system make you perfectly safe?”, because hurtling down the highway at 60+ MPH in close proximity to other vehicles moving at similar (or sometimes faster) speeds can never be perfectly safe, because… physics.

The correct question, which sadly this article completely fails to address, is this: Are you safer using Autopilot + AutoSteer, or safer not using it?

According to statistics from the NHTSA, Tesla cars with Autopilot + AutoSteer merely installed — not even necessarily in use — have a 40% lower rate of serious accidents, as measured by airbag deployment. That very clearly implies an even lower accident rate when AutoSteer is engaged! …which is a fact that the Tesla Death Cultists try very hard to confuse and obscure.

(continued…)

Pushmi-Pullyu

(…continued from above)

The lower rate of airbag deployment points very compellingly to a significantly lower rate of serious accidents, and therefore a significantly lower fatality rate.

“The thing to keep in mind is that self-driving cars don’t have to be perfect to change the world. They just have to be better than human beings.” — Deepak Ahuja, CFO of Tesla Inc.

DL

>> According to statistics from the NHTSA, Tesla cars with Autopilot + AutoSteer merely installed — not even necessarily in use — have a 40% lower rate of serious accidents, as measured by airbag deployment. That very clearly implies an even lower accident rate when AutoSteer is engaged

The problem with the statement is that the Tesla sample is likely significantly _biased_ relative to the overall population (or was it relative to Tesla vehicles without AP installed? Not clear to me). As any pollster would know, removing bias from a sample is incredibly hard, and it is one of the easiest ways to “lie with statistics.” (I don’t believe statistics lie in a mathematically ideal situation, but I do believe people relax assumptions improperly way too often to fit their goals.) It is like saying that school dropout rates per race suggest differences in ability to learn, all the while ignoring that race-based samples will also be biased by income and hundreds of other variables relative to the overall population.
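
A toy simulation of the kind of sampling bias DL describes (every number below is invented for illustration only): if a feature happens to be concentrated among lower-risk drivers, a naive comparison between the with-feature and without-feature populations shows a large apparent “safety benefit” even when the feature itself does nothing.

```python
# Toy illustration of sample/selection bias. Every rate and proportion is invented.

import numpy as np

rng = np.random.default_rng(1)

# Two driver groups with different underlying crash rates (per million miles).
low_risk_rate, high_risk_rate = 0.5, 1.5
miles = 1.0  # million miles driven per driver, for simplicity

# Suppose the feature is mostly bought by low-risk drivers and has NO real effect.
n_with, n_without = 10_000, 10_000
with_rates = np.where(rng.random(n_with) < 0.8, low_risk_rate, high_risk_rate)
without_rates = np.where(rng.random(n_without) < 0.2, low_risk_rate, high_risk_rate)

crashes_with = rng.poisson(with_rates * miles).sum()
crashes_without = rng.poisson(without_rates * miles).sum()

reduction = 1 - (crashes_with / n_with) / (crashes_without / n_without)
print(f"apparent crash reduction from the feature: {reduction:.0%}")
# Prints a large apparent reduction even though the feature changed nothing:
# the two samples simply contain different mixes of drivers.
```

This illustrates the cross-population comparison DL is warning about; it says nothing either way about a before/after comparison on the same cars, which nix raises below.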

nix

A significant portion of their sample set was cars that were originally built with Autopilot hardware installed, before Tesla first enabled AutoPilot via OTA. The airbag deployments were based on black box data. There was no “pollster” involved, and Tesla wasn’t the one who analyzed the data; it was NHTSA. Are you saying NHTSA are liars?

Are you saying we should ignore data collected by black boxes on the same cars with the same drivers on their same daily driving patterns, where the data was collected both before and after Autopilot was added to these cars via OTA? Frankly, I don’t think it is possible to find a more statistically meaningful data set that corrects for variables.

DL

I don’t think NHTSA is saying as much as that people are safer by using AP. They give the numbers (without even error bounds, I would guess). I don’t know the details of this data collection; the original phrase sounded like the “40%” was comparing to average rates, not necessarily Tesla-without-AP rates. If we are comparing within the Tesla population itself, that would be less of a concern (although “engaged” vs. “installed” being even less looks like conjecture to me).

I know Elon kept comparing AP to average population rates, not rates within the Tesla population. Here, a quick Google and this article comes up on top: “Statistically, experts say Musk’s tweet analysis isn’t valid. While Teslas could have a lower death rate, it may speak more about the demographics of Tesla drivers than it does about safety of the vehicles, says Ken Kolosh, manager of statistics for the National Safety Council.” https://www.apnews.com/be561d432303431695f64b5c158dbf7c

Exactly what I was saying about bias in my post above about strata. Musk’s claims are statistically naive and suffer from the fallacy of averages. That said, I admit I haven’t looked into the details of the NHTSA AP data and did not have a more detailed analysis of my own of what it…
nix

Go read the original NHTSA report showing the Tesla Autopilot 40% statistic. It is actually very easy to find in the InsideEVs archives.

It really isn’t worth debating you until you go read the original NHTSA report, because it has page after page of very valuable info that you need to understand before trying to armchair quarterback NHTSA’s findings.

TwoVolts

From the NHTSA report: “Crash rates. ODI analyzed mileage and airbag deployment data supplied by Tesla for all MY 2014 through 2016 Model S and 2016 Model X vehicles equipped with the Autopilot Technology Package, either installed in the vehicle when sold or through an OTA update, to calculate crash rates by miles travelled prior to and after Autopilot installation. Figure 11 shows the rates calculated by ODI for airbag deployment crashes in the subject Tesla vehicles before and after Autosteer installation. The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autopilot installation.”

This suggests that the current AP system is helpful. However, I would point out that AP’s apparent success as described here relies on responsible drivers intervening and ‘saving the day’ at times. Model S and X owners are probably the ‘cream of the crop’ when it comes to being attentive and responsible. Such owners represent the ideal human partner for an imperfect driver assist feature. If a similar AP system were offered to the masses on lower end models, I would not expect crash safety performance to improve by 40%, but rather likely get worse.
nix

I’m not sure I agree with your analysis that people who will pay $100K+ for a performance car that will do 0-60 in 3 seconds represent the group of drivers who are most focused on safety above all else. Those seem more like people who would accept a higher level of risk, not people who are risk averse.

But thanks for the data.

antrik

Accident rates with Autopilot installed but not engaged only reflect the effectiveness of the always-on safety features. (Automatic emergency braking, collision warning, side collision avoidance…) To know how well the actual Autopilot features work, we need to compare accident rates between Autopilot engaged and Autopilot installed but not engaged.

antrik

Also, please stop splitting posts. It’s annoying.

nix

“Other manufacturers’ Level 2 vehicles likely have been involved in crashes while drivers were using advanced driver assistance features, but none of them have grabbed headlines like Tesla.”

–IIHS story

This brings up the question of why IIHS isn’t doing the same testing on those vehicles too. I generally expect the IIHS to do more than just chase flashy headlines. This would be like the National Center for Missing & Exploited Children only trying to find missing children who get lots of press coverage.

James

Automation is getting better all the time. This is why I put my Autopilot dollars into Dual Motors instead.

Talking to Model 3 owners on the street, I feel good about my decision. Seems the regular cruise control has a spacing feature good for gridlock stop n’ go driving – probably where I’d use Autopilot most, anyway.

Most M3 buyers get Autopilot. I’ll wait and assess and get the latest iteration of it later.

EVShopper

“Partial Automation Is Risky” well, yeah. In other news, water is wet.

antrik

I think it would help to spread the metaphor suggested by “The Money Guy”: it’s like a twelve-year-old holding the steering wheel. Even if it seems to be doing well most of the time, that’s no excuse to stop watching it.

With that metaphor planted firmly in their minds, people should be less prone to falling into the complacency trap I believe…

AJ

Partial automation would be fine if these cars’ autopilot systems worked flawlessly, and you could simply take over whenever you felt like driving rather than when you needed to or risk crashing. But since these cars aren’t entirely flawless, I think partial automation is dangerous.

nix

The problem with your argument is that drivers WITHOUT autopilot systems would be fine if their driving “worked flawlessly” without a driver assist. But all the evidence points to the contrary. Drivers clearly “aren’t entirely flawless” without driver assist, and very much are proven to be dangerous with decades of automobile fatality records to prove it.

Driving without automation would be fine if these people’s driving worked flawlessly, and you could simply rely on them to avoid accidents with their driving, rather than them always being at risk of crashing. But when these drivers aren’t entirely flawless, they are dangerous without partial automation helping catch their errors.

You are making the same mistake of flipping the statistically proven cause of millions of accidents, which is overwhelmingly driver inattention WITHOUT driver assist, into blaming the rare corner case of a handful of drivers who happened to be distracted at the same time the driver assist wasn’t “entirely flawless.”

Dan

And Tesla got away with this? Pathetic!

Rafael Sabatini

Well, again, the human factors research argues strongly against this approach, at least until the defect rates are 10X better than those of human drivers. As designed, it encourages complacency. Super Cruise is undoubtedly better, but it comes with its flaws as well.

Vicsha

What is the news here? This seems like a comment on that past accident. Is there any new test or accident to add to the news, or to the analysis that has been done in the past?
And why does a single incident prove anything? Since when is analysis done with anecdotal evidence? Also, risky compared with what? Does that mean Tesla Autopilot has proportionally more accidents? Where is the data?

Ian Mendoza

These cars should not be allowed to be operated on public roads, at least until all of these flaws are 99% fixed. We need a safe car. Until then, I personally will never try these cars. I have just read an article that might be useful for us at https://www.lemberglaw.com/self-driving-autonomous-car-accident-injury-lawyers-attorneys/. Hope this helps.