Tesla’s Fatal Autopilot Crash Won’t Halt Autonomous Car Development, Says NHTSA Chief

JUL 27 2016 BY ERIC LOVEDAY

Scene Of May 7th Tesla Model S Fatality (via ABC News/Bobby Vankavelaar)

Autopilot Engaged

US National Highway Traffic Safety Administration (NHTSA) chief Mark Rosekind recently spoke at the Automated Vehicles Symposium in San Francisco. And while he didn’t specifically mention the fatal Tesla Autopilot crash, his statements clearly acknowledged the incident.

Quoting Rosekind:

“No one incident will stop the NHTSA from promoting highly automated driving development.”

Rosekind did mention a “recent incident” though, which we can rightly assume was the fatal Tesla crash.

According to CNET,

“Rosekind insisted that the US Department of Transportation is taking a “forward leaning” position on highly automated driving. He justified that by citing the 32,500 traffic fatalities that occurred on US roads in 2015, and how 94 percent of those were due to driver error. He said that highly automated driving could eliminate 19 out of 20 collisions, potentially saving 30,000 or more lives per year.”
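
For context, the arithmetic implied by those figures (our rough back-of-the-envelope check, not an NHTSA calculation): 19 out of 20 is 95 percent, close to the 94 percent of fatal crashes attributed to driver error, and 0.94 × 32,500 ≈ 30,550, which is where an estimate of “30,000 or more lives per year” comes from.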

Rosekind added that the NHTSA is okay with today’s autonomous cars, even though they are imperfect. Rosekind says it’s a necessary step towards highly automated cars that have vast life-saving potential.

Source: CNET

Categories: Crashed EVs, Tesla

33 Comments on "Tesla’s Fatal Autopilot Crash Won’t Halt Autonomous Car Development, Says NHTSA Chief"

no comment

of course an “incident” isn’t going to stop autonomous car development, the real issue has been one of how fast you deploy the systems to the public. so the real question to ask the nhtsa chief is: what does he think about the manner of deployment being used by various automakers?

M Hovis

The advancement of this technology requires both risk and critical thinking.

If we are losing 30,000 people per year to human-error accidents, it seems blatantly obvious that the risk of aggressive deployment saves lives in any, strike that, every statistical model.

The people with real cause to cry foul would be those killed by the tools of autonomous driving. It comes down to the 30,000 deaths per year, and understanding that EVERYONE who travels on our highways is at risk.

If we aggressively pursue autonomous driving to the point that 100 people die due to the technology push, and that push gets us there a year earlier, then the 30,000 lives saved in that extra year, minus the 100 lost, means 29,900 more lives are saved. I feel for the 100, but it is about the good of the many, not the few.

no comment

no it is not “blatantly obvious”. if the guy in florida had run into *your* car, you wouldn’t be writing the stuff that you are writing.

aside from that, your math makes no sense. you may not realize it, but you don’t know what risks would be created if 100% of cars were converted to autonomous driving overnight. your statements are based on unsubstantiated assumptions. if *you* were the victim of the unintended consequences of a malfunctioning system, you would be singing a completely different tune from the one that you are singing now.

Kdawg

Can’t use anecdotal incidents. Need the data for the complete system to make an informed decision.

no comment

are you responding to my comment or to someone else’s?

Kdawg

To your comment about “if it happened to you”: it’s like someone dying from a seat belt in a freak accident. Would that person’s family be against seat belts? Maybe. But that doesn’t mean we should get rid of seat belts, given the overwhelming data that they save many more lives than they end.

no comment

maybe i was too subtle in my original comment. when i stated that the real question that should have been asked of the nhtsa chief was what he thought of the manner of deployment “by various automakers”, i was referring to tesla. specifically, tesla’s manner of deploying autonomous driving technology makes no attempt to qualify test drivers other than to sell the feature to anyone who is willing to pay money to tesla. the question is: is that really a responsible manner of deploying a new feature in an automobile? i appreciate that some of you like to invoke analogies to beta testing in personal computing, but a car is very different from a personal computer.

kdawg

We don’t know how much testing Tesla does before it releases AP features. Again… data is our friend.

Dana Pearson

Are you the same troll that complained so loudly about the dangers of the Model A replacing the horse and buggy? Your blather is that of a sad little man.

Impugning the motives of someone dedicated to doing ALL they can to save our f’d up world… attributing it to profit motives… is disgusting.

Pushmi-Pullyu

“no comment” commented:

“if the guy in florida had run into *your* car, you wouldn’t be writing the stuff that you are writing.”

You completely and utterly ignored his point about the greatest good of the greatest number.

Your attitude would have us abandoning development of a system which will probably, eventually, save tens of thousands of lives every year; abandoning that to protect the relative few who would — and will — be killed during the development. Not only is that selfish, it’s incredibly short-sighted.

“…you don’t know what risks would be created if 100% of cars were converted to autonomous driving overnight.”

And that’s a straw man argument. First of all, that would be physically impossible; second, nobody is suggesting that 100% of cars on the road be converted to an autonomous or semi-autonomous system overnight.

no comment

my point really is that people always talk of the “greatest good” when they think they are talking about someone else. as to the “straw man” allegation: you allege that i am suggesting that the autopilot technology be abandoned, when you can’t point to any comment i have ever made to that effect.

my main criticism of tesla is that they should test new features the way that other companies do; instead, they roll it out as a marketed, for-sale feature to anyone willing to pay money for it, as though it were a final product. then tesla seeks to “cover” for themselves by merely issuing tons of fine print, and then saying: “see??? we warned them right here…on page 329, line 15”.

Pushmi-Pullyu

The question is, which approach is going to save more lives? Which approach is going to result in faster improvement in semi-autonomous cars?

Is it the more timid, slow approach other automakers are taking? Is it Google’s approach, developing a superior system but not letting anyone use it? Or is it Tesla’s, putting the “beta” system into the hands of actual customers and letting them test it, so that it can be — and is — improving the safety of the public, of ordinary drivers, passengers, and pedestrians on the road, much more rapidly than anyone else’s approach?

“no comment”, others have lumped you in with the hard-core Tesla bashers. I’m not willing to put you in that category, but the only reason I can see for you to keep pressing an argument that very nearly everyone can see is as wrong as it can possibly be, an argument against something that is going to save the lives of a growing number of actual human beings, is that you apparently have some sort of axe to grind or hidden agenda against Tesla Motors.

Dana Pearson

Says the troll from GM

Kdawg

Don’t bring GM into this.

MDEV

Why do you exonerate the irresponsible driver in Florida? He was speeding on that route, and witnesses to the accident said that a movie was still playing in the car at the time of the crash. It’s time to accept personal responsibility. Yes, the sensors should work better, but it was the driver’s fault to be so distracted that he missed a huge 18-wheel truck crossing his entire path. Sadly, accidents reveal deficits in systems that no one could foresee beforehand. The systems will be improved, and more education and awareness about the upcoming technology will perhaps reduce these incidents. Texting and Pokemon kill far more people every day.

no comment

i agree with what you are stating. my position is that the driver shouldn’t have had the autopilot feature in the first place, and that for a beta-test product like this, deployment should be limited to a select group of test drivers, much as other automakers have done.

the tesla autopilot beta test is very different from the active-e test program that bmw was doing, although i suspect that even for the active-e program, bmw screened their test drivers a lot better than tesla has.

Nix

no comment — You mean if a Tesla ran into the side of my family’s 18-wheeler, where the large white slab side didn’t trigger automatic braking? Your attempt to generalize ONE failure mode to ALL cars is absolutely absurd. If you want to see an EXACT situation where an actual passenger car made a left turn in front of a Tesla, and the Tesla stopped automatically, the video of it is right here in the InsideEVs archives. If YOUR family had been inside THAT ACTUAL PASSENGER CAR that pulled in front of a Tesla, YOUR LIFE AND YOUR FAMILY could have been saved by this exact technology. I’m not talking about some theoretical situation, like you and your family trucking around inside the back of a white slab-side panel truck, but an actual real event with a real passenger car, where this system prevented an accident right on camera for everybody to see. Sadly, your bullcrap is tolerated here because regular posters are only 1% of readers, so nobody will do anything about trolls like you. But that doesn’t mean you don’t deserve to have your bulls*** bashed to hell every time you make a fool of yourself here.
Dana Pearson

Bingo! Can’t stand this troll

speculawyer

GOOD! There will certainly be more crashes. But they are just missteps on a solid path to a much safer future with autonomous cars that perform better than human drivers who are subject to fatigue, drunkenness, inattention, emotional distress, sudden health issues, terrorism, insanity, etc.

(Yeah . . . that occurred to me after the terrible truck terrorism in Nice, France. An autonomous vehicle following Isaac Asimov’s 3 laws of robotics would never do such a thing.)

Those Laws – in full:
—————————————–
Isaac Asimov’s “Three Laws of Robotics”

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
——————————–
Assuming the Truck was a fully autonomous (robot) Truck – how would it even get into that crowd?!

Pushmi-Pullyu

My inner Grammar Nazi says:

Quoting from the article:

~~~~~~~~~~~~~~~~~~~~~~
Rosekind did mention a “recent incident” though, which we can rightly assume was the fatal Tesla crash.
~~~~~~~~~~~~~~~~~~~~~~

No, you can’t “rightly assume” anything. That’s an oxymoron. If you have actual evidence for a conclusion or an educated guess, then it’s not a mere assumption. If you don’t, then you don’t know whether your assumption is right or not.

So, Eric or Jay, you might want to re-word that sentence to read:

~~~~~~~~~~~~~~~~~~~~~~
Rosekind did mention a “recent incident” though, likely referring to the fatal Tesla crash.
~~~~~~~~~~~~~~~~~~~~~~

ffbj

I don’t think so. So you make an assumption, and then later someone says you were right, your assumption was correct. Nothing grammatically wrong with that.
Now, semantics is another thing entirely, as correct assumptions have nothing to do with grammar.
Grammar is proper usage, not the meaning of words, though some linguists include semantics under the general umbrella of grammar.

Pushmi-Pullyu

“So you make an assumption, and then later someone says you were right, your assumption was correct. Nothing grammatically wrong with that.”

No, you can say — in the past tense — that your assumption was correct. But if you can “rightly” say something, in the present tense, that says or at least implies that you have sufficient grounds on which to base a reasonable assertion. In that case, it’s not an assumption; it’s a conclusion or at worst an educated guess. To call that an assumption is grammatically incorrect.

Now, I do think you have a valid argument to say that this isn’t really a grammatical error; perhaps it’s more properly an error of logic; a fallacy. Certainly that is a debatable point.

/zzzZZZ

ffbj

Well, I thought of the temporal thing anyway, because no assumption can be proven correct or incorrect until the event occurs. Assumptions are proven ex post facto, so your tense argument has no value. See.
Personally, saying you “rightly assume” something grates on my ear, but I can’t agree it is wrong.

Pushmi-Pullyu

“Rosekind added that the NHTSA is okay with today’s autonomous cars, even though they are imperfect. Rosekind says it’s a necessary step towards highly automated cars that have vast life-saving potential.”

I’m very glad, and relieved, to see this pragmatic, common-sense approach to the development of autonomous driving. How refreshing to see that in a public official, rather than the obstructionist “The perfect driving out the good” attitude shown by Consumer Reports (in calling for Tesla Motors to suspend use of Autopilot and AutoSteer), and unfortunately also shown by all too many people posting comments on InsideEVs!

GRA

I’m someone who completely agrees with what I believe are common-sense recommendations by CR. I also want us to proceed with the development and deployment of autonomous driving systems. But we have to do so with ‘deliberate’ speed, not pushing the tech out to the public so fast that the inevitable crashes, and the resulting publicity, caused by such systems while they’re immature turn the public off them altogether. This crash/fatality wouldn’t have happened if Brown had been driving the car and remaining alert. It’s obvious that Tesla’s (and possibly everyone else’s) AEB sensor/software is currently inadequate for these conditions, which are hardly that unusual; after all, even on limited-access freeways semis still jackknife across multiple lanes, there are lane closures while cherry-picker cranes do maintenance, etc. Tesla is simply pushing the tech too fast in allowing the degree of autonomy it does, and then tries to give Autopilot the credit for those accidents it prevents while ducking the responsibility for those it causes. It’s got to be one or the other: either the driver is wholly responsible for every accident that occurs/doesn’t occur when using the system, or Autopilot is. Until…
So the question(s) about “This crash/fatality wouldn’t have happened if Brown had been driving the car and remaining alert.” are:

1) Was he awake at the time of the crash?
1a) If he was awake, was he dealing with any other medical problem at that very moment?
2) Since they found no hard evidence in their investigation that any laptop or DVD player was even playing ‘Harry Potter’, did they even find a DVD, or a digital file, of this movie in those devices?
2b) If the answer to the above is no, then why did the truck driver say that is what he heard playing? Was it just to push blame?
3) Besides movies, phones, and car controls, what other things can distract a driver from being vigilant in watching the road ahead? (Internal thoughts come to mind, since we now know his speed was 74 mph, not 90 or 100+ mph, and there are many cases of people looking at the road ahead but not actually seeing the information they are looking at, because they are mentally preoccupied! Even my mother once drove 5 miles into town from our home farm, and…
Dr. Steelgood

There is still another issue with Autopilot: how can it allow the car to go 74 mph when there is a limit of 65? On my test ride in a Model S a year ago here in Germany, the car did not go beyond the speed limit when the cruise control was engaged. The second issue is the failed detection of the semi-truck, which leads to two consequences:

1) Trucks must have barriers around them to prevent passing under, as I, and some others here on InsideEVs, have posted previously.
2) As I have read in one of my news forums, Tesla will terminate the collaboration with Mobileye, the system which did not recognize the truck. Maybe you guys know more about that, and there may be other systems on the market with better detection (perhaps Bosch?).

All in all, it was a coincidence of many negative effects, and given the fatal result I hope that we will see improvements in these fields in the future (barriers to prevent passing under, left turns on highways crossing the other lane, enforcing speed limits, checking the driver’s alertness, better recognition and correct interpretation of objects and their size compared to…
MDEV

Wow, you have already judged Tesla before even the NHTSA has. Autopilot is a great advance, and thousands of drivers like me use the system in a responsible way. What are the accidents Autopilot is causing? Please explain. Autopilot did not cause the Florida accident; the driver did. Autopilot failed to prevent it, which is a huge difference.

Pushmi-Pullyu

GRA said: “…not pushing the tech out to the public so fast that the inevitable crashes, and the resulting publicity, caused by such systems while they’re immature turn the public off them altogether.”

I don’t care how slowly the semi-autonomous systems are “pushed out”; there are going to be serious, and fatal, accidents that happen to cars using the tech. I submit that the main reason that’s going to happen is that such accidents are fairly common, and it’s going to take a while before the semi-autonomous systems are improved to the point that we can call them truly autonomous, and improved to the point that they can prevent 95% or more of accidents.

In other words, the problem isn’t that these systems are going to be “pushed out” too fast, but that they’re going to be pushed out too slowly, exactly because of the fear of something new that you express in your post. That’s a perfectly human reaction, but it’s an irrational emotional one, not a logical or rational one. The rational approach would be for everyone to strongly advocate that these systems should be rolled out as fast as possible, so…
MDEV

Well said +100

no comment

this is why insideevs needs to commit itself to writing articles about MERCEDES-BENZ. we wouldn’t be having all these problems if we were talking about the benz-o s-class instead of the tesla model s.