Fatal Tesla Crash Leads Some To Question On-Road Beta Testing

JUL 7 2016 BY STEVEN LOVEDAY

Tesla Model S Fatality May Be Blamed On Autopilot Mode (via ABC News/Bobby Vankavelaar)

The Autopilot Mode, a self-driving feature installed on Tesla Motors’ vehicles, may be blamed for a fatal crash that occurred last month. Tesla says the technology is still in development and improving rapidly, but nevertheless, it has been installed on nearly every Tesla built since October 2014. Beta testing on public roads will surely come under further scrutiny.

Another Recent Tesla Model S Crash (Not Attributed To Autopilot). All 5 Occupants Survived Due To The Model S Top Crash Safety Ratings, Image Credit: Merkur – Sabine Hermsdorf

Tesla’s autonomous technology does not make cars fully autonomous. It takes over steering and provides active cruise control. The cars can change lanes “safely” and will brake automatically. The system alerts the driver when needed, and if the driver does not respond, the car will slow and come to a stop with its hazard lights flashing.

The real issue may be that drivers’ confidence in the system is too high, or that complacency in its abilities builds up over time as more and more miles are logged by the owner.

Tesla reported:

“Autopilot is by far the most advanced driver-assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility. Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.”

The crash currently under investigation involved a 40-year-old man from Ohio who drove his Tesla Model S under a semi-trailer in Florida. The truck driver believes that the Tesla driver was watching a movie. This has not yet been substantiated.

Update (Thursday, July 7th): Florida Highway Patrol said Thursday that both a computer (laptop) and a DVD player were confirmed to be in the vehicle, but neither was found running after the crash. Investigators on the scene could not determine whether the driver was operating either of the two at the time of the accident.

Apparently, the Autopilot feature didn’t “see” the white side of the vehicle against the bright white sky. Autopilot has been blamed numerous times for crashes, but in the past, logs have shown that the technology was not engaged or to blame.

Tesla still maintains that autonomous cars are safer than human drivers: there were 35,200 U.S. highway deaths in 2015, roughly 94% of vehicle crashes are attributed to human error, and this is the first reported fatality in over 130 million miles of Tesla Autopilot driving.
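For rough context, here is a back-of-envelope comparison of those figures; the roughly 3.1 trillion total U.S. vehicle miles traveled in 2015 is an assumption not given in the article:

\[
\text{U.S. average (2015)} \approx \frac{35{,}200\ \text{fatalities}}{3.1 \times 10^{12}\ \text{miles}} \approx 1.1\ \text{per}\ 10^{8}\ \text{miles},
\qquad
\text{Autopilot} \approx \frac{1\ \text{fatality}}{1.3 \times 10^{8}\ \text{miles}} \approx 0.77\ \text{per}\ 10^{8}\ \text{miles}
\]

On that assumption the Autopilot rate comes out lower, though a single fatality over 130 million miles is far too small a sample to support a firm statistical conclusion.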

The group Advocates for Highway and Auto Safety has been fighting against the “anything-goes” approach of the U.S. government when it comes to tougher safety regulations. The group’s president, Jackie Gillian, said:

“Allowing automakers to do their own testing, with no specific guidelines, means consumers are going to be the guinea pigs in this experiment. This is going to happen again and again and again.”

The National Highway Traffic Safety Administration (NHTSA) is set to release new guidelines for self-driving cars as soon as this month. The NHTSA announced in January in Detroit that exemptions to current safety rules could be granted to companies that demonstrate the safety of their autonomous vehicles. It was an attempt by the government to move out of the way of technological innovation. Transportation Secretary Anthony Foxx confirmed:

“We’re crafting a Declaration of Independence, not a Constitution.”

“We want to minimize the road kill. You set standards for testing so everyone is abiding by the same rules. You can’t just let these companies put this technology on the road and let owners find their own way. That’s not good enough.”

Beta testing has become the way to go for new technology. It has been found that putting the product in people’s hands in real-life situations provides the best feedback, and the issues are far outweighed by the benefits. But for transportation, this can become a life-or-death issue, which is much different from an iPhone crash or compatibility bugs in Windows. Joan Claybrook, a former NHTSA administrator, said:

“They shouldn’t be doing beta-testing on the public. The history of the auto industry is they test and test and test. This is a life-and-death issue.”

Google has logged over 1.5 million miles of autonomous vehicle testing on public roads. However, a test driver is present in every vehicle. Google was also the first company to deal with reports of an autonomous car crash; that crash was very minor and there were no injuries. Google reported:

“In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision.”

BMW just announced that it is ramping up its self-driving technology with partners Intel and Mobileye. However, CEO Harald Krüger commented on the crash:

“First of all, the news of the accident is very sad. Technology for highly autonomous driving isn’t ready for series production yet.”

Source: Autonews

30 Comments on "Fatal Tesla Crash Leads Some To Question On-Road Beta Testing"

electric-car-insider.com

Videos posted on YouTube showing the front-seat occupants playing games, demonstrating hands-free driving, or even sleeping make it clear that too many Tesla AP drivers don’t understand the system’s limitations.

Even among drivers who are reasonably well briefed, some take risks they wouldn’t take with less capable systems, based on personal comments I’ve heard from Tesla AP drivers while chatting at Superchargers.

I’m wondering how many Tesla drivers truly understand the limitations of AP. How many maintain situational awareness at all times, ready to take full control and avoid a collision that AP can’t?

Tesla could and should fix the education deficit asap.

you can’t count on companies to do the kind of thing that you suggest. that’s why you need regulation. that isn’t to say that there should be *no* road testing. it’s that such testing needs to be regulated, licensed, controlled and limited. tesla isn’t doing that and they have no motivation to limit the deployment of the autopilot feature. after all, tesla sells that as an option, so tesla actually has a financial interest in *selling* a beta-test product.

What about semi-automatic rifles?

What about controlled clinical trials for new drugs before they are available to the public?

if you are asking whether i think that sales of semi automatic rifles should be regulated and limited, i would say: absolutely! the problem is that we have too many idiots on the US supreme court, who lack common sense and are out of touch with how things actually work in the real world, and their opinions differ from mine.

You have to wonder if education is enough. The system has to be more aggressive in detecting driver involvement, even if it’s annoying to the driver.

Also, future systems should have some kind of laser scanner to detect distances to objects; visual detection is not enough. Or some kind of radar mounted high on the front, to avoid the weakness in the current system. Maybe even a mass recall to fix that deficit.

Interior cameras and face detection software used to detect driver drowsiness and micro-napping can also be used to determine that a driver has his eyes on the road.

there are a lot of people who are offering up good ideas here. but good ideas belong in the test lab and on the test track; they have no place in being sold as options to anyone with the cash to pay for them. what we have seen among tesla drivers is that many of them appear to have more dollars than they have sense.

AlphaEdge said: “You have to wonder if education is enough. The system has to be more aggressive in detecting driver involvement, even if it’s annoying to the driver.”

Indeed. The first reports of using Tesla AutoSteer indicated the system would only allow “hands free” operation for about two minutes before beeping and making the driver indicate he was still in control, by taking the wheel briefly. But later reports lacked any such periodic testing. Did Tesla remove that safety feature? Or was it just that early deployment of AutoSteer wasn’t reliable enough to operate for more than two minutes at a time? Or was this a case of anecdotal reports giving readers a skewed understanding, with perhaps something that was rare overall being reported as commonplace?

At any rate, given the amount of abuse documented, with drivers becoming passengers, turning away from looking at the road, watching a movie, or talking to a passenger or video camera, it does seem like Tesla should be much more aggressive in frequently testing drivers using AutoSteer to see if they’re actually paying attention.

“Also, future systems should have some kind of laser scanner to detect distances to objects; visual detection is not enough,…”

In the fairly recent past, Musk was quoted as saying that lidar isn’t good enough. I don’t remember if he gave reasons, or what they were.

Comparing what Google does to what Tesla does is wrong, IMO.

Google car drivers are well aware that they are testing new tech, well aware of its limitations, and ready to take over at all times.

Tesla drivers can barely connect their iPhone to the car, and they expect to do nothing when they engage the “autonomous” feature. When a problem occurs, Tesla hides behind a “we told the client the limitations, and the log shows the car behaved as it was intended to.” This isn’t beta testing; this is being careless just to be perceived as a leader. It also shows the car should be better at keeping the driver’s attention when conditions are not optimal, and should even block/disengage the feature when that happens.

People complain when the BMW i3 disengages ACC when the sun is blinding the camera, and yet this is the right thing to do. It also disengages when it doesn’t recognize objects. It’s the right thing to do, as it reminds the driver of the limitations of the tech.

franky_b said: “When a problem occurs, Tesla hides behind a ‘we told the client the limitations, and the log shows the car behaved as it was intended to.’ This isn’t beta testing; this is being careless just to be perceived as a leader.”

Well, that’s one way to look at it, and perhaps it’s a valid viewpoint. But I think it’s equally valid to say that Tesla is showing itself to be the industry leader by actually having the courage to deploy advanced driver assist features (or limited, baby-step autonomous features) in its cars, and thus push forward the technology in ways that other automakers are too timid to do.

For me at least, the bottom line is this: Does the feature save more lives than it costs? If so, then it’s worth doing. While I agree that Tesla’s AutoPilot suite needs a lot of improvement, it seems to me those demanding that Tesla disable its use entirely until it’s greatly improved are committing the fallacy of “the perfect driving out the good.” Even in the future when we have fully autonomous cars, that won’t prevent accidents from happening. But it will make them more rare; hopefully a…

You’re essentially saying that Tesla has the courage to be reckless. I would not describe that action as courageous.

On the “top safety ratings”: the Renault Zoe’s rating is better in Euro NCAP….

This starts to divert attention from the real core of electric vehicles: moving from fossil fuel transportation to renewable transportation. More energy-dense batteries, faster recharging, lower cost, and bigger production are the objectives. Autopilot is not directly related to the electric vehicle, so there is no reason to add that testing burden to EVs any more than to ICE cars. Making the electric transition is already hard enough in itself.

Are we going to ask SpaceX to be a leader in biomethane production as well? Or is the task of pioneering reusable rockets a big enough endeavor?

Autonomous systems lack the ability to predict actions (and potential accidents) before they happen. Seeing a truck & trailer in a left turn lane poised to cross over your driving path would cause any human to take note as they approach the point where the crossing would take place.

Perhaps current software suites used in partially autonomous cars are too limited for such predictive ability, but there is absolutely no barrier to using predictive analysis and logic in computer programs. I’m sure I’ve read of more sophisticated expert systems (so-called “A.I.”), and many computer/video games, which use predictive software.

Also, consider this: What would the human driver do when recognizing the potential danger of someone pulling out in front of him from a left turn lane? He would watch the vehicle closely, and perhaps look around into adjacent lanes to see if there’s room to move over if he needs to change lanes suddenly. He might also reduce his speed, although that’s less likely.

The autonomous system has no need to watch the vehicle in question more closely. It can react instantly, unlike the human. And also unlike the human, it is always scanning the lanes around it, so there is no need to “look around” to see if there’s room to make a sudden lane change. Indeed, the only thing the autonomous system could do in such a case to reduce the risk of an accident is to reduce speed. And I rather doubt that most drivers would want a…

Tesla is totally at fault here. The Autopilot system is a Level 2 system, which means it can aid the driver. However, in order to continue the narrative that Tesla has the most advanced tech, which it clearly doesn’t, Tesla markets Autopilot as Level 3, where the car can drive itself AND respond to emergency situations. (Note that Tesla says it has the most advanced autonomous product “on the road”.)

The result is absurdly inconsistent directions. On the one hand, the car can drive itself and is safer than a human driver, so go ahead and read the paper. On the other hand, the driver is supposed to be ready at a moment’s notice to take back control of the car. Since you can’t forget about driving AND be prepared to instantly take control of the car, these directions are impossible to reconcile and hence pose a safety risk.

In order for autonomy to be safe, there needs to be a clear distinction between the two situations. If Tesla can’t or isn’t willing to do this, then it needs to take the feature out of the car or take the cars off the road. It’s perfectly fine for…

you had better believe that tesla has their attorneys working overtime. the first question that they are going to have to answer is: how did tesla screen beta testers? oh, the “screening” mechanism was whoever was willing to shell out a few thousand dollars?

tesla is looking at potentially HUGE legal liability in this for no less than negligence. it seems unlikely that tesla could beat this by pointing to “fine print” in their sales agreements. you would have thought that tesla attorneys would have warned them about this beforehand.

I don’t get it: the driver was 20 mph over the speed limit, on a route that has lights, watching movies, with a near-miss accident uploaded to YouTube by the same driver, and it’s Tesla’s fault? You Americans never take personal responsibility; it’s always somebody else’s fault.

Stereotype much?

It’s amazing how provincial some Europeans are. But unlike you, I don’t make the mistake of thinking all Europeans are so lacking in their understanding of others.

Math says a human in a car with Autopilot operating is safer than a human in a car with Autopilot not operating.

Math says a planet without humans would be better for the environment.

Tesla got this wrong, but it’s hard for me to say that the non-AP result would have been different. That highway is another example of a crap road that we’ve all traveled and marveled that they still exist. I really want to see the historical accident data for this intersection, as one like it in my area took ten years to get even a warning light, after many horrendous crashes, some with fatalities.

(and my imagination perceives the possibility of loading AP with ALL the notably BAD intersections into the database, basically predetermining that the driver needs to take control in those sections)

The non-AP result would certainly have been different had the driver been watching the road instead of a movie. Straight, flat highway, excellent visibility: how hard is it to see ahead that a big truck is set to make a turn in front of you?

That roads and byways lack consistent standards for virtually everything from markings to entry/exit access points is why AP systems as they exist now should not be permitted on public roads under the control of average consumers.

I believe I read that the view from the turning vehicle was not as good as you describe, but my memory is what it is; that recollection was why I made the bad-road comparison. I would still appreciate someone looking up the stats on that intersection, because IMO, if it’s one of our “50 accidents in 5 years” specials, it changes the perspective considerably.

This is a very good idea: warn drivers of upcoming dangerous intersections. It could also automatically and temporarily reduce speeds at such intersections. It would surprise me if such updates weren’t already in the pipeline. Not all accidents can be prevented, but I think we will greatly reduce the number of them in the near future.

Also note that the latest CUV/SUV to show up on the InsideEVs sales table has features similar to AutoSteer and Adaptive [Traffic-Aware] Cruise Control (Tesla’s overall Autopilot terminology). http://insideevs.com/all-time-ev-sales-record-easily-set-in-us-for-june-props-to-tesla-and-ford/

Mercedes GLE 550e: a small-battery plug-in hybrid CUV/SUV

They call it “DISTRONIC PLUS with Steering Assist,” but it appears to be the same functionality as Tesla’s, just with a different name. Drivers will use it the same way.

“DISTRONIC PLUS with Steering Assist” defined on their page as:
This radar-based cruise control adapts your set speed to the flow of traffic ahead, automatically slowing until your path is clear again.

If the vehicle ahead slows to a stop, DISTRONIC PLUS® can brake your car to a full halt. When traffic moves, you can resume with just a tap, or automatically if the stop is under a second.

The system’s Steering Assist feature helps the driver keep the vehicle centered in between the lane markings while cruising on straight roads, or even in gentle curves.
Via: http://www.mbusa.com/mercedes/vehicles/build/class-GLE/model-GLE550E4#tab=tab-performance

What about all of these as well?
Subtitle: Allows unassisted driving under limited conditions
https://en.wikipedia.org/wiki/Lane_departure_warning_system#Lane_Keeping

Year / Make / Model

2014:
Tesla Model S

2014-2015:
Infiniti Q50

2015:
Mercedes C-Class, E-Class, S-Class
Tesla Model S, Model X
Volkswagen Passat
Volvo XC90 II

2016:
Volvo S90 II, V90 II, XC90 II
Audi A4
Mercedes GLE 550e (Steering Assist feature helps the driver keep the vehicle centered between the lane markings while cruising on straight roads, or even in gentle curves)