
Tesla Model S Fatality May Be Blamed On Autopilot Mode (via ABC News/Bobby Vankavelaar)

Autopilot, a self-driving feature installed on Tesla Motors' vehicles, may be to blame for a fatal crash that occurred last month. Tesla says the technology is still in development and improving rapidly, but it has nevertheless been installed on nearly every Tesla built since October 2014. Beta testing on public roads will surely come under further scrutiny.


Another Recent Tesla Model S Crash (Not Attributed To Autopilot). All 5 Occupants Survived Due To The Model S Top Crash Safety Ratings, Image Credit: Merkur - Sabine Hermsdorf

Tesla's technology does not make cars fully autonomous. It takes over steering, maintains speed with active cruise control, changes lanes "safely," and brakes automatically. It alerts the driver when attention is needed, and if the driver does not respond, the car slows to a stop with its hazard lights flashing.

The real issue may be that drivers place too much confidence in the system, a complacency that can build up over time as more and more miles are logged.

Tesla reported:

“Autopilot is by far the most advanced driver-assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility. Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.”

The crash currently under investigation involved a 40-year-old man from Ohio who drove his Tesla Model S under a semi-trailer in Florida. The truck driver believes the Tesla driver was watching a movie, though this has not yet been substantiated.

Update (Thursday, July 7th): The Florida Highway Patrol said Thursday that both a laptop computer and a DVD player were confirmed to be in the vehicle, but neither was found running after the crash. Investigators on the scene could not determine whether the driver was operating either device at the time of the accident.

Apparently, the Autopilot feature failed to "see" the white side of the trailer against the bright white sky. Autopilot has been blamed for crashes numerous times before, but in past cases, logs have shown that the technology was either not engaged or not at fault.

Tesla maintains that autonomous cars are safer than human drivers. There were 35,200 U.S. highway deaths in 2015, and 94% of vehicle crashes are attributed to human error. This is the first reported fatality in over 130 million miles of driving on Tesla's Autopilot.

Advocates for Highway and Auto Safety has been fighting the "anything-goes" attitude of the U.S. government when it comes to tougher safety regulations. The group's president, Jackie Gillan, said:

“Allowing automakers to do their own testing, with no specific guidelines, means consumers are going to be the guinea pigs in this experiment. This is going to happen again and again and again.”

The National Highway Traffic Safety Administration (NHTSA) is set to release new guidelines for self-driving cars as soon as this month. In January, the NHTSA announced in Detroit that exemptions from current safety rules could be granted to companies that demonstrate the safety of their autonomous vehicles, an attempt by the government to move out of the way of technological innovation. Transportation Secretary Anthony Foxx said:

“We’re crafting a Declaration of Independence, not a Constitution.”

“We want to minimize the road kill. You set standards for testing so everyone is abiding by the same rules. You can’t just let these companies put this technology on the road and let owners find their own way. That’s not good enough.”

Beta testing has become the norm for new technology. Putting a product in people's hands in real-life situations has been found to provide the best feedback, with the issues far outweighed by the benefits. For transportation, however, this can become a life-or-death matter, quite different from an iPhone crash or compatibility bugs in Windows. Joan Claybrook, a former NHTSA administrator, said:

“They shouldn’t be doing beta-testing on the public. The history of the auto industry is they test and test and test. This is a life-and-death issue.”

Google has logged over 1.5 million miles of autonomous vehicle testing on public roads, though a test driver is present in every vehicle. It was the first company to deal with reports of an autonomous car crash; that incident was very minor, and there were no injuries. Google reported:

“In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision.”

BMW just announced that it is ramping up its self-driving efforts with partners Intel and Mobileye. Still, CEO Harald Krüger commented on the crash:

“First of all, the news of the accident is very sad. Technology for highly autonomous driving isn’t ready for series production yet.”

Source: Autonews

Got a tip for us? Email: tips@insideevs.com