Tesla Autopilot

A Stanford University robotics researcher has shared her assessment of Tesla's features, and she seems less than thrilled with Tesla Autopilot.

Heather Knight was so unimpressed with Autopilot's ability to see bikes that she wrote a review on Medium titled "Tesla Autopilot Review: Bikers Will Die." Autopilot was not the only Tesla feature that troubled Knight, but there were also a few features she did enjoy.

Update: A follow-up piece on Fortune later softened its headline to "Tesla’s Autopilot Tech Is a Danger to Cyclists, Robotics Expert Says," with the outlet noting, "The headline on this post was edited at 3:17 on May 29 to more accurately reflect the nature of Tesla's semi-autonomous driving technology."

Autopilot is not full self-driving software.

Knight holds a doctorate in robotics from Carnegie Mellon and currently studies social robotics. Checking out the Tesla Autopilot system was on her to-do list. Another Stanford researcher accompanied her on a Tesla Model X test drive to see just how well the car's features live up to expectations, and more specifically, how they handle the human-robot interaction experience. She found the Tesla's:

"Agnostic behavior around bicyclists to be frightening ... put biker lives at risk."

"I’d estimate that Autopilot classified ~30% of other cars, and 1% of bicyclists. Not being able to classify objects doesn’t mean the tesla doesn’t see that something is there, but given the lives at stake, we recommend that people NEVER USE TESLA AUTOPILOT AROUND BICYCLISTS!"

Thus far, there have been no reported incidents between Tesla vehicles and bicycles, though there was an accident in Norway involving a motorcycle.

The interesting part of all this is that she says Autopilot works well around cars, yet only classifies about 30 percent of them. As she explains, the car attempts to display everything the camera sees, but it often "sees" objects without classifying or displaying them on the screen. So, while this may make the driver-robot interaction a bit unnerving, the car is potentially seeing the bikes without showing them.

People took to Twitter to comment on Knight's findings. She made it clear that she is a fan of Tesla, but she wanted to point out some issues. She calls the technology a "human in the loop" system, which is exactly what Autopilot is said to be. Tesla has not yet released its full self-driving software, and it has made this abundantly clear. In her review, Knight explains:

"The Tesla Autopilot feature is basically a button to turn the car into autonomous driving mode."

This statement is true in the sense that the car functions at Level 2 autonomy, but it could be seen as a stretch if "autonomous" is read as "self-driving" (Level 4 autonomy).

Knight hopes that people will understand the distinction. Despite her criticism and the morbid title, she gave an 'A+' to Tesla's Situation Awareness Display, Automatic Lane Switching, and the Tesla App. She graded the Recessed Door Handles and Steering Around Curves with a 'B.' The User-Set Target Velocity and Giant Touchscreen each get a 'C,' and she gave the Self-Locking Feature an 'F' after she and her colleague were locked out of the vehicle (the Tesla App saved them).

To see her full detailed review, click the Medium link below.

Source: Fortune via Medium
