Tesla Autopilot

There's a highly detailed article posted over at Wired on how a group of researchers from the University of South Carolina, China's Zhejiang University, and the Chinese security firm Qihoo 360 managed to trick Tesla's Autopilot system with relative ease.

As Wired reports, the researchers:

"... found that they could use off-the-shelf radio-, sound- and light-emitting tools to deceive Tesla’s autopilot sensors, in some cases causing the car’s computers to perceive an object where none existed, and in others to miss a real object in the Tesla’s path."

We're not going to provide the details here, as we certainly don't want to contribute anything at all to those out there who may find it fun to try to mess with Autopilot, but we will say that tricking the Autopilot system seemed rather easy.

Wired adds:

"The demonstrations were performed mostly on a stationary car, in some cases required expensive equipment, and had varying degrees of success and reliability. But the research nonetheless hints at rough techniques that might be honed by malicious hackers to intentionally reproduce May’s deadly accident."

The accident referenced is the fatal May 2016 crash of a Model S operating on Autopilot. Wenyuan Xu, the University of South Carolina professor who led the research team, stated:

“The worst case scenario would be that while the car is in self-driving mode and relying on the radar, the radar is obscured and fails to detect an obstacle ahead of it. That would be a bad thing.”

Watch the video and/or head over to Wired if you'd like more details.

Source: Wired