Hackers Trick Tesla’s Autopilot System (w/video)

AUG 21 2016 BY ERIC LOVEDAY

Tesla Autopilot

There’s a highly detailed article posted over at Wired on how a group of researchers at the University of South Carolina, China’s Zhejiang University and the Chinese security firm Qihoo 360 managed to rather easily trick Tesla’s Autopilot system.

As Wired reports on the group of researchers:

“… [they] found that they could use off-the-shelf radio-, sound- and light-emitting tools to deceive Tesla’s autopilot sensors, in some cases causing the car’s computers to perceive an object where none existed, and in others to miss a real object in the Tesla’s path.”

We’re not going to provide the details here, as we certainly don’t want to give any help to those out there who might find it fun to mess with Autopilot, but we will say that tricking the Autopilot system seemed rather easy.

Wired adds:

“The demonstrations were performed mostly on a stationary car, in some cases required expensive equipment, and had varying degrees of success and reliability. But the research nonetheless hints at rough techniques that might be honed by malicious hackers to intentionally reproduce May’s deadly accident.”

Wenyuan Xu, the University of South Carolina professor who led the research team, stated:

“The worst case scenario would be that while the car is in self-driving mode and relying on the radar, the radar is obscured and fails to detect an obstacle ahead of it. That would be a bad thing.”

Watch the video and/or head over to Wired if you’d like more details:

Source: Wired


26 Comments on "Hackers Trick Tesla’s Autopilot System (w/video)"


Researchers have also discovered a simple new way to trick traditional drivers.

They created a device that makes use of a piece of lumber known as a 2×4, with dozens of heavy-gauge nails hammered through it. This device was then thrown into traffic, thereby tricking drivers with flattened tires into driving off into the grassy median.

Researchers are also working on another device to draw attention to and sow fear about how malicious individuals could do harm to drivers.

This low-tech device, which is round and heavy and used in a sport called “bowling,” would be deployed from a highway overpass to land on a car’s windshield, again tricking the driver into sliding off the road due to poor visibility through the smashed windshield and/or injuries sustained.

That’s pretty much what I was thinking too. Hackers don’t usually go out of their way to invite an attempted or successful manslaughter or murder charge.

Of course, before you can charge someone with manslaughter, you have to find the person who committed the crime. In the event that the car is hacked over a network, the hacker might not even be in the same country.

” in the event that the car is hacked over a network, the hacker might not even be in the same country.”

But that’s not even remotely what happened here. In this case the hackers were using a physical device and had to be present or nearby, just as one would be if “hacking” a car manually with a bowling ball or road spikes. “… [they] found that they could use off-the-shelf radio-, sound- and light-emitting tools to deceive Tesla’s autopilot sensors…”

It could be a serious problem if someone can successfully show that a Tesla can be hacked remotely like you suggest. That has yet to happen. There was a group of other hackers who could access a Tesla, but they had to have physical contact with said car before they could hack it remotely.

Your argument makes no sense. You describe real road hazards and equate them with malicious inputs to a computer program. That’s like saying there is no need to secure a bank’s website, because someone can always walk in to a branch with a mask and a gun. They’re not the same–not at all.

Well, no, it is actually completely valid, even in the context of your analogy. This “hacking” (a rather stupid and typically sloppy journalistic use of the term, in my view) did not alter or take control of any of the computer systems in the car. It merely confused the car’s sensors, which you could do by letting a potato crisp packet blow across the car’s path. This certainly worked very well to confuse *my* Tesla a month or so back, causing AP to slam on the brakes momentarily (and for the first time in about 8k miles of AP driving). I’m assuming the metallic film in the packet’s plastic made for a very good “chaff” or “window” simulator!

Thank you. Exactly. These “hackers” used a physical device locally to confuse the sensors. Has nothing to do with hacking into the Tesla computer remotely.

The headline is a little misleading. It’s technically not exaggerating, but the fact that these were hackers is irrelevant; they didn’t use their expertise in code to cause the Autopilot feature to fail.

You could just as well have a title that said, “Hackers Drop Bowling Ball Onto Tesla Windshield From Overpass Causing Crash.”

Lol. Good response to his logical-sounding scare-tactic concerns.

Yeah, it reminds me of the old Bond movies, where instead of simply shooting somebody with a gun, they try to kill Bond with some unreliable and overly complicated death machine.

Besides, there is already a device that can blind drivers without autopilot that could cause them to crash. It’s called a laser light. Way easier to find and purchase too.

This falls in the category of waking up in a hotel room to find out somebody stole one of your kidneys.

Good response, philip d! Thanks.

I understand there is a similar disturbing weakness in the sort of driving controller used in most cars. This sort of controller is called a “human”. If a simple hand-held laser is shined into the eyes of such a human controller, there is a danger of blinding the visual system of a human, causing it to lose control of the car and possibly crashing.

Clearly these “humans” need to be upgraded to eliminate this danger, and use of them for controlling cars should be suspended until this upgrade is performed. /snark

The difference is that if someone shines a laser in your eye, throws a spike strip on the ground, is on a grassy knoll shooting out tires, etc., there is a certain level of physical engagement required to take those actions; one that necessarily exposes you as the perpetrator.

In contrast, this requires no more physical commitment than a cellphone signal jammer; a troublemaker could throw this device in the hatchback of his heavily-tinted car and no one would be the wiser.

It doesn’t take too much imagination to think of a person who might be inclined to teach the “ruling class” a lesson by having their shiny toys rear-end people in bumper-to-bumper traffic (where AutoPilot would be very useful and distraction would be maximized).

Spider-Dan:

This article says, in part:

“… [they] found that they could use off-the-shelf radio-, sound- and light-emitting tools to deceive Tesla’s autopilot sensors…”

You appear to be talking about wirelessly hacking into the car’s computer and disrupting the functionality by messing with the computer’s settings or software. That’s not at all what this article is about; this article specifies disrupting the functionality of the car’s hardware sensors, by using visible light, radio frequency waves (radar waves), and sound (presumably ultrasonic sound in frequencies used by a Tesla car’s ultrasonic movement sensors) to blind the sensors.

And that’s what I was referring to, when I made the analogy of shining a laser in someone’s eyes to using a “light-emitting tool” to dazzle a Tesla car’s sensors.
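To make the sensor-dazzling point concrete, here is a minimal sketch in Python (illustrative only, not Tesla’s firmware; the echo timings are assumed numbers) of why an ultrasonic ranger is only as trustworthy as its echo timing:

```python
# Illustrative only -- not Tesla's code. An ultrasonic ranger infers
# distance from echo time-of-flight, so the *timing* of the echo is
# the whole measurement: fake the timing and you fake the obstacle.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def distance_from_echo(echo_delay_s: float) -> float:
    """Round-trip time-of-flight -> one-way distance in meters."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# Genuine echo from a wall 2 m away: ~11.7 ms round trip.
print(distance_from_echo(0.01166))  # ~2.0 m

# An attacker's transducer replies after only 2 ms: the sensor now
# "sees" a phantom obstacle ~0.34 m ahead.
print(distance_from_echo(0.002))    # ~0.34 m

# Conversely, drowning the real echo in ultrasonic noise means no
# valid delay is measured at all -- a real obstacle goes unreported.
```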

I could make a similar case for using loudspeakers to disrupt the ability of a human driver to hear things such as car horns and emergency sirens.

Now, that’s not at all to say you’re wrong about hackers possibly using a cellphone to hack into the car’s computer. But that’s not the subject of this article.

No, that’s not what I’m saying.

I specifically chose the comparison to a cellphone signal jammer, which doesn’t require you to hack someone else’s cellphone. It simply floods the radio airwaves to mess up the signal from your service provider.

Now, I can’t speak to the light sensors and whether there has to be some sort of obvious method to confuse them. But as far as radar (radio waves) and ultrasound go, you can bombard other cars with scramblers and no one around you would be able to tell the difference.
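A crude way to picture the radar case (an illustrative Python sketch; the detection threshold and power levels are hypothetical numbers, not any real radar’s specification): detection is just a signal-to-noise test, and a jammer defeats it by raising the noise floor rather than touching any software:

```python
# Illustrative sketch with hypothetical numbers: a radar "sees" a
# target only if the echo power clears the noise floor by some
# margin (SNR). A jammer never touches the car's software -- it just
# raises the noise floor until the real echo no longer clears it.

DETECTION_SNR_DB = 10.0  # assumed minimum signal-to-noise ratio

def target_detected(echo_power_db: float, noise_floor_db: float) -> bool:
    return (echo_power_db - noise_floor_db) >= DETECTION_SNR_DB

echo = -70.0                         # dBm, echo from a real car ahead
print(target_detected(echo, -95.0))  # True: quiet band, target seen
print(target_detected(echo, -65.0))  # False: jammer raised the floor,
                                     # and the same real target vanishes
```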

I will keep driving for the foreseeable future. Self driving is not an option I want in my car.

You don’t explain why. As philip d noted, it is also easy to trick human drivers. In addition, there are dozens of far easier methods to do you harm apart from messing with your car’s self driving features. There are no doubt emotional reasons for negative reactions to self driving, but stories of malicious hacking as described in this article are pure publicity stunts that say absolutely nothing about the safety of the technology.

Well, OK. First, I happen to really like to drive; ever since a young age I have always liked it and never found it a chore or boring. I just love to drive, even in traffic jams or on a commute. So that is the very simple main reason.

But there is also a certain apprehension in giving over the steering wheel to someone else, because I am already not so much at ease when I sit as a passenger in a car. For instance, when my mother-in-law is driving I feel kind of stressed. She tends to race in the city and go slow on the freeway, while I do exactly the opposite. She just drives by her own priorities no matter what, while I take a more defensive attitude when needed and, on the contrary, let it run when the space is open and clear. And that is giving over the steering wheel to another person, to whom you can still talk or yell “watch out,” so it doesn’t bode well if there is no person at all and the wheel is just in the hands of software in a box relying on sensors. I have not yet experienced…

Yours is a perfectly understandable, perfectly human reaction to the concept of surrendering control of the car you’re in to a machine. You can find similar reports from those who have “driven” one of Google’s self-driving cars.

And just as with those drivers-turned-passengers, it may be that once you’ve actually experienced having the computer in control, you might decide that it would be preferable, in many or most cases, to relax and let the computer drive.

Sure, there will always be a place for driving for pleasure. That’s why auto makers continue to make low-slung open-cockpit roadsters, and it’s why some buy them. Not for everyday use, but for an occasional spin for the pure pleasure of driving.

But I think once self-driving cars work reliably, the average driver will prefer to not have to deal with the stress of fighting traffic and constantly having to watch out for hazards on the road. Especially when it becomes clear to them that they’re actually safer if they let the computer drive.

http://www.fool.com/investing/general/2016/03/06/this-might-be-googles-biggest-hurdle-in-launching.aspx

I imagine Tesla is highly grateful for these efforts!

That seems like an overly complicated way to try to kill somebody, when it can be easily defeated by simply following the instructions and using Autopilot correctly, i.e., paying attention to the road.

The problem is that you don’t know you need to ignore/turn off Autopilot until it’s too late.

I think Tesla will thank them both.

My car often experiences this problem. It’s called kids, and they regularly lead to situations of not recognizing objects on the road or indicating things that turn out not to really be present.

My ELR w/ACC does this by itself. If the lead car comes to a full stop, then slowly moves forward, then stops again, several times over, the software can lose its ‘lock’ (the car icon disappears). It will then accelerate as if nothing is there.
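A toy model of that failure mode (hypothetical Python, not GM’s actual ACC logic; the gap threshold and speeds are made-up values):

```python
# Toy model of the behavior described above -- not GM's actual ACC
# software. If the tracker loses lock on the lead car (e.g., after
# repeated stop-and-creep), a naive controller reverts to "resume set
# speed" even though the lead car is still physically there.

def acc_command(lock_on_lead: bool, gap_m: float,
                set_speed: float, current_speed: float) -> str:
    if lock_on_lead:
        # Follow mode: hold a safe gap behind the tracked car.
        return "brake" if gap_m < 10.0 else "hold"
    # Lock lost: the controller believes the lane is clear.
    return "accelerate" if current_speed < set_speed else "hold"

# Lead car stops and creeps until the track drops out:
print(acc_command(True, 8.0, 30.0, 5.0))   # brake (lock held)
print(acc_command(False, 8.0, 30.0, 5.0))  # accelerate -- the hazard
```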

That’s why we still have real licensed drivers in the cars. The driver is one of the redundant backup systems.

Let’s face it: Tesla’s Autopilot is still pretty average; it missed a turning truck! Yes, I have no doubt that it will get better, and no doubt that one day I will get in the car and not think twice about the fact that it is driving by itself, but right now this is Tesla driver assist, NOT an autopilot. My feeling is that Autopilot makes a car a lot safer and is well worth engaging when driving, but people will keep finding holes in the way it works. The other thing is, systems like this exist on a lot of modern cars, and they are just as average on those cars as on a Tesla.

Wouldn’t they have to defeat more than one system? For instance, if they jammed the radar, couldn’t the cameras still see the object?

Yes, that’s true, but at the same time there could be a conflict between contradictory sensors, in which case the car would defer to whichever sensor has priority. If it’s the radar…
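To sketch that arbitration question (a hypothetical policy in Python, not any vendor’s actual fusion code), compare deferring to a priority sensor against acting on the worst-case report:

```python
# Hypothetical arbitration policies -- not any vendor's actual code.
# When radar and camera disagree, does the stack defer to a
# "priority" sensor, or act on the worst-case report?

def priority_fusion(radar_sees: bool, camera_sees: bool) -> bool:
    # Radar has priority: a jammed radar overrides the camera.
    return radar_sees

def conservative_fusion(radar_sees: bool, camera_sees: bool) -> bool:
    # Treat any positive report as an obstacle -- robust to one
    # blinded sensor, at the cost of more false-positive braking.
    return radar_sees or camera_sees

# Radar jammed (reports clear) while the camera still sees the object:
print(priority_fusion(False, True))      # False -- obstacle missed
print(conservative_fusion(False, True))  # True  -- obstacle caught
```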