Don’t Do This: Tesla Autopilot Tested Without Person In Driver’s Seat – Video

NOV 9 2015 BY MARK KANE

A Tesla Model S owner in the Netherlands, a self-proclaimed idiot (or at least framed as such by the uploader of this video to YouTube), posted a video of a Model S driving on the highway under the control of Tesla Autopilot, but without an occupant in the driver’s seat.

Well, that violates every rule of use for Tesla Autopilot, which isn’t autonomous: it only assists the driver and requires driver attention for safety reasons, so that control can be retaken quickly if needed.

The problem with such tomfoolery is that you risk not only your own life, but the lives of others too. The author of the video knew that what he did was wrong.

Perhaps stunts like this will (and should) lead to more concrete safety protocols being required to enable the feature in the future (beyond just instructions on use), such as sensors in the steering wheel to keep drivers where they are supposed to be while Autopilot is engaged. You can’t just tell ignorant people to smarten up, especially when other people’s lives are potentially on the line.

Video description:

“Total idiot tries Tesla autopilot without driver on Dutch Highway

Stupid idiot is in the back seat while his car is driving on autopilot on the highway at 83 km/h

Private property? NO
Road closed to all other traffic? NO
No bystanders in any potential path of the vehicle? NO
Passenger ready to brake, steer, or otherwise control the vehicle as needed for safety? NO
Speed as low as possible 18 MPH? NO
Tested with driver multiple times prior? NO
Straight section of road that probably could have been done like this without autopilot in the first place? NO”

Categories: Tesla, Videos


53 Comments on "Don’t Do This: Tesla Autopilot Tested Without Person In Driver’s Seat – Video"


I think this guy is an idiot too. However, Dutch highways are designed to be pretty straightforward pieces of road. Usually the only reason I have to do anything on a Dutch highway is because people are idiots who think it’s OK to push me into action.

If the autopilot can’t handle this piece of road without human intervention then it shouldn’t exist all together. Did anyone also notice that it went rather well?

The most dangerous situations on Dutch highways come from people getting bored. The highways here are made for self-driving vehicles.

Oh yeah, the article mentioned something about endangering bystanders. Crossing highways on foot, or even going near them, is illegal here. So really, no bystanders.

I’m not going to say that this guy is doing the right thing, but please don’t exaggerate if you don’t know the situation.

“If the autopilot can’t handle this piece of road without human intervention then it shouldn’t exist all together. ”

You JUST DON’T GET IT, do you? Yes, what you’re saying is true, but that’s why it’s IN BETA FOR CRYING OUT LOUD. You should ALWAYS be ready to grab the wheel at ALL TIMES until such a day and age comes when self-driving cars are deemed absolutely safe to leave unattended, REGARDLESS of ‘how easy the road is to drive’.

Wow, are you from some lobby organisation or something? What is wrong with you? I didn’t say autopilot shouldn’t exist. You cherry-picked my comment and got angry about it. I’m a person who deliberately refuses to see anything as black or white, and that is reflected in my post.

I think you just don’t get it.

There are lots of places in the United States where there are miles of open road that are straight, with no one on them. But the reality is I wouldn’t do something like this: what if there is broken metal on the road, or a deer jumps out across the highway? I would be worried about the liability if you hit someone or someone in the car got injured.

You are as much a complete and utter fool as the lunatic that posted the video. These utterly moronic antics not only endanger the safety of normal, law-abiding road users but also seriously risk the withdrawal of AP by Tesla, and make it more difficult for proposed future pro-autonomous-vehicle legislation to succeed. Please keep your childish comments to yourself in future. MW

That might be why Volvo has limited the top speed of its autopilot to 30 mph for the moment.

I feel like the last thing we should do is give this video more exposure. In the end, the video was probably created with the intention of getting some real ad revenue, plus there is always the possibility of copy cats.

I must admit I’m perplexed as to why Tesla would allow AutoPilot to function when there is no seat occupancy and/or the seat belt is not in use, especially during a beta. The sensors are there, it’s just a few lines of code.

The i3 can auto-steer when driving under 40km/h in a traffic jam (EU-only and only works on the Autobahn in Germany). It will start beeping shortly after both hands are off the wheel to force you to always keep them there.

My guess is that either voluntarily or forced, Tesla will have to add something similar. In the videos I’ve seen it only required hands on the wheel when the car thinks it’s in a challenging situation and a single touch every couple of minutes to show the car you’re still alive.

This. Why Tesla would not program some sort of “driver must be present” using the seat sensors is puzzling.

To me it’s an example of the lack of a thorough legal review of features in a small start-up like Tesla, compared to the big automakers.

V7 Autopilot will not engage, unless a seat belt is fastened. So, there’s your good intention, and perhaps a fastened belt we cannot see. Also, reverse deactivates without a butt in the seat (seat sensor). Anyone who lifts off a front seat, to turn and back up, has probably experienced a Tesla going into neutral.

It’s unfortunate. While I wouldn’t go around a track, with one hand at 12 O’clock, a stunt like this means one, or two, hands may have to be on the wheel, at all times, potentially even in 5 mph stop and go. I don’t agree with that, and am sure some here would disagree with me.

Well in the posted video, the fasten seatbelt indicator is blinking, so it doesn’t look like he needed to do that.

A future revision should use the seat-based detection as well. If a person is not seatbelted and is not detected as sitting in the seat for more than 5 seconds, place hazard lights on and slowly coast to a stop, until autopilot is disengaged.
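The fail-safe proposed above (hazards on and coast to a stop if the driver is absent for more than 5 seconds) can be sketched as a small monitor loop. This is purely a hypothetical illustration: the `seat_occupied`/`belt_fastened` signals, the action strings, and the 5-second grace period are assumptions for the sake of the sketch, not Tesla’s actual software.

```python
import time

# Hypothetical driver-presence fail-safe, as described in the comment above.
# Sensor names and actions are illustrative assumptions, not a real vehicle API.

GRACE_PERIOD_S = 5.0

class DriverPresenceMonitor:
    """Tracks how long the driver's seat has been empty or unbelted."""

    def __init__(self, grace_period_s=GRACE_PERIOD_S, clock=time.monotonic):
        self.grace_period_s = grace_period_s
        self.clock = clock  # injectable clock makes the logic testable
        self.absent_since = None  # timestamp when the driver first went missing

    def update(self, seat_occupied: bool, belt_fastened: bool) -> str:
        """Return the action the autopilot should take on this tick."""
        if seat_occupied and belt_fastened:
            self.absent_since = None
            return "NORMAL"
        now = self.clock()
        if self.absent_since is None:
            self.absent_since = now
        if now - self.absent_since > self.grace_period_s:
            # Driver missing too long: hazards on, coast to a controlled stop.
            return "HAZARDS_AND_COAST_TO_STOP"
        return "WARN"  # audible/visual warning during the grace period
```

With an injectable clock, the grace-period behavior can be exercised without real sensors; the point of the sketch is that the check is only a few lines of logic layered on sensors the car already has.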

I see a New Download coming from

My thoughts exactly – the seat sensor shuts the motor off any time you are lifting your bottom to get a better look behind while at a standstill.

In autopilot beta, the car should bombard you with warning sounds if you leave the seat, then go into hazard mode after 5-10 seconds, pull over, and come to a full stop.

The video is already a copy of the original video.

“. . . plus there is always the possibility of copy cats.”

Not just copycat videos, but one-upmanship videos taking it a step further to outdo the previous one.

don said:

“I must admit I’m perplexed as to why Tesla would allow AutoPilot to function when there is no seat occupancy and/or the seat belt is not in use, especially during a beta.”

You have greatly underestimated the lengths to which this very determined idiot went to, to defeat the safety systems. According to discussion on the Tesla Motors Forum, the idiot had to buckle the seat belt on the empty seat to get Tesla Autopilot to function. Some have said you also have to fool the seat’s weight sensor, but I don’t know if that’s true; you can see even in this brief video there’s nothing in the seat to put pressure on the weight sensor. Of course, that could have been defeated by a bit of electrical engineering.

“Nothing is foolproof to a sufficiently capable fool.” — Murphy’s Laws

“You have greatly underestimated the lengths to which this very determined idiot went to, to defeat the safety systems. According to discussion on the Tesla Motors Forum, the idiot had to buckle the seat belt on the empty seat to get Tesla Autopilot to function.”

Greatly underestimated the lengths he went to? All he did was buckle the driver’s seat belt to defeat the safety system!

I suppose this is somewhat similar to what supposed “White Hats” do when trying to discover computer security flaws. Perhaps we need similar devious testers for finding flaws in complex autonomous software, too?

Clearly, more limitations on how, when and where the driver can utilize AutoPilot and how it reacts, will need to be implemented soon…

As I commented on the video itself, seat sensors are not a good idea. Tesla is already planning to use steering wheel sensors, which are considerably harder to trick than simply weighing down the driver’s seat with a heavy bag.

If one has to keep hands on the wheel at all times, why go with autopilot? Why not save money on the option, and go back to steering and texting?

Do you think a greater number of people will stop jumping in the back seat?

Anti-See Through said: “As I commented on the video itself, seat sensors are not a good idea. Tesla is already planning to use steering wheel sensors, which are considerably harder to trick than simply weighing down the driver’s seat with a heavy bag.”

Among the many things discussed on the Tesla Motors Forum, on the subject of Autopilot, is that people have already figured out how to defeat a steering wheel sensor on some other car, by strapping a water bottle to the wheel, to let the wheel “feel” a pull on it. It boggles my mind that people would not only do this, but would actually encourage others to do so by posting instructions on the Internet!

I don’t think Tesla can be expected to overcome the efforts of very determined idiots to defeat what should be foolproof systems. I’m shocked that Tesla released Autosteer (Beta) with no foolproofing at all, but when it reaches the level that someone really does have to make a conscious effort to defeat safety systems — which is what happened here — then there is only one entity responsible, and that is the idiot in question. The auto maker cannot in any way…
That is a very, very naive comment. Have you never heard of product liability lawsuits, which are filed if you are injured by a product that was defectively designed, manufactured, or labeled? In this case the plaintiff’s lawyer will argue that Tesla’s Autopilot was defectively designed because it can function without anyone in the driver’s seat, just by buckling the driver’s seat belt.

You also sue the “deep pockets” defendant (Tesla) because in the vast majority of states, if not all states, there is “joint and several liability.” Joint and several liability means that if more than one defendant is found to be at fault, then each defendant is liable for the ENTIRE judgement, no matter the percentage of fault assigned to them. In other words, if there is a judgement for the plaintiff for $10 million, and the idiot in the video is found to be 99% at fault for causing an accident and Tesla is found to be 1% at fault, both the driver and Tesla are liable for the entire amount of the judgement. Since Tesla is the “deep pockets,” the plaintiff will seek to collect the entire $10 million judgement from Tesla. However, Tesla is…

Guys, I really find this stranger than anything I thought could happen, in that I had originally made an artwork to make fun of self-driving cars.

The catch, I always thought, was that someone would have to be sitting in the driver’s seat to get the car to work. I also find it amazing that the Tesla would do this.

Almost all comments so far regarding the use of the seat sensor to disable the auto-pilot aren’t thought through properly. Even if you had to be sitting in the seat to activate it, someone could always just use a heavy bag to fool the sensor. In fact there is literally nothing that Tesla could do that would improve the safety of the vehicle in this situation. Whatever they did, someone would find a way round it, and at the end of the day the safest way for the car to act in the case of a complete idiot like this is to drive the car for him until he comes to his senses.

The main point to remember here is the car/software isn’t in any way at fault; in fact it made the best of a bad situation. This idiot did something incredibly stupid and dangerous and should probably spend a little time in jail for it, but the car did the right thing (drive the car as best it could). Don’t forget you can’t stop everyone from being idiots; he could have got into the back and then driven the car using rope and broom handles. Not a good idea,…

Having more than one sensor to fool provides legal proof that the driver knowingly circumvented the system’s intended mode of operation.

It’s not about whether or not it can be fooled, it’s about having enough deterrents to demonstrate legally that the driver knowingly engaged in stupid behavior, and that Tesla does not share liability, given the hoops the driver had to jump through to force the car to drive by itself with nobody in the driver’s seat.

It’s also about making it more difficult (and therefore less likely) for someone to do a stunt like this. The seatbelt sensor is easy for anyone to defeat, anytime. If the driver was required to maneuver around a 25 kg weight as well, they might not have done it. Yes, they could go to the trouble if they really wanted to, but it would not be spontaneous. It would require more preparation and determination, which means it would be less likely to happen.

Right, yet in this particular video, the seat belt light is blinking, suggesting they didn’t even have to circumvent that sensor.

Or maybe the blinking seat belt indicator shows the idiot was sitting in the passenger’s seat without using the seat belt, after sliding over from the driver’s seat and re-buckling the seat belt on that seat so he could activate the Autopilot in an incredibly dangerous stunt, endangering not only himself, but everyone else sharing the roads with the idiot.

I can’t speak from experience, but certainly it has been said in discussion on the Tesla Motors Forum that you have to have the driver’s seat belt buckled before you can activate Autopilot.


TESLA!! Please add the seat weight sensor requirement to the software asap.

I don’t want to lose this great feature due to the ‘curious’ users who make their cars do this and then the lawyers who will try to go after Tesla.

This video also proves that someone could die while driving the Tesla, and the Tesla would drive the dead man to their destination.

Major mistake by Tesla not to have a weight sensor in the driver’s seat to enable/disable this feature. Cars already have a sensor in the passenger seat for airbags. This autopilot appears rushed to market without accounting for the idiot factor, which other carmakers take the time to vet.

People could just put weight in the front seat, so that doesn’t do anything more than making sure the seatbelt is fastened.

I am a firm believer in Darwinism. Let nature take its course. This individual needs to be removed from the gene pool, and undoubtedly will be soon enough. The tragedy is that he may have already procreated.

I’m sure Tesla will correct this, but really, I’m also sure they assumed (wrongly) that the average Tesla customer was smarter than a cockroach, which obviously this individual isn’t.

EVGuy, as long as you are the innocent bystander they kill in the accident, I have no problem with your position on cleaning out the gene pool. You can put your own life on the line for your own elimination from the gene pool as much as you want.

Can you guarantee that it will be you? If you can’t guarantee that it will be you that is killed, who are you to speak on behalf of whatever poor innocent bystander is killed by “letting nature take its course”?

New Tesla seat and hands on the wheel sensors.

New video with new tricks to override that.

Who cares. Peoples does whats peoples does.

Amazing that you are so laissez-faire with other people’s lives, the ones not in the Tesla.

Amazing that you are so easily amazed.

Stop being stupid, people.

This is why we can’t have nice things.

“The difference between stupidity and genius is that genius has its limits.”
— Albert Einstein

. . . while stupidity knows no bounds. 🙁

He has a Tesla, he has a nice thing

They need eye sensors, to see if there is a driver, and if that driver is awake and paying attention.

Yes, this technology exists.

You can’t discriminate against robots that need to commute to work and back. 😉

“…such as sensors in the steering wheel to keep drivers where they are supposed to be while Autopilot is engaged.”

That’s not necessary and too complex. A simpler and more elegant solution would be pressure sensors in the seat that would not allow the autopilot to be engaged unless there was someone actually sitting in it. It would be similar to the sensors used in passenger seats to disengage the passenger airbag.

Ok, so apparently I was not first with the seat sensor idea. However, to all those who think it wouldn’t work, I would argue that there is no foolproof/idiot-proof method of stopping some dumb@$$ from pulling a stunt like this by fooling the sensors. However, anything, be it seat sensors or steering wheel sensors, would be a significant improvement over nothing and would cut down on the number of people who try such a stunt by a lot.

As for the reason why I think seat sensors are more effective: if the steering wheel sensors require your hands to be on the steering wheel, then that defeats the point of autopilot in the first place. If it could be designed somehow to not require your hands (i.e. some sort of IR sensor) then it wouldn’t need to be on the steering wheel and would need to account for things like people leaning over and to the side. That is a bit more challenging than a pressure sensor in the seat.

I haven’t watched the video, as everyone seems to say it’s an idiotic piece.

I heard that some luxury cars have a heartbeat sensor to protect their delicate drivers from heart attacks.
Could this be a possible idiot-defeating feature?

Quite literally any safety system can be defeated by a sufficiently imaginative fool. A heartbeat sensor could be fooled by an audio recording of a heartbeat, played on a good sound system.

“Nothing is foolproof to a sufficiently capable fool.” — Murphy’s Laws

The story of the guy that used a wild raccoon to bypass the breathalyzer in his car comes to mind. (even if it was a hoax)

I’m starting to think this whole affair of self-driving features is seriously getting out of hand. If Tesla can make a 500-mile-range EV, well, they should do it; at least that is something useful and really needed. But this self-driving stuff is not what we need, so they should just switch back to working on more range before anyone gets hurt in a very real accident. And if someone says it is only an option, well, then let the 500-mile range also be only an option.

It’s understandable that kids like to show off with a “Look Ma, no hands!” stunt when they’re kids. We would hope that those adult enough to have a driver’s license would be old enough not to do the same.

To paraphrase what one InsideEVs commentator said in response to an earlier article on a similar subject:

This demonstrates that, limited though it is, Tesla Autopilot is already a better driver than some humans.

I mean, isn’t this how it’s going to be in the future?

Isn’t this what everybody wants? Lol

Now we are going to see more pranks like this.

The ethics here are rather simple: Tesla released software which is easily used in a dangerous manner. Just like constructing a pool in the backyard without a fence, Tesla is liable for deaths from trivial misuse of this feature. They should learn from other manufacturers, like Mercedes, which have added reasonable safeguards to their systems. Sometimes I don’t know if Tesla has any clue at all.

In the lawyers’ Mecca, the US, I sense legal disasters with self-driving vehicles.