Tesla Model S Owners Post Autopilot Videos

OCT 16 2015 BY ERIC LOVEDAY

Look Ma, No Hands!

By now, we’ve all seen some Autopilot videos and read Tesla’s recap of the technology, but most of those were from controlled media drives of the Tesla Model S with firmware 7.0.

How about we look at some Autopilot videos (below) posted by Model S owners?

We think these videos present a more realistic picture of how Autopilot performs in the real world.

*Editor’s Note: Several of these videos show improper use of Autopilot. According to Tesla, the driver’s hands must remain on the wheel at all times. Furthermore, Autopilot is designed only for highway/interstate use, not for back roads, side streets, etc. Lastly, the “driver is still responsible for, and ultimately in control of, the car.” Please use any and all autonomous-driving features correctly.

On that note, here’s video of Elon Musk explaining Autopilot and responsibility:

On to the driver videos:

Video description:

Demo dead man’s curve.

Video description:

Finally getting to use and try out Tesla’s autopilot functionality with my P85D.

Works pretty well, but still definitely needs to be babysat outside of the interstate or similar roads. Don’t take your eyes off of the road and be ready to override it. Be safe!

Was able to do one of my normal interstate highway drives without ever touching the wheel or pedals though. 🙂

Video description:

I am the proud owner of a 2015 Tesla SP90D, purchased with all available options. It is the best car I have ever owned and I love it dearly. I also own a large chunk of Tesla stock. Today my car received the anticipated version 7 software update and I was anxious to try it out near my home. After several seconds of successful hands-free driving, I admit I started to ignore the warning to keep my hands on the wheel so that I could record the moment to share with friends. That’s when all hell broke loose.

My car was tracking the car in front of me to successfully steer, but I was going slower and that car eventually drifted far ahead of me. Shortly after that, another car was coming in my car’s direction from the opposite side of the road. I can only guess at what happened next. My car apparently thought the oncoming car was supposed to be followed, and suddenly veered to the left, crossing the double-yellow road divider line right into its path. Had I not reacted quickly to jerk the steering wheel in the opposite direction, a devastating head-on collision would have occurred.

I post this in the hopes that it will prevent any losses to anyone using AutoPilot in similar circumstances and in the sincere hope that Tesla can address the issue as soon as possible if the software can somehow be improved in detecting both oncoming vehicles and cross-traffic lane dividers to avoid steering into oncoming traffic.

Video description:

Testing the Tesla Autopilot the morning of its release.

Video description:

Brief demo of the Tesla Autopilot’s Auto Steering and Traffic Aware Cruise Control.

Video description:

Autopilot beta was generally released today. I had enough time to download the new software and try it out… It works! A bit wobbly, but it works. Impressed!

In response to seeing a flood of Tesla Autopilot videos hit the Internet, InsideEVs contributor Michael Beinenson created this humorous video of the “unofficial autopilot” mode found on the Nissan LEAF:

52 Comments on "Tesla Model S Owners Post Autopilot Videos"

Knee-san LEAF, hands-free driving.

LOL!

Evidence is mounting that firmware update 7.0 is not ready for prime time. The best example yet can be found over at Road & Track, where a guy in Portland is showing off Autopilot and it’s working great on the freeway until he takes the offramp and the lines of the road change. Ironically, his passenger is saying a disclaimer should pop up that says, “This can kill you,” when one second later the Model S jerks the steering wheel practically out of the driver’s hands and heads for the guardrail!

I’m not great on Twitter. Can someone reading this who loves electric cars and Tesla please tweet Elon Musk and tell him he needs to pull back 7.0 with an update that puts it on hold until they can address these issues?! I’m not kidding. One thing about successful companies is they can get a bit arrogant.

I just called Tesla Tech and the guy says, “Hey, I’m aware of it, BUT YOU ASKED FOR IT!” I asked if there was any talk there of pulling it back and he says, “Hey, it’s already out there, it’s in customers’ cars.” I reminded him that when an “Auto-Lowering” Tesla…

James said:
“This literally could kill Tesla. No amount of disclaimers or operator instructions can protect them from the ONSLAUGHT of sensational stories once this kills someone or injures innocent people!”

It’s like a Tesla short seller’s wet dream! 🙁

I think it is a game changer. I love it.

There is no way that a traditional, conservative automaker would do a wide roll-out of a beta product that actually steers your car for you. It’s fascinating the way Tesla operates. Exciting and a bit scary.

+1

Yep those Tesla employees really do have balls like coconuts!

It is worth remembering, though: you snooze, you lose. Sometimes you have to take the risk in order to build something genuinely amazing. The data Tesla will have gathered in the first week will outstrip that of all other carmakers. If they can keep people from killing themselves long enough, this will quickly grow into THE selling point for Tesla.

The quickest way to develop anything is to get people using it asap, then refine it according to feedback.

Oh man… If we’ve already had a near head-on collision on the first day, we’re definitely going to have an accident soon and Faux “News” will be all over it. Ugh. It’s very scary that it would decide to drive over a double yellow into an oncoming car despite both forward radar and the camera that’s supposed to identify road lines. I think this indicates there are serious problems with the system (or a malfunction in that particular car’s sensors) and that Tesla would do well to recall the feature and do more development.

Even if the system were better I can’t help but wonder if these systems are actually making us less safe… In the comment thread on a previous article, an SUV owner backed up into another car at a gas station because they didn’t bother to look, then said “The car was supposed to stop before it hit anything!” Despite all the warnings that drivers are ultimately responsible, too many people will be lulled into trusting these systems and when they fail it can be catastrophic.

The problem with crossing the double yellow line isn’t the fault of the autonomous driving software; it’s the fault of the driver.

Tesla clearly states that the current version should not be used anywhere but on a divided freeway with all traffic heading in the same direction. The same goes for driving in town: it currently isn’t designed to recognize traffic lights, etc. If someone posted a video of the software failing in an urban scenario, it would only show that it was the driver’s fault for misusing the Autopilot features.

Of course people will misuse the autonomous driving features but Tesla clearly states that it isn’t designed for those driving scenarios. People also drive with their phones in their faces while texting and eating hamburgers while driving with their knees.

If doing any of these while driving results in an accident whether it be phone, hamburger or improper use of autopilot then those involved should be prosecuted to the fullest extent of the law.

“Tesla clearly states that the current version should not be used anywhere but on a divided freeway with all traffic heading in the same direction.”

Why isn’t Tesla’s Autopilot programmed to detect and identify that the car is not on a divided freeway? Why isn’t Autopilot programmed to refuse to engage, or to safely disengage, when it detects that the car is not on a divided freeway? The car should know when it’s not on a divided highway, even without input from the radar, camera, and sensors, by using its GPS and mapping data.

The problem with crossing the double yellow line IS the fault of the autonomous driving software, because the double yellow lines should have been a dead giveaway to the Autopilot software that the car is not on a divided highway.
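For illustration, here is a minimal Python sketch of the kind of engagement gate this commenter is describing: refuse Autosteer off divided highways. The map attributes, road classes, and lane-marking flag are hypothetical stand-ins, not Tesla’s actual software or data model.

```python
# Hypothetical sketch only: gate Autosteer engagement on map data and
# lane-marking cues. Field names and the "motorway" road class are
# illustrative assumptions, not Tesla's real API.
from dataclasses import dataclass

@dataclass
class RoadSegment:
    road_class: str   # e.g. "motorway", "secondary", "residential"
    is_divided: bool  # physical median separates opposing traffic

def may_engage_autosteer(segment: RoadSegment,
                         double_yellow_on_left: bool) -> bool:
    """Allow Autosteer only on divided, limited-access roads.

    A double yellow line to the driver's left means opposing traffic
    shares the roadway, so engagement is refused no matter what the
    map data says.
    """
    if double_yellow_on_left:
        return False
    return segment.is_divided and segment.road_class == "motorway"

# A two-lane country road with a double yellow line never qualifies.
rural = RoadSegment(road_class="secondary", is_divided=False)
assert not may_engage_autosteer(rural, double_yellow_on_left=True)
```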

In September 1988, two Akron, Ohio-based carpet layers named Gordon Falker and Gregory Roach were severely burned when a three-and-a-half-gallon container of carpet adhesive ignited as the hot water heater it was sitting next to kicked on. Both men felt the warning label on the back of the can was insufficient. Words like “flammable” and “keep away from heat” didn’t prepare them for the explosion. They filed suit against the adhesive manufacturer, Para-Chem. A jury obviously agreed, since the men were awarded $8 million for their troubles.

Gosh yes, sven, you’re right. Clearly it was the fault of Para-Chem for not installing a detector on that container which would detect the presence of an open flame, and cause little legs to come out and scamper the can away to safety. [/sarcasm]

* * * * *

Part of growing up, part of becoming an adult, is accepting responsibility for your own actions. When we try to blame others for the results of our own foolish or careless behavior, despite warnings, then what we’re saying is that we think we should be treated like children, not adults. And, sven, as they say: “This is why we can’t…

+1

and let me add the following:

If I buy a chainsaw, ignore all warnings, and instead choose to use my mobile phone while cutting down a tree… oops, leg… Well, OK. I think by now everyone gets the point.

I started worrying about this whole responsibility discussion when they started printing “Warning! Coffee can be hot!” We are lost… We are completely lost…

I really hope that people use AP in a responsible way. However, I really doubt that this will be the case.

All the people complaining that AP does not work in situations for which it is not built show just one thing: AI is already superior.

Heisenberght said:

“All the people complaining that AP does not work in situations for which it is not built show just one thing: AI is already superior.”

Dude, you just won the Internet! 🙂 🙂 🙂

Wish I’d said that.

A trivial amount of programming is needed to have the Autopilot not engage when the GPS mapping determines that the car is not on a divided highway, or when the sensors detect a double yellow line to the left of the driver’s-side front wheel.

Toyota didn’t have a brake-pedal override of the gas pedal to prevent unintended acceleration, and it cost them billions of dollars in fines, lawsuit judgments/settlements, and lost sales. Afterwards, Toyota belatedly made a simple software update to the ECUs of Toyota cars to give them a brake-pedal override function. Likewise, Tesla’s Autopilot exposes Tesla to a potential multi-billion-dollar liability that can be easily prevented by a low-cost OTA software update. Only a fool like you would think it is wise not to implement this software safeguard. Hopefully, Elon will have an epiphany before someone gets killed by Tesla’s Autopilot.

Trollmi-Trollyu said: “And, sven, as they say: ‘This is why we can’t have nice things.’ Because all too many people aren’t adult enough to accept responsibility for their own actions. Sadly, you seem to be one of those.”

I would like Tesla’s Autopilot to be idiot-proof, so that people like you can…

EDIT: I should have watched the video instead of trusting the description… but what actually happened is that the car decided it didn’t know what to do and disengaged Autopilot along with a warning sound. For some reason the wheel also turned left significantly while the warning beep was sounding, but it’s not clear why. MAYBE the collision-avoidance system that’s always on, even with Autopilot turned off, would have actually steered away from the oncoming car if they had gotten closer…

Either way, the ultimate problem is people ignoring instructions to keep hands on the wheel at all times and putting too much trust in systems that are far from perfect. Sun glare, harsh shadows from trees, weather wear, and especially headlights on a rainy road can make road lines almost impossible for any camera to distinguish in some cases, and there will always be times when the system can suddenly fail. This dream of 100% reliable self-driving cars seems very unrealistic to me without some major advancement in sensor technology.

It’s AI that’s missing, not sensors. The Model S’s sensors are already far superior to human senses. The problem is that the lights are on, but nobody’s home, as far as the software goes.

Ironic that your name is Dragon. My computer is one year old, and I believe it came with one of the latest versions of the Dragon voice-control software installed. It’s such a joke, it’s not even funny. I try to dictate to it and it gets half of my words wrong.

If we have been struggling with voice-control software since the mid-nineties and it’s still overtly flawed – never use voice in the car if anyone is talking – and some still struggle with wind noise and all background noise… then how the heck is autonomous software that is supposed to take hold of your entire drive from start to finish going to work? There are redundancies in NASA rockets and they still fail.

There is just no way I am going to let go of the wheel and let this stuff drive for me. The liabilities are enough to sink Tesla. I totally agree: this stuff is not for prime time. There is already another video not shown here, where the guy is doing fine with Autopilot on the freeway, but takes an offramp and his Model S tries to jerk him into the guardrail! Tesla has to pull…

Yes, your Dragon voice recognition software is awful. Funny, that, considering my Cortana and Google Now are REALLY accurate, even in rooms with ambient noise. Stop being so pessimistic. The technology can easily be ready for prime time in a decade.

There ARE obvious exceptions. Here in the UK, for example, you have a lot of roundabouts that lack lane markings, or where lane markings disappear. To provide an example, can anyone guess how Autopilot could function at the Arc de Triomphe? There will always be situations where driver input is needed, but for 99% of roads we CAN have autopilot and driverless-ready cars by 2030.

Dragon said: “This dream of 100% reliable self-driving cars seems very unrealistic to me without some major advancement in sensor technology.”

We don’t need “100% reliable” self-driving cars. We need self-driving cars which have a lower accident rate than human-driven cars. Humans don’t perceive the world the same way that cameras and radar do. The human brain doesn’t work the way computers do, and the way humans make driving decisions will be different than the way autonomous driving software does. If the goal is to make sure that no autonomous car ever makes a mistake that a human wouldn’t make, then we might as well give up now. But practically, that shouldn’t be the goal. The goal should be a lower overall accident rate.

Now, that said, there almost certainly is room for improvement in the lane-keeping feature of Autopilot. From what was reported in posts above, it appears that for some reason the Autopilot jerked the wheel sharply as it disengaged, after the driver took the exit ramp off the freeway. That seems to be a glitch in the system which needs to be corrected. However, it still was a misuse of Autopilot, because the driver should…
Dragon said: “It’s very scary that it would decide to drive over a double yellow into an oncoming car despite both forward radar and the camera that’s supposed to identify road lines. I think this indicates there are serious problems with the system (or a malfunction in that particular car’s sensors) and that Tesla would do well to recall the feature and do more development.”

Okay, I give up; yes, Tesla should update the software. They should add a screen that appears after the driver tells the car to update the software to enable Autopilot. The screen should read:

~~~~~~~~~~~~~~~~~~~~~~~~~~~

Before we activate Autopilot for you, please answer the following question: Can you use Autopilot anywhere other than on a multi-lane divided highway, where all traffic moves in the same direction?

(A) No, I should never use Autopilot anywhere else, as it would pose a real danger not only to myself, but to others on the road.

(B) Well, if I’m really eager to try out Autopilot, then it’s okay to ignore the danger and use it on a two-lane road or highway, so I can be the first to post about it to my friends online.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

If the user chooses…

Re: Dragon

Read the YouTube comments; the guy was at fault, not the car. End of story.

Unfortunately, it’s not the end of the story. Sooner or later (probably sooner) someone will get into an accident. And despite it being their fault for ignoring Tesla’s instructions, the conservative arm of the media is going to cherry-pick the facts and make it Tesla’s fault. In some ways I think Tesla is being irresponsible by releasing this feature at a large scale instead of very gradually, to see what actually happens in the real world. They also released that marketing video of a guy driving around city streets, where they repeatedly say it’s not meant for use on city streets… yet here’s an official Tesla video demonstrating how cool it is on city streets. Then they wonder why someone thinks it’s safe to use on a two-lane country road? They’re totally ignoring human nature. Really, I don’t think this will end well…

No, as a Tesla fan, I have to say this feature should never have been released if, on the first day, it almost kills someone. What if this guy was on a freeway and a drunk driver was going in the opposite direction? Elon is pushing it; if someone gets killed, it’s going to be terrible PR for Tesla and for the autonomous-driving initiative. Why is this autopilot needed anyway if you need to pay complete attention to it? I like what Google is doing more: they are either going all the way or nothing.

Is there any way to prevent someone who is using your car from activating these features?

It has been shown time and time again in commercial aviation that autopilot features breed complacency and render pilots unable to deal with situations when the autopilot disengages. Pilots lose their skills gradually over time. The Air France crash is a prime example. Arguably, aircraft would be safer if no autopilot existed at all.

You obviously don’t know what you’re talking about.

Yep. This is known as the automation paradox.

The automation does better in the general case, but the lack of practice makes the pilots worse. This comes up when the pilots have to take command in emergencies.

That’s true. Though the Air France crash is a poor example, as that was caused by frozen air-pressure (pitot) sensors misreporting the plane’s airspeed. But there have been overflights, where the pilots went past their destinations simply due to inattention.

Tesla’s mapping system probably knows which roads are 4-lane divided. If so, it could probably determine when Autopilot is appropriate to activate.

If I were Elon, some of these videos would make me cringe, so I might temporarily yank the feature.

It sounds like misuse will cause some accidents fairly quickly…

And then we’ll get to hear what the insurance companies think about it. I wonder if Tesla and the other companies working on this consulted with them…

All the monkeys are playing. Some might get hurt from misunderstanding the Autopilot’s current limitations and the proper environment in which to activate it. We’ll see…

Gutsy decision for a small startup? Not so much when they’ve carefully defined the operator as the risk-taker at this stage of development. I think it would be harder to pull off if they were a well-established automaker.

Two examples of Mercedes-Benz autopilot getting misused when it came out:

http://www.liveleak.com/view?i=507_1420300201

So how old can a Tesla Model S be and still use this update? When did they start putting in the steering control and the needed sensors?

Cars with the appropriate HW started being delivered about this time last year–a little bit before the official P85D and Autopilot announcement was made.

October 2014 models.

Good luck to the beta-testers of this feature. Be safe. Through this, Tesla will be able to refine cases, like: don’t ever auto-follow on a lane change, and disengage auto-follow after a certain distance between vehicles is reached, depending on speed.
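As a rough illustration of that second rule, here is a hypothetical Python sketch in which auto-follow drops out once the gap to the lead car exceeds a speed-scaled threshold. The 3-second time gap and 30 m floor are invented values for illustration, not anything Tesla has published.

```python
# Hypothetical sketch: drop auto-follow when the lead vehicle is too
# far ahead to be a reliable steering reference. The time gap and
# minimum distance below are invented values, not Tesla parameters.

def should_drop_follow(gap_m: float, speed_mps: float,
                       time_gap_s: float = 3.0,
                       min_gap_m: float = 30.0) -> bool:
    """Return True when the gap exceeds a speed-scaled threshold."""
    threshold = max(min_gap_m, speed_mps * time_gap_s)
    return gap_m > threshold

# At 25 m/s (~56 mph) the threshold is 75 m, so a 90 m gap drops out;
# at city speed (10 m/s) the 30 m floor keeps a 25 m gap engaged.
assert should_drop_follow(gap_m=90.0, speed_mps=25.0)
assert not should_drop_follow(gap_m=25.0, speed_mps=10.0)
```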

I foresee insurance premiums increasing.

If it’s only meant to be used on interstates, then Tesla should prevent Autopilot from enabling anywhere else. Each car has GPS to validate the surface terrain.

Seems to me there should be a much simpler solution. For example, if you drive on a freeway using cruise control, that is usually disengaged when you take the exit ramp, because you tap the brake to slow down. There should be a similar simple way for the car to recognize that the driver has done something to indicate he’s left the freeway.

I’m not saying that using GPS coordinates to identify when a driver is (or isn’t) on a limited-access highway is a bad idea; I’m just saying that will be a lot more complex to implement, and with greatly increased complexity comes a great deal of potential for errors. Can you say “Apple Maps”?
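To make the cruise-control analogy concrete, here is a minimal Python sketch of that simpler disengage rule: any overt driver input drops the system back to manual control. The state and event names are hypothetical, chosen only for illustration.

```python
# Hypothetical sketch of the cruise-control-style rule: explicit
# driver inputs immediately hand control back, the way tapping the
# brake cancels cruise control. Event names are illustrative only.

DISENGAGE_EVENTS = {"brake_tap", "steering_override", "stalk_pull"}

def next_state(state: str, event: str) -> str:
    """Fall back from 'engaged' to 'manual' on any driver takeover."""
    if state == "engaged" and event in DISENGAGE_EVENTS:
        return "manual"
    return state

assert next_state("engaged", "brake_tap") == "manual"     # exit ramp
assert next_state("engaged", "lane_update") == "engaged"  # normal op
```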

People that are worried about Autopilot misuse have obviously never paid attention to their fellow drivers while they are driving around. 🙂

I’m usually too busy texting to pay attention to my fellow drivers while I’m driving around. 😀

I think all cars should have it.

From that Road & Track video, it looks as if the Autopilot was following the curb as it sweeps away from the road, so yeah, another thing they need to address. I am not particularly concerned that it’s not ready to completely take over driving the vehicle, though they certainly need to work on the glitches.

Exaggerate much?

(Quoting from the article:)

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Today my car received the anticipated version 7 software update and I was anxious to try it out near my home. After several seconds of successful hands-free driving, I admit I started to ignore the warning to keep my hands on the wheel… My car was tracking the car in front of me to successfully steer, but I was going slower and that car eventually drifted far ahead of me. Shortly after that, another car was coming in my car’s direction from the opposite side of the road. I post this in the hopes that it will prevent any losses to anyone using AutoPilot in similar circumstances and in the sincere hope that Tesla can address the issue as soon as possible…

~~~~~~~~~~~~~~~~~~~~~~~~~~~ [end quote]

So, not only did this idiot Model S owner ignore the warning about keeping your hands on the wheel, he also ignored the much more important warning that Autopilot is only intended to be used on divided highways, where no oncoming traffic should ever be encountered.

What’s really dangerous about this behavior isn’t the idiot endangering his car and his own life. What’s really dangerous is that he was so…

Did Tesla get that software from Cyberdyne? Or Skynet? LOL

Very cool Auto Lane Change.

I don’t want my family to die because of Tesla owners who are using this new technology without understanding its limitations, so we are staying away from Tesla cars on the roads as much as possible.

I think this one is very risky. We cannot foresee things that happen on the road; hence, if I were to choose, I would still pick a manually driven car.