Tesla Model X Veers, Crashes Through Guardrail – Autopilot Blamed (Update: Tesla Statement)


Autopilot is again being indicated as potentially at fault for a wreck involving a Tesla Model X that reportedly veered off the road, hit a guardrail held up by wooden stakes, then plowed through a fence before coming to a stop in an open field.

However, Autopilot was misused in this situation, as the system is not designed for non-divided, two-lane roads like the one pictured below where the wreck occurred.

Street View Image of Where Tesla Model X Crash Occurred


As forum member Eresan on Tesla Motors Club Forum explains:

“Both 2 people on car survived. It was late at night, Autopilot did not detect a wood stake on the road, hit more than 20 wood stakes, tire on front passenger side and lights flyed away. The speed limit is 55, he was driving 60 on autopilot. His car is completely destroyed. The place he had accident does not have cellphone signal, it is 100 miles from the hotel. We are on a 50 people Wechat messenger group. I woke up today saw he managed to get internet to ask people in the Wechat group to call Tesla for assistant.”

The driver (Mr. Pang) told CNN he wasn’t actually sure whether it was the car’s fault or his. The news organization said the driver was “eager to talk to Tesla and learn why the car swerved off a narrow Montana road” and that “Pang said he did not receive any warning from the car that he was in danger and needed to act, adding that the warnings from his car were in English, and that he speaks Mandarin.”

The incident resulted in a traffic citation being issued to the driver for careless driving following the accident.

Tesla Statement on the incident (via Electrek):

“This vehicle was being driven along an undivided mountain road shortly after midnight with autosteer enabled. The data suggests that the driver’s hands were not on the steering wheel, as no force was detected on the steering wheel for over 2 minutes after autosteer was engaged (even a very small amount of force, such as one hand resting on the wheel, will be detected). This is contrary to the terms of use that are agreed to when enabling the feature and the notification presented in the instrument cluster each time it is activated.
As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel. He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway.
Autosteer, which is enabled via the Driver Assistance tab under Settings, is best suited either for highways with a center divider or any road while in slow-moving traffic. We specifically advise against its use at high speeds on undivided roads.
That said, provided the driver remains alert, it will still be safer than a person driving unaided, as people are sometimes distracted or may become unable to operate the vehicle, due to falling sleep, fainting or experiencing a medical emergency. After either high lateral acceleration from a sharp corner is detected or there is no force on the steering wheel, the vehicle gradually reduces speed, stops and turns on the emergency lights.”

Moral of the story: don’t use Autopilot where and how Tesla says you shouldn’t. The system is not designed for use in this situation and therefore can’t be blamed for causing the crash.

Sources: Tesla Motors Club Forum, Electrek, CNN

Categories: Crashed EVs, Tesla



58 Comments on "Tesla Model X Veers, Crashes Through Guardrail – Autopilot Blamed (Update: Tesla Statement)"


Every time I read one of these stories I find myself repeatedly in disbelief at the staggering level of stupidity.

These people must have the intellectual capacity of a boiled potato?

Keep your hands on the wheel for the next decade at least!

Well said, Alan. Some people do drive their cars like the total f00ls that they are…

It was late at night too. What do you bet there was some nodding off behind the wheel happening.

“The difference between genius and stupidity is that genius has its limits.”
—Albert Einstein

The driver had the brim of his baseball cap covering his eyes as shown in the first pic. That might be a contributory factor in this accident. 😉

He might also be sleeping.

That guy might also just be someone from the tow truck crew.

“Intellectual capacity of a boiled potato.” Nah, more like hubris. Bet it’s been pierced by Pangs of regret by now….

Why doesn’t Autopilot check its GPS location and refuse to turn on when it’s on a road where Tesla says it isn’t safe to operate?

If we’re going nanny driving, let’s go the full monty and not just stop with backseat nagging….

Good point!

Agree – if people are going to be stupid – the software needs to adapt and not let it engage.


It doesn’t even need GPS for that, it can probably tell just by looking at the road that it’s missing some of the pieces it requires for safe operation…

That post is a great example of how so many people are vastly overestimating the ability of Tesla Autopilot at this early stage of development.

As a reminder, even after some updates, Tesla Autopilot/AutoSteer still sometimes mistakes a freeway off-ramp for the main road.

So no, Autopilot/AutoSteer absolutely cannot tell the difference between a divided highway and an undivided one, just by “looking”. It’s nowhere near that capable in this early stage of development.

Alan, maybe you could say something about Tesla launching a beta program that is resulting in accidents that otherwise would not have occurred, because Tesla doesn’t seem to realize that some of the drivers participating in its beta program have the intellectual capacity of typical consumers who don’t read the fine print.

Vehicles with Autopilot will ALL be beta programs for the next decade and should be treated as such; you don’t need to be a rocket scientist or read the small print to work that one out!

Caveat emptor!

Unless of course your sport of choice is Russian roulette!

There’s more than fine print. The car will constantly alert you if you aren’t keeping your hands on the wheel.

Cleaner, we don’t have any data to conclude AP has taken lives yet, and AFAIK none have been taken because of AP. Some have lost their lives while using AP outside of the allowed parameters, so it can’t be blamed for those. But until all cars go autonomous, it will happen, no doubt about it. I am certain more lives are saved because of AP, but there is no way to gather data about that. Until we have statistics on accidents occurring before AP and after, no conclusions are possible. But I’m willing to bet that the data will favor AP when it is used inside the allowed parameters. History will tell.

Key difference between USA and rest of the world:

USA: The car is to blame because it doesn’t nanny the driver the entire time by force-disabling itself where appropriate. The car is to blame because it’s experimental software and it’s not yet ready for the sheer stupidity of some drivers. Tesla shouldn’t have released it, it’s dangerous and puts innocent people at risk.

Rest of world: The driver uses autopilot at their own risk. Ultimately they are responsible for the vehicle and held accountable if they crash. The driver must obey all warnings and only use the system as directed. If they fail to do so and someone gets hurt, the fault lies with the driver for being an idiot.


According to Musk we will have fully autonomous cars in 2-3 years…. yeah, no!

What if Tesla and SpaceX go ahead in the next months with the launch of the 4,000+ mini satellites for better GPS localization? That could change things in a drastic way. I guess we will know soon enough. But for 8 years now, I’ve learned not to doubt Musk’s affirmations. The timelines, not so much.
I can’t wait for part 2 of the secret.

GPS is only good for 2-3 m accuracy at best; normally it’s more like 7-20 m. You need 10-20 cm location accuracy for autonomous driving.

This is due to satellite timing uncertainty and cannot be improved with more satellites.

I know what the system is allegedly ‘designed’ for… but I also know how the system works. I am beginning to think a certification is required, or at least an online test.

The system wants to SEEEE… PAINT! And it requires CONSTANT SPEED. If you can’t SEEEE paint, the car can’t either. And many roads are inappropriate for constant speed. Twisty turny roads ARE OUT. And high-speed highways are banked to accommodate higher speeds in the corners.

There are a couple of SECTIONS of my local roads that I know are safe for Autopilot, but I have analyzed them only after I have driven them… and since Autopilot is ‘learning’ what roads are safe, with good paint, etc., it now appears that Tesla’s best approach is to analyze our driving and start approving roads only after evaluation.

When you look at your screen and don’t see nice solid lines, this should be telling Tesla not to approve a road for Autopilot use. You don’t have to actually turn on Autopilot to see if the road is good for Autopilot; your instrument display is already telling you… but I figured that out on my own and I am far more analytical than the average…

The pictures weren’t originally showing when I made this comment…now I SEEEE the lousy paint job on the sides of the road, with grass growing right up next to those lousy lines. I have learned that autopilot visualization on this kind of road is awful (again, not while actually using autopilot- I just keep looking at the lines on the screen). Good autopilot usage requires a shoulder with a well demarcated side line. I do believe Tesla should be able to fine tune its software…and it had better, to avoid class action lawsuits from people like this idiot driver.

“And many roads are inappropriate for constant speed. Twisty turny roads ARE OUT.”

- There’s already been a “big brother” update to slow cruise speed based on steering angle. It applies with Autosteer off, as well.
- Twisty turny roads are also a problem when you crest a hill during a rightward curve. The car no longer SEEEs paint, and can end up tracking the car in front of you. The only problem is it could be the one coming toward you. This has happened to me a couple of times, when I knew I was “risking” a two-lane.

I remember a driving instructor once taking my steering wheel, at ~60 mph, from the passenger seat. That’s how quickly you have to take over, even if your hands are on the wheel, with an amount of torque that disengages AP, while not steering the car into a worse situation. I’m making it sound harder than it is, but some situations can require more precision.

We’re trusted to gauge our use of a Tesla for this purpose, but explore the grip of its tires and you get SHUT DOWN. Seems crazy, to me.

Maybe InsideEVs should have a separate section at the top of their website labeled “Tesla Accidents”?

Or a “Tesla You’ve Been Framed” section where the owners of wrecked vehicles get $250 for every video showing their cars disappearing off the road and getting written off?

I can’t identify the logic of every choice AP has made in my hands.

If Tesla doesn’t respond right, AP will be shackled to a stop&go automation, on barrier highways only, and auto-steer will shut down above 30mph. It’s all just a flash away.

From what little I know / have read about Tesla’s AP …

I would think Tesla is going to have to implement some sort of training program for “unrestricted beta testers”. I suppose there’s a host of complications and legalities that would go along with this, so I can understand why they would want to avoid it. … but….

Otherwise, I think they may have to restrict AP usage substantially, and all the fantastic beta gathering info will be eliminated.

Can you imagine what would have happened if this Tesla Model X had hit someone head-on on a two-lane blacktop and killed someone?

I’m really thinking that a lot of these people putting the car in auto drive are falling asleep behind the wheel or watching movies or texting and not paying attention to the road.

How long before a pedestrian gets taken out?


Alan, How long? About 8 minutes.

Cars without any sort of autopilot features injure pedestrians once every 8 minutes somewhere in the United States:

“in 2014, there were 65,000 reported pedestrian injuries; nearly one injury every 8 minutes”
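That rate can be sanity-checked with a couple of lines of arithmetic (a sketch; the 65,000 injuries-per-year figure comes straight from the quote above):

```python
# Sanity check: 65,000 reported pedestrian injuries in a year
# works out to roughly one injury every 8 minutes.
injuries_per_year = 65_000
minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a non-leap year

minutes_per_injury = minutes_per_year / injuries_per_year
print(f"One injury roughly every {minutes_per_injury:.1f} minutes")  # ~8.1
```

So the “nearly one every 8 minutes” framing checks out.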

Do you actually have any data suggesting that autopilot CAUSES pedestrian accidents where they wouldn’t have happened with a standard car?

It constantly amazes me how people can get so verklempt at the idea of people being injured in accidents, but only when an EV is involved, while for decades they have accepted it as normal that gas cars get into accidents constantly.

I find it quite disheartening that what should be obvious to everyone, has to be pointed out so often.

It’s shocking how many posts on the Internet — not just on InsideEVs — show a distinct lack of critical thinking.

Perhaps you could regale us with your critical thinking: exactly who was suggesting that pedestrian accidents don’t occur regardless?

All accidents would be preventable if humans were paying attention in the first place or were not speeding; that has nothing to do with EVs. But it certainly does have a lot to do with beta testing Autopilot and trusting a technology which is, at the very least, still in its infancy!

Other car manufacturers understand that their drivers are stupid, Tesla should too.

Money does not = brains

I’m not a lawyer, but doesn’t tort law require a company to assume their customers are “stupid”?

I’m pretty sure you could train a chimpanzee to sit on your lap and steer. He’d probably handle most conditions OK but you wouldn’t want to trust your life to him.

Tesla should call their autopilot “Trained Chimp Mode”.

As I recall a wiseacre in Los Angeles let his chimp steer his convertible down an expressway and took pictures of it. This was perhaps 50 or 60 years ago!

I’d personally never trust a chimp, especially a male one. Any animal that tears your testicles off in a fight shouldn’t be in the car.

IMO Autopilot is a good system with a bad name. If it were called ‘advanced cruise control’, none of these stories would get any oxygen. The Model S and X are not self-driving cars; they are just regular luxury cars in this respect.

Tesla is going to get themselves into a huge class action suit for releasing Autopilot before it was ready. Hope it doesn’t bankrupt them.

They’re damned if they turn it off, too.

The pressure is getting higher, but imagine if someone gets killed outside the car, while AP is engaged? It’s possible to win a Darwin, by running in front of a Tesla. Everybody will assume “AP did it”.

This is far from over.

In the article, Eric Loveday said: “However, Autopilot was misused in this situation, as the system is not designed for non-divided, two-lane roads like that one pictured below where the wreck occurred.” I disagree. Tesla is sending out more mixed messages.

In the Autopilot v7.1 release notes, Tesla specifically permits Autopilot when driving on residential roads and roads without a center divider, but puts in a new safety restriction that automatically limits the Model S’s speed to 5 mph over the speed limit (apparently by using GPS data). TRANSLATION: Tesla says it’s OK to use Autopilot on residential roads and roads without a center divider, but no faster than 5 mph over the speed limit.

Tesla could have just as easily put in a safety restriction that completely prohibited Autopilot when traveling on anything but divided, limited-access roads, but Tesla didn’t do that. Instead, Tesla chose to limit the speed Autopilot allowed when on a non-divided highway to 5 mph over the posted speed limit.

In the v7.1 release notes, Tesla even touted that it improved Autopilot v7.1 to “keep Model S in its current lane when lane marking are faded.” That improved ability could be interpreted as the reason…
sven said: “In the v7.1 Release Notes, Tesla even touted that it improved Autopilot v7.1 to ‘keep Model S in its current lane when lane marking are faded.’ That improved ability could be interpreted as the reason why Tesla now permitted Autopilot on non-divided roads…”

Well, you could choose to interpret it as Tesla claiming their cars will fly and do fine when driven underwater, but that’s absolutely not what it says, or even remotely close to it. Tesla Autopilot/AutoSteer has trouble seeing the edges of the lanes (or the road) when the painted lines are faded. That is certainly applicable to driving where AutoSteer is intended to be used, which is only on divided highways.

You know, sven, it’s really too bad that you had to go so far out in your claims that here you are, as they say, not even wrong. You have some real facts here to use to bash Tesla, which quite obviously is your agenda. But just like all perpetual Tesla bashers, you have become incapable of restricting your bashing to the truth. You always have to go beyond it, to half-truths and outright B.S. It’s like you are no longer even capable of…

Bite me.

I predict AP will be disabled till Tesla adds more warning prompts…

“WARNING: You assume all responsibility for the vehicle and anything else it damages. Do you agree?”

“WARNING: Tesla is not responsible for any damages incurred due to driver negligence and understanding this is BETA test software. Do you agree?”

“WARNING: All actions will be recorded to prove your Monkey Azz did not follow all, if any, instructions.”

Will never understand why anyone would want their car to drive itself

Stupidest thing ever.

Yes, why would I want to relax with a good book when I could be gripping the wheel and staring at the horizon for hours on end? And why relax at home while my car drives itself to the shop for maintenance when I could sit in a hard, greasy, plastic chair perusing old Motor Trend magazines?

I think Tesla allows Autopilot on undivided highways because it wants the data. They admitted they need a billion miles on Autopilot to get out of beta. This is a GOOD thing in that it is a convenience for the driver and allows Autopilot to get better faster, so that it is indisputably safer than most drivers. In the end more lives will be saved through faster improvement, and Tesla can be the first driverless version of Uber while beating Apple and Google. It may be a bad thing in that over-reliance on Autopilot to the point of inattentiveness may increase the chance of death or serious injury in the short run.

Some liability-reducing improvements would be to:

1) Keep allowing Autopilot as-is on divided, limited-access highways (those with on and off ramps), but improve detection of stalled vehicles.

2) Allow Autopilot on other divided highways with the restriction that hands-on-the-wheel checks are more frequent, plus added checks when approaching an intersection.

3) On undivided highways, only allow lane keep assist that nudges the car back to the center of the lane, with a warning when about to cross…

TL;DR: “You’re holding it wrong.” –Tesla

TL;DR: “You’re NOT holding it.” –Tesla

There is an absolutely huge case of observational bias going on, with all these articles on accidents happening with Tesla Autopilot engaged. There are very few if any articles about how Autopilot saved someone from having an accident… or even saved their life.

It seems to be getting as bad as when there were three Tesla Model S fires, all within a few months. That gave a lot of people the absolutely wrong impression that BEVs have a greater fire hazard than gasmobiles, when the reverse is true.

I’m not saying there shouldn’t be media coverage of accidents in Tesla cars, but there should be some effort to put such stories into perspective. Of course, the news media in general is absolutely uninterested in putting things into perspective; they’re a lot more interested in hyping things and making them seem far more dramatic (or more accurately, melodramatic) than they actually are.

Dear Elon,
I’m sorry to tell you this, but some people are just too stupid to properly use the Autopilot feature. You may want to work on determining who can and can’t use it, instead of improving its overall driving capabilities.

Never commented on all these collisions before, but you would think that insurance companies would notify these car manufacturers, Tesla in particular, that these systems are not ready for prime time. I haven’t read over the specific owner’s agreement, but I would bet at this relatively late date that there are practically no places where Tesla legally REALLY says their system is safe to use.

I do know that you cannot drag race in ANY of their vehicles without voiding the warranty, so it surprises me that InsideEVs would encourage, or constantly report on, what to me are the silly 0-60 mph times. The 1/4-mile time and speed, which ALWAYS went along with the 0-60 time in the past, is of course never mentioned, since all Teslas perform poorly at high speeds, so the unflattering time is left out. Too big a financial risk for the owner to get caught up in the hoopla, and then find he has to buy a new power train on his own dime.

I’m gloating a little bit, since I’ve always said automakers, again Tesla in particular, are pushing autonomous systems way too fast. Tesla’s experience is making other…

+1 My thoughts exactly !

Here are pics of the wooden posts that the Model X driver struck.


Here is the Google Maps street view believed to be the start of the wooden posts that the Model X driver struck (unconfirmed).


Why does AP not stop when it detects a collision? Or did it? Stories like these just don’t make sense.