Tesla Model 3 Road Trip Car Hits Barrier While Reportedly On Autopilot

MAY 26, 2018 BY STAFF

We’re not yet sure where to place blame, but it seems Autopilot, a lack of driver attention, and a possible ball bearing failure from a few earlier off-road excursions – which You You acknowledged might have serious consequences – could each be the culprit here.

The famous Tesla Model 3 Road Trip has come to a screeching halt after the car hit a barrier on a highway in Greece. It may be a while before we have any idea of the full details surrounding this latest crash. However, while You You initially appeared to blame Autopilot, he later blamed himself. Now, he’s trying to figure out whether it may have been some failure related to the suspension.

As you can see by following the story links to the Tesla Model 3 Road Trip Facebook page (above), he’s taken the car off-roading in Europe and on some rough roads. On one of the rough-road excursions, he noted that it might have serious consequences. You You also reported cracking and clacking sounds coming from the suspension on May 20, but never had the car inspected or fixed.

Again, until there’s some investigation, we won’t be able to form an opinion here. Nonetheless, as has been pointed out on numerous occasions, Autopilot is a hands-on system and the driver is expected to pay attention. In addition to this, the Model 3 is not an off-road vehicle, and if you hear irregular sounds coming from any car’s suspension, it’s critical to get them looked at and taken care of promptly.

Related – We Hitched A Ride With The Tesla Model 3 Road Trip

Driver You You Xue, whom we caught up with at an earlier date, has released this statement on the matter:

Statement regarding collision

Re: Model 3 crash on Autopilot

26 May 2018

FLORINA, GREECE

Thank you everyone for your kind wishes and messages of support following the collision late yesterday night. This is an absolutely devastating loss for me and brings a great journey to a sudden end.

I was driving southbound on highway E65 near the city of Florina, Greece. I was headed towards Kozani, Greece, where I planned to charge and spend the night. At this time, I was not tired after having 8 hours of sleep the previous night. I engaged Autopilot upon entering the highway after crossing the border between Macedonia (FYROM) and Greece. My Autopilot maximum speed was set at approximately 120 km/h, the speed limit for this highway. The highway was well-marked, well-maintained, and well-lit. The conditions were dry, and there was no traffic around me. The highway was two lanes in each direction, separated by a concrete median. The highway in my direction of travel divided at a fork, with the #2 right lane changing into the exit lane, and the #1 left lane remaining the lane for thru traffic. I was travelling in the #1 lane.

My left hand was grasping the bottom of the steering wheel during the drive, my right hand was resting on my lap. The vehicle showed no signs of difficulty following the road up until this fork. As the gore point began, approximately 8m before the crash barrier and end of the fork, my Model 3 veered suddenly and with great force to the right. I was taking a glance at the navigation on my phone, and was not paying full attention to the road. I was startled by the sudden change in direction of the car, and I attempted to apply additional grip onto the steering wheel in an attempt to correct the steering. This input was too late and although I was only a few inches from clearing the crash barrier, the front left of the vehicle near the wheel well crashed into the right edge of the barrier, resulting in severe damage.

I was not harmed in the collision, and no medical attention has been sought. I was wearing my seatbelt before and during the collision. None of the airbags deployed.

My Model 3 is not drivable as the front left wheel is completely shattered, and the axle is out of alignment. The damage is severe on the left of the front bumper, running to the front lip of the driver’s door, and is moderately severe from there to the left of the back bumper. The vehicle has been towed to a shop, and at 09:00 today, I will accompany the vehicle on a tow truck to Thessaloniki. I am towing the car there under recommendation from locals as more resources are available there, including resources to repatriate the vehicle back to the United States. I will make a decision soon as to whether or not it makes sense to bring this car back to the United States in an attempt to fix it, as the cost to repair the vehicle may substantially exceed its value after repair or salvage value.

Tesla states in an on-screen warning that both hands should be on the wheel when Autopilot is activated. Furthermore, Tesla states that drivers should be paying attention and monitoring the performance of Autopilot at all times. It is likely true that if all drivers obeyed the warnings surrounding this software, that most of the collisions we hear about in the press would never happen. Autopilot has limitations that currently can only be overcome through human intervention. For example, it cannot detect stationary objects, which explains collisions where Model S has rear-ended stationary vehicles parked on the side of the road.

By looking at my navigation and by not having both hands on the wheel, I was not paying full attention to the road while the vehicle was in Autopilot and was not following Tesla’s directions in regards to the correct use of the software. I want to make it clear that I take responsibility in regards to my actions. With that being said, I do not believe that there are many Tesla owners who, when using Autopilot, always keep both hands on the wheel and provide their undivided attention to monitoring the road and the software. This collision was directly caused by the Autopilot software seriously malfunctioning and misinterpreting the road. This collision could have happened to anyone who does not expect a car travelling at a fast speed in a straight line to suddenly and without warning, veer off course. After tens of thousands of kilometres worth of Autopilot driving without major incidents, I have learned to trust the software. Autopilot provides users with a strong sense of security and reliability as it takes you to your destination and navigates traffic on your behalf. Clearly, I had become too trusting of the software.

Autopilot is marketed as a driver assistance feature that reduces stress and improves safety. However, the vigilance required to use the software, such as keeping both hands on the wheel and constantly monitoring the system for malfunctions or abnormal behaviour, arguably requires significantly more attention than just driving the vehicle normally without use of Autopilot. Furthermore, I believe that if Autopilot even has the small potential of misreading a clearly marked gore point, and has the potential to drive into a gore point and crash into a barrier, it should not be tested in beta, on the open road, and by normal consumers. My experience is not unique as many drivers have reported similar behaviour from Autopilot, and a fatal crash involving Autopilot on a Model X may have been caused by a disturbingly similar malfunction.

Many Tesla fans will likely dismiss this as fully my fault, but I implore those who believe so to take a full step back and put themselves in my shoes, as a driver who had used this amazing software for so long, and who could not have anticipated such a sudden and violent jerk of the wheel to one direction while travelling at a fast speed. I hope that my fellow owners will be less dismissive of various incidents regarding Autopilot, and understand that the general public views these severe collisions differently from the owner community. Tesla is moving quickly into the mass market, and potential customers in that segment aren’t going to ask, “why were both of his hands not on the wheel while the car was in Autopilot?”, rather, they are going to ask “why did the car swerve into the gore point without warning?”. The autonomous driving movement as well as the Tesla community can only get stronger when we tackle these questions and resolve the issues behind them.

I love my Teslas and I have owned a Tesla since 2014. I am upset with myself for being part of the growing list of individuals who have been involved in collisions while their Tesla was on Autopilot. I strongly believe in the capability of self-driving vehicles to not only eliminate all collisions on the road but to revolutionise our society. However, malfunctions like this greatly reduce the public’s confidence in a technology that should indeed be tested and rolled out to the public as soon as it is safe for use. I do not want to cause Tesla damage to its brand or image as I wholly support its mission and I am a big supporter. I only hope that Tesla will investigate this incident to determine what went wrong with the software, and make improvements that will enhance other people’s experiences with the car.

I am very grateful to be alive after what could have easily been a fatal collision. This incident was not more severe thanks to an excellent crash barrier on the Greek highway. I want to thank everyone again for your messages of support. I am honoured to have had this opportunity to spread the EV movement not only around my country but around the world. In closing, I want to address some of my critics who have used this collision to laugh at me or to otherwise make fun of this incident. On this road trip, there have indeed been crazy posts where I push the limit of my car and I have only done these things to share my excitement about my Model 3 with others. Please understand that there was nothing out of the ordinary occurring before this collision. I was not sleeping at the wheel, I was not tired, I was not eating at the wheel (which by the way, I have not done before), no videos were being filmed – the vehicle was being operated normally. I am truly sorry and deeply regret that some of my actions have caused a bad taste in people’s mouths, I ask those people to judge my road trip thus far as a whole and not by my craziest or worst moments. I have met with over 8000 people on the road in three continents and 25 countries, and have demonstrated that not only is it possible to drive an EV across the world, it is absolutely exhilarating and brings along great adventure along the way. I have also seen the potential and power of the EV owners community, which when leveraged, can make a great difference in our world. I ask those who disapprove of my actions to reconsider their stance, and I want to see what positive things can come of this collision.

What happens next on this road trip is uncertain. I will keep everyone posted.

Tesla, too, issued a statement. It’s posted below:

“While we appreciate You You Xue’s effort to spread the word about Model 3, he was informed that Tesla does not yet have a presence in Eastern Europe and that there is no connectivity or service available for vehicles there. In addition, Model 3 has not yet been approved and homologated for driving outside of the U.S. and Canada. Although we haven’t been able to retrieve any data from the vehicle given that the accident occurred in an unsupported area, Tesla has always been clear that the driver must remain responsible for the car at all times when using Autopilot. We’re sorry to hear that this accident occurred, and we’re glad You You is safe.”

Source: Tesla Model 3 Road Trip, Electrek

Categories: Crashed EVs, Tesla



54 Comments on "Tesla Model 3 Road Trip Car Hits Barrier While Reportedly On Autopilot"


I hope he can successfully finish his trip, if or when his historic TM3 is repaired or replaced.

What a bummer. That's a big write-off for You You…

Just a few days ago he was complaining that there were "cracking and clacking sounds every time the steering is turned even 10 degrees to one side," but he did not have it looked at. That car also had front suspension damage in the past, when two wheels were destroyed in an incident while it was being demonstrated.

I’m more put out by You You’s word choice. The car either “veered”, or had “such a sudden and violent jerk of the wheel”.

He lost me with that last line, as I believe auto-steering torque is always limited and never “sudden”. It doesn’t “jerk” the car. Ever. Pretty sure there’s even a maximum torque spec the steering motor is limited to. Put another way, there’s always time unless someone uses poor judgement and lets AP take them through exceedingly narrow paths (like construction barriers).

So, yes, I would suspect suspension, if anything at all “sudden” and “violent” was a part of this wreck. Also, If you let autopilot drive when there’s no margin of error, you have less time to react to a “veer”. So, don’t use it @60mph, when ~1.5 feet from objects, barriers, etc.

I was giving someone an AP demonstration the other day, and found myself explaining, amid these accidents, that Tesla is perhaps putting too much judgement in its customers' hands.

Any highly experienced driver would understand Tesla's modus operandi, or "MO", if they knew that on one hand the company won't let its drivers go anywhere near a power-slide (TC off), or past even the slightest point of wheel slip, yet will let them choose whether to have the software thread progressively narrower needles at higher and higher speeds. Does anyone have a good idea of what a reasonable margin is for the maximum torque they can expect from AP?

This is a true safety dichotomy and shows, to me at least, the lengths this company is willing to go to define how it thinks cars should be used. They knew a massive safety phase was coming, and have a pretty good fence up for those who aren't figuring things out (assuming it wasn't a suspension failure, or a mysterious high-torque steering motor, in You You's TM3).

It may have been a radical change in surface, or maybe just basic radial pull from the tire.

Especially if the new tires were mounted in the front instead of rotating them to the rear and mounting the rears on the front.

This knee-jerk reaction by the Tesla community to blame the driver in circumstances similar to this one is disturbing to me. I will add that I myself am a huge fan of Tesla and have been since 2006. Failing to address the fact that Autopilot is THE cause of this accident is a mistake and is actually not helpful to Tesla in the long run. If you don't believe that Autopilot caused this accident, then ask yourself if it would have happened if Autopilot had been disengaged while the driver was momentarily distracted. I understand that Autopilot saves the driver from collisions in many other circumstances where the driver is distracted, but I'm talking about this particular instance. I also agree with the driver in saying that Tesla should stop simultaneously stating that Autopilot reduces driver fatigue while requiring that the driver constantly monitor every Autopilot action, up to and including correcting a sudden and catastrophic swerve into a barrier. That to me is the opposite of relaxing. Just think about the mentality you would have to maintain: I better closely monitor my Autopilot because eventually, if I drive long enough, it could suddenly decide to send me into…

True, but if it was a suspension breakage, given that You You has publicly shared his abuse of the car on numerous occasions, including off-roading, and spoke about suspension issues and noises days before this happened, it leaves a lot up in the air for speculation. With most of these recent crashes, it's still all hearsay and we won't have facts from investigations for a long time. We almost hesitate to publish the stories and have avoided the last few, since we just don't have the facts. Be prepared for another huge wave of speculation all over social media. Until there are concrete details on all of these situations, it's very hard to report fault as if we really know what happened for sure. Thanks for your comment.

If it was a suspension issue, then it was quite a coincidence to happen right there at a fork in the road.

At this point, I’m inclined to believe the autopilot was at fault until proven otherwise.

With so much attention on each crash, it's hard to get a grip on the scale of the problem. There are now hundreds of thousands of Teslas on the road, and each one of them is being watched closely to report any problem. The hundreds of millions of other cars can crash at will and stay completely under the radar.

Agreed.

Just saw his "off-road" video and it's quite tame (really just a very poor road, and he's not driving fast), so if that caused a suspension failure, then the Model 3 has even bigger design issues. Having said that, the simplest explanation is usually the right one, and with several examples of Autopilot having issues with exits, I'm inclined to go with that.

BTW, after some sleep, You You is saying in his latest Facebook entry that "Autopilot caused and was the immediate cause of the collision, there is zero doubt about that. Please refer to my statement."

He had an accident that required two new wheels and parts to be replaced in the front end before that.

Ahh. Did Tesla do the repairs or 3rd party? My search kung-fu is weak.

It was a little from column A and a little from column B. Tesla replaced some of the parts, and a tire and wheel store did some of the others.

Found this entry from May 20th:

“A TON of noise now coming from the suspension, cracking and clacking sounds every time the steering is turned even 10 degrees to one side. But, no support from service centres so thoughts and prayers only for now. Hope the car doesn’t fall apart!”

Based on that post, it seems the car was not safe to drive and You You chose to drive it anyway.

Waiting for Elon to cry about the “holier than thou” media now going after those ad revenue clicks. Lol

Remember, the off-roading video is just what was recorded and what You You chose to upload for others to see. We don't know how much other off-roading he did, how fast he was traveling when doing so, or what happened during the misuse of the vehicle. But what we do know is that after off-roading he noticed damage to the suspension, then traveled the highway without having it inspected beforehand. Also, remember that EVERY automaker makes vehicles for different markets with different specifications, meaning that a US-bound sedan built for more even roadways would not be made to the same 'durability' specification as one for a country with much rougher roads. You You took a US-designed vehicle and drove it extensively on roadways where it was not intended to be driven. This should invalidate his warranty as well. ___________ From the Model 3 Limited Warranty: "This New Vehicle Limited Warranty does not cover any vehicle damage or malfunction directly or indirectly caused by, due to or resulting from normal wear or deterioration, abuse, misuse, negligence, accident, improper maintenance, operation, storage or transport, including, but not limited to, any of the following: • Failure to take the vehicle to,…

There are plenty of areas in the US with absolutely terrible roads. I highly doubt Model 3s destined for Europe are built to some higher durability standard.

I don’t know if that’s boilerplate language that’s in every car warranty, but just about every car I’ve ever owned was driven at some point on a dirt road (not commonly but sometimes it’s necessary). I would not expect my warranty to be voided just because I drove once or twice on a dirt road. If the road had some huge hole in it and the car got specifically damaged from that I wouldn’t expect the warranty to apply, of course.

Driving a few hundred meters on a gravel road is not exactly "off-roading". You can get any 15,000-euro Yaris or Polo and it will cope just fine with tens of thousands of kilometers of gravel without falling apart.

Driving with suspension noises is just stupid, though. A locked bearing or failed ball joint may send you off the road, and it can be easily diagnosed by any mechanic without Tesla-specific experience. Relying on Autopilot is another stupidity, but that is what this cult is all about, after all. It crashes and burns its disciples all the time, but they will continue their heroic effort and sacrifice themselves to the Gods of Autopilot and the Bright Future as foretold by Saint Elon. Amen.

+1

Congrats on the Tesla, Joe. Before you "blame autopilot", I am hopeful you might have a different take as the miles go by and your own judgement with the system forms. Assuming you have AP, you might appreciate just how predictable (good and bad) AP can be. I've had it from the beginning, and off-ramps are an issue. Updates, and miles of data, made that better, but know that your car's cameras are constantly looking for painted lines. If you see them fading, *or dividing*, expect that without painted lines your car's auto-steering may get confused. Time for vigilance. It's really not that hard to predict, but it's sad people aren't thinking about the raw, limited abilities of a camera (or a few), a radar unit, and what little the parking sensors actually do. The updates and "geo" data frankly scare me a bit, as my understanding is Tesla's database effectively knows historic car paths and, to some extent, may be "re-painting" lines beneath snow/rain (not sure how far that last part goes, but the point is that extra variable inputs don't inspire predictability – just prediction). AP pays for itself in traffic congestion, and Tesla has the greatest system, full stop, for this application. Go beyond that, at…

Audi’s new Traffic Jam Pilot sounds better for real congestion, i.e. below 60 km/h. It's coming in the 2019 A8, except in the US, where it's disabled due to a lack of coherent regulation.

Traffic Jam Pilot is the first Level 3 system on a consumer car and the first to use LIDAR. We'll see how well it does in the real world this fall.

Personally I agree with you that the current “autopilot” implementation is not worth a damn. It doesn’t matter how well the system works unless it’s good enough for fully autonomous use. Having to try to maintain full concentration for the .01% of the time the system fails is incredibly difficult. The fact the system works nearly all of the time lulls people into complacency and boredom, making it impossible to actually maintain full awareness. It’s overall a bad combination.

It’s kind of like the “uncanny valley” of driving assistance: too good, but not good enough.

From the few times I've tried it, Autopilot does provide a false sense of security. It benefits the driver with the feeling of autonomy and as a result detaches the driver from the road. The only saving grace is that it'll either improve over time or be further limited in the short term to maintain driver awareness.

Yes, many people have said that, and You You speaks to that in detail. However, if the car veered because he destroyed the suspension, then this is actually a whole different story. Sadly, we don't and won't really know for some time.

While it's possible the suspension coincidentally failed at exactly the point where the lanes forked, it is much more likely Autopilot became confused by the diverging lanes. A damaged suspension may have contributed to the inability to correct, but it is very unlikely to be the primary cause.

It may be both. AP could have attempted a “quick” maneuver which could have caused an already damaged part to fail.

I’m sorry this happened to you, You You. I had already been convinced by recent events that Autopilot was not performing adequately. This just confirms that impression.

Not clear what he’s saying.
Is he saying that just before the barrier Autopilot decided to take the exit, and went from the left lane to the right, and hit the barrier?

Sounds more like a very sudden veer. If Autopilot was making an attempt to change lanes, it may or may not have been as aggressive as he describes. That's why he's thinking it may have been a suspension breakage at this point, rather than Autopilot suddenly jerking the wheel. If that's the case, whether he or Autopilot was in control wouldn't have much bearing. But again, as usual, this is all speculation and hearsay, as we really know nothing for sure. That's the problem with these crashes.

Well said.

And negative people give you a thumbs down. Good Lord!

“Tesla does not yet have a presence in Eastern Europe”
Well, what are they waiting for? 🙂

They are waiting for Eastern Europeans to get richer to afford Tesla cars! 😉

Did anyone expect anything else? Next

What, another AP-induced accident? Totally expected it. The next one unfortunately will happen sooner rather than later, until someone does something about AP.

Doesn’t see stationary objects…

Good Lord!

You want the car to panic-brake every time it sees a rest stop sign on the side of the road, or an overhead freeway sign at the top or bottom of a hill?

Nope, but I do want it to see a guard rail or a school bus.

1. Car companies have turned AV into an arms race, leading to some dodgy claims by the manufacturers.

2. People without sufficient technical background will read the claims [1] and grossly overestimate what the cars can do today.

3. [1] and [2] will combine to cause an ongoing stream of these incidents, some with horrific human cost, until the technology advances greatly beyond where it is today and/or consumers become far more intelligent about their use of smart cars.

4. Improving the technology until it can do what unsophisticated drivers think it can already do is an exceedingly daunting task. Without heavily instrumenting roads, it will require true AI-level code in every car. Don't expect that to happen any time soon. I speak as a very longtime programmer who is terrified by the prospect of cars self-navigating in the real-world conditions that I personally encounter very often.

5. My guess is that we’ll see mixed-mode AV use for a long time, as in: You get in your car in your garage, you drive it to a local, specially designated and instrumented highway, and turn over control to the AP function. When you approach your exit you re-take control.

Any human made system, which assumes we are rational actors, is certain to crash. 🙂

Here we go, let the naysayers and haters begin, again…

There goes my test drive 🙁 too bad You You never made it to Vienna, Austria…

Good luck getting your car repaired. Looks like this car will need the same driver’s side rocker panel molding piece I’ve been waiting 6 weeks for Tesla to deliver.

I think it would drive fine without the molding…. He just needs enough parts to get it to drive, and rough out the rest. Then fix it in the US. If it is worth fixing.

Is the yellow color a tape job?

It’s simply bad lighting

“As you can see by following the story links to the Tesla Model 3 Road Trip Facebook page (above), he’s taken the car off-roading in Europe and on some rough roads. On one of the rough-road excursions, he noted that it might have serious consequences. You You also reported cracking and clacking sounds coming from the suspension on May 20, but never had the car inspected or fixed.” It sounds like misuse of the vehicle (off-roading), actually recorded on video, causing damage the owner noticed, and not having it inspected or repaired, is the cause of a failure at the left tire or axle. AutoPilot does not 'jerk' the steering wheel with more 'effort' than a driver's hands can counter, since the driver can always take control of the vehicle. Even without a Tesla service center in the area, the car could have been looked at by any competent mechanic to inspect the suspension before traveling at highway speeds, since he 'knew' that he had damaged/broken something and had a problem. He did take responsibility, and rightfully so, by admitting he was not paying attention, but he lost credibility through the monologue about his personal feelings about AutoPilot in an attempt to put blame…

I feel sorry for him. The car looks close to a total loss, and it may be complicated to get it repaired in Greece. Tesla does not sell the Model 3 in Europe yet, so it could be hard to get spare parts and repair services.

Parts would have to be shipped from the US, something that owners of exotic American-made cars in Europe are used to doing, like someone who has imported a classic US muscle car or truck. It's the same for people who import exotic European cars to the US; they're used to ordering parts from Europe.

No… do NOT write that Autopilot could be the culprit here.

The driver is taking way too many risks and is basically bringing Tesla a bad reputation.

1) falling asleep at the wheel multiple times?
2) driving at night with headlights turned off to see what Autopilot will do?
3) driving at 130mph at night, around other cars on the road, while holding a cellphone in one hand to film it?
4) driving with suspension/steering issues already noted?
5) driving (with Autopilot) in an area of Eastern Europe where Tesla had already told him directly "do not use Autopilot there"?
6) driving without proper insurance just to be able to do a publicity drive, for which he is receiving no pay and which Tesla has not asked him to carry out?

He is obviously off his rocker. No normal person has the money or time to do this. He is some sort of wealthy person who thinks (perhaps) he is doing a good thing, when in fact, he is bringing ill-repute to Tesla. Needs to STOP RIGHT NOW.

Is this true – “Autopilot has limitations that currently can only be overcome through human intervention. For example, it cannot detect stationary objects,”?

That seems ridiculous to me. It should be one of the first things any autonomous driving system does… not run into objects in front of the vehicle.

It isn’t autonomous.

Ho hum. Just heard on Automotive News that another Tesla with AP on T-boned a police SUV. These are happening with such regularity that they're not really a news item any longer – but the other point is, when will Tesla fix this? Seems like these cars are dangerous. I sure wouldn't want one behind me.