Watch Tesla Autopilot Cross Double Yellow, Drive Wrong Way


With Tesla Autopilot, it still seems to be a mix of the good and the bad.

This is to be expected, since Tesla continues to assert that its semi-autonomous driving system is a “hands-on” technology. CEO Elon Musk recently admitted during his interview on CBS This Morning that Autopilot will never be perfect, even as he proceeded to drive hands-free. But he remained in control and aware of the car and his surroundings, and he didn’t take a rest in the passenger seat.

Watch This: Tesla Autopilot Deals With Merging Semi With Ease

Every time we share one of these videos, opinions pour in from both sides.

This comes as no surprise to us, since some Autopilot footage shows strange behavior that may be attributable to the road, the lane lines, blinding sunlight, the actions of other drivers, and so on. Other times, it just seems “off.” Then there are times when it appears to perform some miraculous save. The bottom line is that these systems are just not consistent … yet.

Yes, Autopilot still has strange quirks and should not be trusted. This is why you should always pay attention when using this new technology.

Yes, Autopilot sometimes gives warnings or makes corrections that could save lives. However, there’s no way for us to know for sure what would have happened if the system hadn’t warned or engaged, and we can’t just hope it will work in our favor every time.

With any new technology, you can’t really assume anything. Much of getting used to what it will do and how it works is through trial and error. Autopilot is not something we want to become familiar with through trial and error unless we are diligent throughout the process.

An error on your new mobile phone or computer may be irritating, but you’ll get through it. An error in a car with a self-driving system can lead to severe consequences. Needless to say, as Autopilot and other semi-autonomous technologies improve, there’s really no doubt that they should save lives. However, there will always be exceptions.

Exceptions like the one shown in the above video can lead to devastating situations. Nonetheless, one of the first questions asked will be whether or not the driver was paying attention and allowed the system to perform such an unsafe maneuver. It surely is a difficult situation as companies work to push this type of tech into the mainstream.


103 Comments on "Watch Tesla Autopilot Cross Double Yellow, Drive Wrong Way"


Looks good. Ready for production.


A lot better than anything else, and useful if you understand hill crests take painted lines out of view. It’s a predictable outcome. I’m becoming polar on this stuff. As a driver’s assist feature, Tesla is unrivaled. I’d rather they leave it, than kill it, or kill Level 3. These arguments are heating up, against Tesla. As “Full Self Drive”, Level 5, or “autonomous”, or as named “Autopilot”, marketed as “40% safer” than a human (cough, Tesla customer), it’s poor and will be for at least another decade. 😉 People have little respect for how they, themselves can wayyyyy outdrive these systems. No, most can’t out-brake ABS, or perhaps a good AEB system, but they can read the road much better than Silicon Valley. If everyone around you follows, or cuts off, at >3 spaces, why is Tesla leaving 4 at a minimum, these days? That takes up more road space, slows traffic, adds to congestion, not to mention the “accordion effect”, all because of a programmer’s definition of “safe distance”. Tesla’s AP turns your dash into a cartoon, even when you aren’t using it. The cars are incapable of eye contact, they don’t have a clue about blind spots…

I don’t think Autopilot is meant for really curvy roads through the hills. Think mind-numbing freeway driving.

Yeah but it works a bit, so why not do it?
If Elon didn’t want people to use it on curvy roads through the hills, he would disable it there.

If you call crossing a double yellow center lane and driving on the wrong side of the road “working a bit”, I have some oceanfront property in North Dakota to sell you.

It can happen in the dark. Apart from this, it seems to work a bit on the other parts.

Agree. That is NOT a badly marked road. Those lines look new.


No Problem! But you’re advertising such reality before the rivers & lakes rise enough to connect with the Ocean! Just a Tad early!

Same as thinking that, because it is called “Autopilot”, it is at a “Got This” stage, just yet!

Or would put disclaimers in the manual,…oh wait,…

Autopilot limitations
Traffic-Aware Cruise Control is a beta feature.
Traffic-Aware Cruise Control is primarily intended for driving on dry, straight roads, such as highways and freeways. It should not be used on city streets.

Warning: Do not use Traffic-Aware Cruise Control on city streets or on roads where traffic conditions are constantly changing.

Warning: Do not use Traffic-Aware Cruise Control on winding roads with sharp curves, on icy or slippery road surfaces, or when weather conditions (such as heavy rain, snow, fog, etc.) make it inappropriate to drive at a consistent speed.

Auto steer is a beta product.

Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic. Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present.

Or how about:

“The AP driving feature is meant for entertainment purposes only. Any damage or injury to persons or property is the responsibility of the user”.

“If Elon didn’t want people use it on curvy roads through the hills, he would disable it there.”

Using, of course, “curvy road detection” ™.

Elon’s Curvy Detection is still “a work in progress”, and since some think Cadillac’s system is better, to be fair, let’s run the Cadillac over this same road and observe how its control system handles this?!

It would be disabled; you couldn’t activate it, as it is only meant for highways, etc.
This is how you ensure safety, instead of writing somewhere “please only use it on divided roads”…
Seems like Tesla wants people to mess around with it.

Yes, exactly.

It’s rather depressing to see so many people apparently confusing “This car is designed to follow lane markings where they are clear and the road isn’t too curvy” with “This car knows exactly what kind of road it’s on at all times, what the exact road layout is at every point, and where all stationary (or temporarily parked) obstacles are located in or next to the lanes.”

Tesla should disable AP on these types of roads then, don’t you think? A responsible company would anyways. Or is geofencing too complicated for Tesla? Maybe Elon can phone Mary and ask her how it’s done properly.

Maybe all the major automakers should disable ACC since I can’t safely turn it on and put my feet up on the dash all the way to work. ACC is especially not safe on curvy surface streets where the car bumper in front of you isn’t directly in front of you due to the curve. You will slam into them every time there is a sudden stop. I wonder why all these major automakers don’t geofence ACC in these locations like a responsible company would? Is it too complicated for them?

I don’t allow ANYONE to put their feet on my dash (unless I happen to be sleeping with them).

“Tesla should disable AP on these types of roads then, don’t you think? A responsible company would anyways.”

No. It would be just as foolish to argue that we should all shut off our cars’ air bags because about 20 fatal accidents are related to exploding air bags.

We are all still far safer with the air bags on than off, even though there is a very rare malfunction. And statistically speaking, we already know that Tesla cars are significantly safer with AutoSteer engaged than without it.

Only dedicated Tesla bashers (hi, Bro1999!) and neo-Luddites argue that we should stop using driver assist features which improve safety, simply because they’re not 100% perfect.

Guess what? Self-driving cars will never be 100% perfect! So the sooner people get over their irrational fear of self-driving (or semi-self-driving) cars, the better for them and for everybody else on the road.

lies, damned lies, and statistics.

I’m capable of driving on the right side of the road. So no, AP is not safer than me.

You do realize that when you drive with autopilot turned on, that you are still behind the wheel, right? So if you are capable of driving on the right side of the road without autopilot, then you still will be capable of driving on the right side of the road with autopilot too.

But that AP may very well catch something in the road that you don’t catch, and save your life regardless of your capabilities to drive on the right side of the road.

Please don’t mix up AP with dynamic safety features.

If you use the tech properly it would be as safe as you plus some.

I think it’s up to the driver to determine when they should use it. For example, most/all cars have cruise control, but should you use it everywhere and in all situations? Probably not. And should it be up to the manufacturer to enable/disable it at the appropriate times? No, I think that is asking too much. “Responsible use” is the phrase that comes to mind.

I can engage my Jaguar’s drive assistance on curvy roads, and for sure I will be off the route within a minute. Sure, Jaguar should remove this assistance for foolish people.

“I don’t think Autopilot is meant for really curvy roads though the hills. Think mind-numbing freeway driving.”

Fine, then you hit that patch of freeway where the lines are not clearly marked, and the car swerves into other traffic.

Or a concrete barrier.

Autopilot isn’t meant for non-highway roads. It’s not even meant to be used for on and off ramp (yet).

Sure I can warm up my shoes in the kitchen oven but I’d be asking for trouble. Likewise, if you use Autopilot on roads not designed for it…it won’t stop you but you’ll still be asking for trouble.

When I was a kid my dad would dry his wet magazines from the mailbox or his damp clothes in the microwave.

Tried that. Doesn’t work. Use the “no spin” option on a clothes dryer; that works like a charm.

“Autopilot isn’t meant for non-highway roads.”

Some of Tesla’s disclaimers in part do still read that way, but that is not what Tesla has been advertising for some time now. Tesla put out an AutoSteer demo video in Nov. 2016 that shows hands-free driving on city streets and country roads with two-way traffic.

That was demoing a prototype version of FSD. NOT autopilot.

Thank you for pointing that out. Mea culpa; obviously I didn’t examine my own citation carefully enough!

Point is, it should not allow you to engage it on roads it isn’t ready for. Just like what GM is doing with supercruise. It can only be engaged on geofenced mapped roads.

Why haven’t you (and others making the same argument) lambasted automakers making hundreds of models capable of driving well over 100mph without geofencing to autobahns and race tracks?

Survivable accidents become deadly at those speeds, even though most of the time you could drive that fast without incident. Just like Autopilot works most of the time.

This line of thinking is just nonsense. If we can’t trust people to use their cars safely, then the only solution is to make all cars illegal.

Thank you! The idea that any form of driver assist feature removes responsibility for safe driving from the driver, is nonsense. And unfortunately, it seems to be a symptom of how our culture is shifting from demanding personal responsibility from adults, to a culture in which people see themselves as helpless victims with no responsibility for their actions.

Personally, I don’t want to be treated as a child. Too bad that in today’s world, many people apparently want to be treated that way. 🙁

Blame falls on the user. Everyone wants to throw the responsibility somewhere else. The bottom line is as the driver, you are always the one in control of the car. Assistance features on or not.

If only gun manufactures felt the same way you do, not allowing them to be used in locations that they aren’t ready for!

This is not good.
Not a very safe Autopilot. Many people will get killed with this.
I will wait for Mercedes EV, I have more trust in them.

Many, many more people will have their lives saved by Tesla Autopilot+AutoSteer than will lose their lives because of it. In fact, that is already happening today!

Anyone who thinks this system is ready for Prime Time is someone who stays in Mommie’s basement all the time and is not legally allowed to drive any longer.

Yesterday, I forget who it was, but some safety regulator from the Federal Gov’t called MUSK and he hung up on him.

Does Musk now think he is more important than a representative of the Federal Gov’t?

I wonder if MUSK now hangs up when an IRS official calls him? I would doubt it, seeing as MUSK has too many Mansions in the States that are subject to IRS Seizure.

Why was Robert Sumwalt, the chairman of the National Transportation Safety Board, disclosing the contents of a call regarding an open investigation with the attendees of the International Society of Air Safety Investigators’ Mid-Atlantic Regional Chapter dinner?

So Sumwalt attacks Elon for publicly releasing actual relevant safety information about Tesla’s products, reminding users to follow the rules and pay attention or risk a collision, just as in cars without Autopilot when drivers fail to pay attention. But then he turns around and publicly discusses a private phone call in an open forum to a bunch of AIR safety investigators, who would have no reason to hear about an AUTOMOTIVE investigation?!

Sumwalt just sunk his last vestige of credibility.

Still has 100x more credibility than your cult leader Elon.

Every Tesla wrecked is another Tesla sold.

Plus, more parts for D-I-Y EV Conversion builds!

Hmmm, I think those down-voting this comment need to reconsider. If it’s expected that everyone wrecking their Tesla will replace it with another Tesla car, rather than buying a different make of car, then that’s a pretty strong recommendation for Tesla cars!

“Other times, it just seems “off.” Then, there are times that it appears to perform some miraculous save. The bottom line is that these systems are just not consistent … yet.”

That inconsistency is the problem. When the system is inconsistent, it is hard for drivers to establish the level of confidence that would allow them to relax or give up a certain level of control. That makes the system less trustworthy to use.

In my opinion, I would leave that option off entirely until it is ready. And I am okay with paying the $1K difference later if it takes a few years before it is ready. The return on that $5K over 5 years should easily cover the difference. Or by then, either it will be better or the competition will be better…

This system is definitely not consistent when you use it on roads where it isn’t supposed to be used. Tesla tells customers this isn’t to be used on non-divided curvy surface roads like this. It’s also not designed to let you climb out of the driver’s seat in any situation and put your feet up on the dash. Just because someone makes a video of Autopilot failing in situations where it’s never supposed to be used in the first place doesn’t mean Autopilot is a failure. It’s not perfect on freeways either, but it’s much more consistent there. That is where it is supposed to be used.

I could make endless videos of when my non-Tesla ACC failed due to a car quickly cutting in too close in front of me and my car not picking it up in time. Luckily though I’m aware that it isn’t perfect and my foot is still ready to hit the brake.

“it will be hard for drivers to establish a level of confidence that will allow them to be fully relaxed or give up certain level of controls.”


You should NEVER “give up” ANY “level of controls” with this level of driver assist!!! YOU ARE ALWAYS THE ONE IN CONTROL OF YOUR VEHICLE, whether you have autopilot or not. Just like in the FAA instructions to airplane pilots, you can never “give up” control of the vehicle. You are always responsible. If you want to give up control, wait for fully autonomous cars.

(⌐■_■) Trollnonymous

I concur!

Someone post a Willy Wonka meme with “Tell me how AP is safer than no AP again”. Lol

It is when used responsibly and correctly. I would bet everything I own that steering a car while sitting in the passenger seat using ACC is less safe than using it correctly. So is ACC safer than no ACC? The answer is, depends how you use it.

Totally agree with you there. When used responsibly and correctly, a driver will be ready to take the wheel in situations like this, and accidents will be reduced overall since there’s no longer a single point of failure (car covers for when the driver would have screwed up, driver covers for when the car would have screwed up).

The problem is that, the better the ACC, the easier it is for humans to get distracted, to trust the machine more than it deserves to be trusted. Waymo discovered this all the way back in ~2013, when they had an Autopilot-tier product in closed beta – Chris Urmson likes to tell stories about those days. Even when it was a secret, experimental, very fallible prototype – not even a product on the open market – people still fell into this psychological trap, this false sense of security. As something that anyone can buy, that tendency would only get worse.

So… if you have instructions that make a thing safe to use, but you also know pragmatically that a big chunk of people *won’t* follow those instructions, what do you do? Do you charge ahead like the aforementioned Willy Wonka and let Violet…

Talk about the Bolt. At least it’s something you know something about, as you have first hand experience with it. How’s it driving etc…Does it have any driver assist features that you use?

There is no place compared to pure OTA updates; take the time, and you’ll see what it’s really like to be (in the future). Not a world of pure imagination.

Go join a local fire dept and drink in the endless accidents caused by endless human error: drunk driving, texting and driving, fatigue and driving, eating while driving, dementia and driving, etc. Human error is very measurable. Accident prevention is very difficult to quantify because nothing of note happens or is recorded. We’re so numb to the daily accidents and carnage caused by human failures that all we notice are the exceptions to the norm, which would be the times when automation fails to perform.

Think about that for a bit, and go do some fire dept ride-a-longs if you truly want to expand your awareness.

Let’s see; the NHTSA says that Tesla cars with Autopilot+AutoSteer installed have ~40% fewer accidents than those without. Contrariwise, serial Tesla basher MadBro insinuates AutoSteer makes Tesla cars more dangerous.

Gosh, who to believe? Who to believe? 🙄

Anyone who has actually read the NHTSA report in detail would know that statistic was generated by comparing early-period Model S/X crash data with a later period, when Autopilot+AutoSteer was available. The two sample sizes were different. It could be a “correlation”: as more Model S/X entered the market, the total accident rate may have dropped overall due to the larger installed base. Of course, we really don’t know for sure either way.

A better statistic would directly compare accident rates within the same population of Model S/X, over the same time frame, between when Autopilot+AutoSteer was on vs. when it was off. That would give a better indication of whether the feature improves crash avoidance or not.
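For what it’s worth, the comparison being proposed can be sketched in a few lines. All fleet mileage and crash counts below are made up purely for illustration; they are not from the NHTSA report or any Tesla data:

```python
# Illustrative sketch (hypothetical numbers): comparing crash rates for the
# SAME fleet over the SAME period, split by whether AutoSteer was engaged,
# rather than comparing an earlier fleet against a later one.

def crash_rate(crashes: int, miles: int) -> float:
    """Crashes per million miles driven."""
    return crashes / miles * 1_000_000

# Hypothetical counts for the same cars over the same months
on_rate = crash_rate(crashes=40, miles=500_000_000)      # AutoSteer engaged
off_rate = crash_rate(crashes=110, miles=1_000_000_000)  # AutoSteer off

print(f"AutoSteer on:  {on_rate:.3f} crashes per million miles")
print(f"AutoSteer off: {off_rate:.3f} crashes per million miles")
print(f"Relative reduction: {1 - on_rate / off_rate:.0%}")
```

With these invented numbers the engaged-rate works out lower, but the point is the method: normalizing by miles driven within the same population removes the fleet-size confound described above.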

Statistics are not available for when AutoSteer was on vs. when it was off during a crash.

What you’re suggesting is that we should ignore the evidence we do have because it’s not 100% free from any possibility of selection bias. By that criterion, most surveys and scientific experiments should be ignored.

Thankfully, it is often possible to reach the correct conclusion based on incomplete evidence. In fact, if it wasn’t, daily life would be impossible. We very rarely if ever have 100% complete information about anything, yet we have to make decisions based on the evidence we do have, all the time.

You can retire that “40% safer” FUD. NHTSA backed off from that statement.

Critical thinking clearly isn’t your forte. But then, it’s not the forte of whoever wrote that article, either.

Anyone who ignores the fallacious conclusions in the article, and applies a logical analysis to the facts it reports, will realize that those facts indicate AutoSteer is even safer than the ~40% accident reduction suggests!

P.S. — “FUD” is always negative. What you’re talking about couldn’t be FUD even if it was true. But it’s not surprising that a serial FUDster would try to blur the true meaning of “FUD”.


Oh, yeah, Electrek. There’s a reliable source for information. Isn’t Fred like a Musk relative due to marrying his cousin?

Stop thinking of AP as a Driving Instructor, and think of it as a cross between a Wife Nagging you when your driving gets sloppy, and a Teenager, learning to drive!

These underage marriages have GOT to stop.

They really should deactivate it on all curvy or hilly roads.

Especially since the manual says not to use it on curvy roads, local streets, or where pedestrians may be present.

Tesla should go to a geofenced solution.

Good to go for the UK lol

(⌐■_■) Trollnonymous

I prefer no AP hardware or software. Sell me that version at lower price please.

There are plenty of 2012 – 2014 non-AP cars on Craigslist, Autotrader, and CarGurus as I type this.

What are you waiting for? Oh, and they also come equipped with ‘Meh’ Mode (Actually labeled ‘Chill’), so you won’t have to worry about any unnecessarily enjoyable acceleration.

(⌐■_■) Trollnonymous

“Oh, and they also come equipped with ‘Meh’ Mode (Actually labeled ‘Chill’)”

I like that feature but in a TM3.

You’re a good sport, I like the TM3, too. I think I may pick up a 1st Gen in a couple years as a good 2nd car to replace my Volt.

The car probably went for a UK vacation recently…be patient, it will come around.

I love how the guy starts yelling at his car in a loud voice during his narration when it goes over the double yellow line, like it’s an unruly child!! Hahaha!!!

Easy buddy.. it’s gonna be all ok..

You know how you have to pay attention when you are driving your car you own right now? And if you find yourself drifting into another lane on a tight curvy road, how it is your responsibility to correct and steer into the correct lane? That’s still your responsibility.

If you failed to pay attention and drove into the wrong lane in a car without Autopilot, nobody would blame the car. Autopilot is an ASSIST, and if you are busy fiddling with your cell phone, using it to video record your drive, and you fail to correct when you drift out of your lane, you just broke the law.

What we see here is a distracted driver potentially illegally using a cell phone while driving, who fails to control his vehicle, breaking the law and crossing the yellow line. Then the driver illegally continues on the wrong side of the road, further violating another law. The guy should get three tickets. He clearly broke the law.

(⌐■_■) Trollnonymous

He also operated AP in unsupported conditions.

Yet another dumb YouTuber future daisy pusher that didn’t read the manual.

Somewhat OT, but I picked up my Model 3 a couple days ago. I didn’t order Autopilot because I simply didn’t think it was worth $5000. If I was someone who spent a lot of time driving 100 miles/day on the freeway, I’d probably reconsider. Or if the price had been around $1000, but $5000 was too much IMO. Worst case scenario is if I really do want it later, it will cost $6k at that time.

(⌐■_■) Trollnonymous

Smart choice IMHO.
…..and congrats!


Congrats, Kdawg. You still had to pay for the premium interior, which I bet you won’t regret either. I think Autopilot pays off the most for those facing stop-and-go traffic, who can spend an hour or more commuting 15-30 miles one way.

Yes, the premium interior is something I probably would have paid for, even if I had waited for the $35K version. I probably didn’t need the $9k long range option, but it is what it is.

Congrats! Super jealous. How are you liking it so far? 😀

So far so good. Still blown away by all the technology. I posted some comparisons with my Volt over at GMVolt.com. The Volt does a couple of things better, like brake blending, but the futuristic tech of the Model 3 is simply amazing. I will be selling my Volt soon (don’t need two cars), but it will be sad to see it go. I really enjoyed my Volt for 6 years.

I am not paying $5K for that.

No thanks ‼️

It’s a $5000 toy. And yes, I paid for it.

The most dangerous part of AP (assuming it improves) is the design concept that it can beep for the driver to take over and then just release control of the car. This is based on a deep misunderstanding of people. The better AP gets, the more complacent the driver, and thus the less likely the driver will be able to take control to avoid an accident after AP “gives up.” The correct action is to brake to avoid hitting the object it is driving toward, not to try to steer around it. This matches driving class 101: never drive faster than you are prepared to stop.

I have no problem paying for a Toy.

I have a problem with my safety being in the hands of a system that cannot tell it’s on the completely wrong side of the road when it just made the illogical choice in the first place.

I have a problem with other people paying to be beta testers of a system that is potentially unsafe when misused, putting my life at risk. I never gave consent to be a test subject.

The humans driving the other cars on the road are all, every one of them, “potentially unsafe systems” who are putting your life at risk. You likely have gotten used to ignoring that fact, but it’s still a danger every time you venture out on public roads.

Why would you demand perfection from an automated system, when you don’t demand perfection from the human drivers it’s designed to replace?

“The thing to keep in mind is that self-driving cars don’t have to be perfect to change the world. They just have to be better than human beings.” — Deepak Ahuja, CFO of Tesla Inc.

Humans ARE Illogical, by Nature! This was designed by Humans! How is that strange?

It’s good people post these videos. We’ve seen all those great examples where you can sit in the passenger/back seat and let AP take over; now we are seeing great examples of where it can fail. This is where forums are excellent at disseminating information. It’s your choice if you buy a car with autonomous features and/or if you decide to use those features. Having a better understanding of where it can work well and what sort of problems can occur helps to work the systems better.

Now the really interesting thing about this video is it shows AP is really just a line tracking system; it doesn’t actually know where it is in the context of those lines. The process is really simple: you are already driving in the lane and you activate AP, it will just keep following the lines and try to keep you centred. As demonstrated, if for some reason it switched to a different lane then it doesn’t actually know it is in a different lane, it just continues to centre between those lines. It actually gives a very clear idea as to the recent Model X crash that resulted in a fire, it jumped…

“Now the really interesting thing about this video is it shows AP is really just a line tracking system, it doesn’t actually know where it is in the context of those lines.”

You mean AutoSteer, not “AP”, but yeah. That’s a very good way of putting it, thanks! I’m going to archive your comment for future use.

The general public vastly overestimates the ability of semi-self-driving systems such as Tesla Autopilot+AutoSteer.

*begin sarcasm* It is meant to weed out stupid people. Only problem is that some innocents may have to take one for the team. *end sarcasm*

Of course Autopilot is not supposed to be used on two-way roads, dirt routes and so on; that is exactly why this happened. This guy didn’t read the Autopilot user guide.

What does Tesla say when you report it?
They have a way to report it, right at that area.

Tesla specifically says to only use AutoPilot on divided highways. So, yes, it could put you in harm’s way if you don’t follow the directions. This is not news.

“It’s going the wrong way!” Seriously? People like you *really* need to be banned from public roads. AP is intended for *highway* use, not a narrow twisty country road! RTFM!

WAIT… we can see that the Youtube video (with channel intro edited onto beginning) was posted May 6th 2018.

But when was this filmed? We don’t hear the date spoken on the video and there is no caption for the date. This video looks like something I saw like a year ago.

Is the version of Autopilot quoted anywhere in the video? If it is an old version, it could be that no Tesla currently drives like this because they are all running something more modern.

The nav maps are definitely the old version… people have been enjoying the new version for a month or more.

Is it possible this is a re-post of an old problem video? One that an opportunistic Youtube channel owner would re-post to simply get views and click revenues?

This is Autopilot 2.0 or 2.5, which drives like a drunk person (left, right, centered) and brakes like hell on the highway. The first version, with Mobileye, did not do these things. Mercedes-Benz’s semi-autonomous system and the new Nissan Leaf’s are better than 2.0 or 2.5.

Here’s a suggestion for Tesla, how about just return the $3000 to all Autopilot customers until the technology matures?

Why? They knew the limitations when they bought it. I still use it on well marked interstates.

Apparently this whole pay attention when you’re behind the wheel thing is hard for some people to understand.

It looks like the narrow road is more of the issue. It’s having trouble staying centered even on the straighter sections. There seems to be some buffer it wants to keep from each side line, which it isn’t able to do.

I’m not even sure why this is news worthy! InsideEVs articles are usually pretty darn good. This is not telling the Tesla community anything we don’t already know. Of course, this guy apparently doesn’t know and doesn’t deserve his car!
I have an issue with the fact that the man says the car “should’ve known it was doing something illegal.” The car doesn’t “know” anything! Secondly, Autopilot is not meant to be used on a two-lane back road. It’s like using a Wiffle ball bat to hit a baseball. You may be able to make contact with the ball, but ultimately it’s going to end in disaster. People need to realize there’s still some time required to program Autopilot’s sensors and visual detection algorithms to make them useful in all circumstances. Please use Autopilot responsibly. Otherwise we’re going to get it taken away from all of us!