Tesla Autopilot 8.0 Versus 7.0 On Same Winding Two-Lane Road – Videos

OCT 12 2016 BY MARK KANE

Tesla Model S owner and blogger KmanAuto recently released a video of an Autopilot/Autosteer test (using firmware 8.xx) on a winding two-lane road.

While these types of roads are not what Autosteer was developed for, it is still surprising that 8.xx performs worse than 7.xx.

Tesla Motors: Wooded Road Autopilot Test Low Speed Firmware 8.xx (KmanAuto)

According to a comparison between 7.xx and 8.xx on the same road (the 8.xx video above, the older 7.xx video below), the new firmware struggles more to keep the car on track.

We assume that Tesla has focused its priorities, and its use of sensors, on improving Autopilot on major roads and highways.

The side effect of that effort, it would seem, is worse performance in other types of driving situations.

To achieve better results, Tesla will probably need to add more sensing hardware on board.

Tesla Motors: Wooded Road Autopilot Test Low Speed Firmware 8.xx
=================================================
An update to a previous test on the same road. I did this test on firmware 7.xx about 6 months ago. Now, with Firmware 8 and its claimed improvements, I did it again, and it did not fare too well. In my opinion, as the driver, it did worse than it previously did on the 7.xx firmwares.
You can see the Firmware 7.xx Attempt at https://www.youtube.com/watch?v=S-LP0…

Personally, I've found firmware 8 to be a train wreck of bugs and regressions. Anyway, judge for yourself. Conditions were fairly similar between both tests.

My Quote on the Autopilot Performance in Firmware 8:
“In general, some improvements have been made, yet also some major regressions. One major improvement is the “X-Ray” radar, which is able to see ahead of the vehicle in front of you. This seems to have sped up reaction times to sudden braking and stops, which is especially prominent in close-quarters stop-and-go traffic, or when approaching a standstill on the freeway while at speed.
However, I have noticed regressions in performance as well. On roads I travel frequently, and on which the previous Autopilot worked flawlessly (or as flawlessly as one could hope for with the current hardware suite), the vehicle is now having considerable problems keeping in its lane, despite a good, positive, hard lock on one or both lane lines. It wanders and “pings” back and forth considerably.

That said, it performs like it did previously when following behind another vehicle that it can also clearly see.

Rain performance is a no-go. I have shown some awesome Autopilot videos in downpours previously, whereas now, in much better conditions, I cannot even get a lane or vehicle lock.
Roads where lanes split are another problem (e.g., the right lane breaks away to the right while the left lane continues straight on the freeway). If I'm traveling in the break-away lane with a good, solid lock on both lines, instead of following my lane (where both lane lines bear right), the car will continue traveling straight, right through the line (and almost into a concrete barrier in some cases).

While I know Autopilot was not designed for the road in this video, it provides a safer, worst-case-scenario road on which I'm able to mostly recreate conditions (sun, weather, lines, traffic, speeds). Testing on this road, I know it will generally “fail” in its ability to fully navigate for me; however, it also allows me to show how each version of the firmware and Autopilot (and in the future, full autonomy) handles the same failure situations, and whether one version is more failure-prone in comparison to others, or has improved and can handle additional situations that previous (or possibly future) versions cannot.”

source: Teslarati

Categories: Tesla, Videos

28 Comments on "Tesla Autopilot 8.0 Versus 7.0 On Same Winding Two-Lane Road – Videos"

Holy crap, a guy blowing a .24 BAC level might be able to drive better than that!

Welcome to Autopilot, software that puts legions of “drunk” drivers on the roads, because it’s being used in ways it was never designed for, and Tesla refuses to disallow such dangerous use. Tesla even knows which roads it is unsafe on (it reduces AP speeds on these roads) yet they will not restrict it to highway-only. They’re playing games with people’s lives to collect more data. Textbook negligent design.

It’s a bad idea for that man to test that Tesla on a two-lane road like that. Due to all the sidewalks and crosswalk signs, there are a lot of pedestrians in that area, as well as a lot of traffic.

That guy should have tested that feature on a rural two-lane highway, away from houses and traffic.

Then why does Tesla still insist on calling it “autopilot”?

Wouldn’t you think something like “super intelligent cruise control” would have been more accurate than “autopilot”?

(Yes, please save me the airplane argument on the topic of autopilot).

That would be SICC!

Maybe you are right in thinking that the Tesla knows where it is (the person says it is the same road he usually drives his Tesla on), so the cars' learning would know that he was using Autopilot on an inappropriate road, or at least that he was doing so while over-confident in the car's capabilities and getting lazy, as we have seen in so many videos.
Now, Tesla may be trying to avoid more accidents on these roads by making the car ping-pong between the lines, which will keep every driver's hands on the wheel and make them very vigilant.

That seems odd, maybe he has faulty cameras? He should definitely make them aware.

Making them aware that it doesn’t work where it’s not meant to be used? I think they already know that. That’s just the answer he’d get.

Another Euro point of view

I see no problem with Autosteer not being able to do a job it was not designed for in the first place, but shouldn’t it then detect that right away, give a warning, and just shut down? Maybe it does, but this video seems to show that it is possible to force it into doing something dangerous.

I’ve noticed similar results with my Model X

The cameras pick up the lane markings, which is why I’d surmise V8’s heavier reliance on radar is allowing more side-to-side movement. We’re better off if approaching an object, I bet, but that’s at the expense of what the cameras gave us (better in-lane tracking). Tesla will figure it out.

The “hide the controls” thing. Now, that has me pissed! 🙂

I’ve noticed that Tesla has poor hardware QA, but software QA is a hundred times worse. Finally, I can have the reliability of crappy desktop software in my mission critical car. Why didn’t anyone think of this before?

Didn’t AP get new hardware, with the price going up by about $500? One theory could be that they prioritized their efforts for the new sensors.

Just thinking: I had an iPhone 6s, downloaded iOS 10 the day it came out, and both speed and battery life suffered compared to iOS 9.xx…Then I got the iPhone 7 preloaded with iOS 10 and it’s a rocket ship with double the battery life…

No. Mobileye was extorting Tesla by drastically raising the price of their hardware… One of many reasons why they have since parted company.

Yup, at this point you’ll get less argument from Apple users who suspect the company of gimping prior generation hardware with new software. That’s the other edge of the OTA update sword. So far, I don’t believe Tesla is deliberately going this route. They’re just being neglectful. That’s all.

Tesla should be careful with this auto feature, in that if one of these cars goes slamming into a group of young kids, it could in theory do major damage to Tesla.

Another question I have: if I run someone over, how much of the blame falls on Tesla?

Foremost, it would damage the group of kids. Tesla should finally take some responsibility before even more incidents happen. At the very least, a software restriction to highways should be a non-issue; the system relies on accurate maps anyway.

Second what Bladd said: the driving motivator should always be protecting life, not the brand!

If it’s driven into kids, the fault lies with the driver, who is ultimately responsible for the operation of the vehicle. When will you Americans stop blaming the autopilot and start blaming the idiot at the wheel who ignored specific instructions to ONLY use autopilot on highways?

So, if you misuse equipment despite the warnings that you have to acknowledge in order to use it–that somehow becomes the fault of the manufacturer and not the individual who intentionally misused it???

Yup. Welcome to the 21st century, in which a lot of people not only don’t mind the Nanny State, they actually prefer it.

I may never be able to drive a Tesla car, but if I do, I certainly won’t appreciate being treated like a child, rather than like an adult who’s responsible for the consequences of his own choices.

If you (driver) don’t believe it’s safer to enable autopilot than driving yourself, no one is forcing you to activate the feature in your Tesla.

Wow, for me (someone who has not tried Tesla’s Autopilot, but has only heard about it and seen some of its marketing), this was surprisingly bad.

Well Ken, when it is used properly and legally as Tesla instructs, i.e., on divided highways, it works well.

When it is misused, as in this video, it doesn’t work very well.

In other words, don’t misuse equipment like trying to trim your fingernails with a chainsaw and you will be ok.

My point is that the discrepancy between the perceived functionality and the real functionality is problematically big. For me this is a big point deduction for Tesla.

The information that is given to you later in the process is a different topic. And this is not equally problematic.

It worked exactly the way it is supposed to work:

#1 When it has clearly visible lane markings on the road to follow, it works.

#2 When it doesn’t, it can’t work properly.

In other words, it was designed to operate on divided highways with clear lane markings, and not on poorly maintained/marked secondary roads like the one this “test” was conducted on.

If you misuse equipment, then you are setting it up to work poorly or even fail.

The article says:

“While these types of roads are not what Autosteer was developed for, it is still surprising that 8.xx performs worse than 7.xx.”

That’s illogical. If the functionality of something is improved, then the functionality is improved towards its intended purpose. At best, there would be a 50-50 chance that when used for something other than its intended purpose, the functionality would deteriorate. Actually, one could argue that the chances are even worse than 50-50, because the better something functions for a specific purpose, the less well it’s likely to work when used for something else.

For example, if you improve the functionality of a hammer, then odds are it will be even less usable as a screwdriver.

I’m sure that if and when Tesla has developed Autopilot/Autosteer to the point that it is intended to be used on roads with two-way traffic, they’ll let us know. Until then, those drivers who insist on using it in other places have only themselves to blame when the results are less than optimal.

And really, why is it even necessary to point this out? Isn’t it perfectly obvious? >:-/

Well, I could not see the lines in places, and I would not expect it to track at all on that road.