Tesla Provides Autopilot Update: 100 Million Miles Driven

MAY 25 2016 BY MARK KANE

Tesla Model S

Experience Tesla Autopilot – Feel for yourself how Autopilot keeps you stress-free and centered with a test drive

Sterling Anderson, Director of Autopilot Programs at Tesla, revealed at the EmTech Digital conference in San Francisco that Tesla cars have already covered 100 million miles with Autopilot on.

For the most part, Tesla’s EVs are of course driven with Autopilot off, but the data-acquisition software is still running, so Tesla has collected a total of 780 million miles of driving data.

Autopilot was first introduced in October 2014, and then widely released with the company’s 7.0 update about a year later.

According to Tesla, Autopilot keeps the vehicle centered in its lane better than average manual driving does.

Steve Jurvetson (Tesla Board of Directors) commented (via Facebook):

“Awesome Autonomous Autos this morning at ‪#‎EmTechDigital‬ with Tesla, Google and Optimus. I heard several details here shared publicly for the first time.

Tesla now gains a million miles of driving data (comparing human to robot safety) every 10 hours. So they log more miles per day than the Google program has gathered since inception. Tesla’s design goal is to be 2-10x better than human drivers.”

From Optimus: “The average Chinese driver spends 3 years of their life looking for parking.”

“We consume 2.9 billion gallons of gas per year just from being stuck in traffic”

“Vehicles are used less than 5% of the time.”

source: The Verge


31 Comments on "Tesla Provides Autopilot Update: 100 Million Miles Driven"


“According to Tesla, Autopilot helps to keep the vehicle in lane center better than average manual driving.”

That hasn’t been my experience, but perhaps I’m a better than average driver. Particularly around construction, or near exit ramps, Autopilot behaves poorly, if not dangerously.

It’s presumptive of Tesla to claim that Autopilot centers the car more precisely than a human, when that measurement itself was taken by Autopilot, not by a third-party testing agency. Of course Autopilot thinks it’s doing a good job! The question is whether its own perception is accurate.

Yes, ’cause Tesla AI lies to itself? AHAHAHAHAHA!!!

You funny. 😀

This can happen for many reasons:

– input lag in processing or in actuating the turn
– optimizations by a human driver to avoid unnecessary lateral forces
– incorrect lane approximations due to noisy signals
… etc, etc.

But by far the single biggest source of potential error, in my opinion, occurs when the car encounters an exit ramp. In this case Autopilot attempts to stay in the middle of what it thinks is a wider lane, when, in fact, it is the beginning of two lanes. A human driver would never do this, but the Autopilot may classify this behavior as “human error”, as clearly the human isn’t in the center of the new mega lane. Silly human!
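The failure mode described above can be illustrated with a toy lane-centering calculation. This is a hypothetical sketch, not Tesla’s actual algorithm: when the right boundary diverges at an exit ramp, a naive midpoint target drifts toward the gore area, while a strategy that tracks the stable boundary does not.

```python
# Toy illustration of the exit-ramp failure mode described above.
# Lane boundaries are lateral offsets in meters from the car's position.
# This is NOT Tesla's algorithm -- just a naive midpoint target for comparison.

def naive_lane_center(left_m, right_m):
    """Target the geometric midpoint of whatever boundaries are detected."""
    return (left_m + right_m) / 2.0

def hug_left_boundary(left_m, half_lane_m=1.8):
    """Human-like heuristic: keep a fixed offset from the stable (left) line."""
    return left_m + half_lane_m

# Normal 3.6 m lane: boundaries at -1.8 m and +1.8 m.
print(naive_lane_center(-1.8, 1.8))   # 0.0 -> stays centered

# At an exit ramp the right line diverges; the "lane" briefly looks 7 m wide.
print(naive_lane_center(-1.8, 5.2))   # ~1.7 -> car drifts toward the gore area

print(hug_left_boundary(-1.8))        # 0.0 -> unaffected by the widening line
```

A real system would of course fuse map data, the exit-ramp geometry, and the lead vehicle’s path; the point is only that a pure two-line midpoint is fragile exactly where this commenter says it is.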

I think you seriously underestimate the software dev team at Tesla.

As for special cases where lanes broaden to split or merge, those are being ironed out using data collection and software improvements. Autopilot is still ‘beta’, but will continue to improve over time.

Given that I use Autopilot daily, I don’t believe I do. It works very well in limited contexts.

However, it still has problems and biases, and it’s very likely that the data being trumpeted above, in typical Tesla self-congratulatory fashion, is a product of those problems and biases.

I’m also curious to see X-axis labels on that lane departure plot.

AP is great, but I think you seriously underestimate Tesla’s marketing team.

This is becoming a running joke. Seeking Alpha professional (well… he makes money off it, anyway) stock short-seller and eternal Tesla Basher Mark B. Spiegel aka “Logical Thought” aka “Four Electrics”, is now pretending he owns a Tesla car so he can have even more excuses to bash Tesla with made-up FUD?

Wow. Just wow. It’s amazing what bilgewater he apparently thinks we’ll swallow.

Yes, that’s clearly the simplest explanation, by far.

I seem to be getting a lot of doubt on these forums:

In case it wasn’t obvious, I should mention that I did not, in fact, let the car make contact with the barrier 😉

That’s a shame that the barrier didn’t help it decide that the right line was probably the one to choose.

I’m not sure I would have let autopilot have a go on that exit anyway but I guess you were monitoring it closely as you did not let it crash!

I wasn’t taking the exit; I was just in the rightmost lane.

And that’s why partial autonomy is terrifying. If the car makes that kind of mistake once per year, most people will relax and stop paying attention long before the car makes the error.

And, of course, human drivers *never* crash into barriers!

The auto pilot program thinks now? My experience has been that the system is continually improving as more miles accumulate and driving scenarios populate the system. It does “learn” so there is some level of AI and it does get “better” as more miles are accumulated. It’s not perfect by any means but then again we know for sure, 100% sure, that humans cause accidents, that over 95% of automotive deaths are the direct result of human mistakes.
I for one am willing to see how much safer the computer can make our cars. Hasn’t technology (in general) made our cars infinitely safer than they were 50 years ago?

I agree.. this information is utter garbage… the AP in my vehicle is on ONLY on the easiest stretches of roadway, because in my experience on other sections where it is really put to the test.. it fails miserably and I must take over.. sometimes it is downright scary. . and I will NOT engage the AP again in any situation anywhere close to the one where that occurs.. I drive the car manually 90% of the time.. most of these guys making comments here don’t even have a Tesla.. and I do.. and I can state unequivocally that the system is riddled with shortcomings. .. and anyone that trusts it is gonna eventually regret that trust.. I have 5200 miles on my 2016 MS 90D.. and that AP is another gimmick. . The adaptive cruise control also stinks.. it brakes too rapidly.. and causes issues changing lanes on the highway with its slowing .. it also reads vehicles in other lanes and slows for no reason at times.. very annoying.. I wish I could disable that too and just have standard cruise control back like in my other cars…

My experience of AP after a similar number of miles (about 3 thousand of which have been done using AP) is completely the opposite. Sure, there are some issues, but the only serious one that has given me any real cause for concern is when the car encounters a brow in the road steep enough that the camera loses sight of the white lines on the other side of the brow. On a straight it isn’t an issue, but on a bend, the car just goes straight on, cutting into the next lane. That said, the ultrasonics would still prevent a collision with an adjacent car but without any traffic or barrier on the relevant side, it is quite an exciting experience.

I love it, only, I do wish it had a setting to restrict its top speed to that of the current limit – getting one speeding ticket after 25 years of having none at all was bad enough, but 2!

Well, if you compare the number of crashes attributable to Tesla’s AP per 100 million miles (0 – ever) to that for vehicles being driven by humans (~150 annually), it kind of proves the point!

Note in the graph above that the human CHOOSES to offset themselves slightly right of absolute center, unlike an algorithm that defines center purely by mathematics rather than by prior defensive-driving experience.

Leaving less of your left side exposed while allowing others to pass isn’t a bad idea. I would prefer the software to be more like the human in this regard.

Lane variance would be expected to be significantly lower for a computer than for a human, all variables considered. But that’s not what usually causes fatal accidents. If it were, most of us would be dead by now.

+1 My thoughts exactly

In addition to lane centering in straight highway situations, I would like to see AutoPilot use speed and lane position to keep the balance/traction of the vehicle more stable on corners.

Keep in mind I have zero AutoPilot miles myself, but I always choose to have cruise control turned off if there are curvy sections of highway. I can feel the lower level of grip because cruise control doesn’t modulate speed to properly set up the vehicle for the corner ahead.

I guess it is the AutoX driver in me that can’t be turned off.

There have been at least 2 improvements to AP with respect to dealing with cornering that I am aware of. It now anticipates sharper bends (and roundabouts) and slows down accordingly. I think it does a pretty good job. I don’t know if it works differently according to road surface/weather conditions yet but I would like to think so. It would be easy to measure the differential slip between the 4 wheels to gauge tyre/road surface friction and add extra slowing to deal with slippery conditions.
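The differential-slip idea in the comment above can be sketched roughly like this. This is a hypothetical back-of-envelope illustration, not anything from Tesla’s firmware; real traction and stability systems fuse many more signals (IMU, steering angle, brake pressure, and so on).

```python
# Rough sketch of slip-ratio estimation from four wheel speeds (m/s), as the
# comment above suggests. Hypothetical illustration only.

def slip_ratios(wheel_speeds, reference_speed):
    """Slip ratio per wheel: (wheel - reference) / reference."""
    return [(w - reference_speed) / reference_speed for w in wheel_speeds]

def slippery(wheel_speeds, reference_speed, threshold=0.1):
    """Flag the surface as slippery if any wheel slips beyond the threshold."""
    return any(abs(s) > threshold
               for s in slip_ratios(wheel_speeds, reference_speed))

# Dry road: all wheels close to vehicle speed (25 m/s).
print(slippery([25.0, 25.1, 24.9, 25.0], 25.0))   # False

# Wet patch: one driven wheel spins up ~20% faster than the car is moving.
print(slippery([25.0, 30.0, 24.9, 25.0], 25.0))   # True
```

A system detecting slip this way could then add extra slowing ahead of bends, as the commenter proposes.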

Totally disagree!
A perfectly centered car in a lane is what we need.
Roads leave ample space for everyone to pass or overtake.
This is from the POV of a cyclist who is tired of being run off the road by drivers who are very defensive of themselves, without reason, and squeeze anything to their right!
I use my car without squeezing anyone.

I don’t have autopilot, but I do have ACC, which doesn’t do lane centering. I turned off the dang lane departure warning; it was very annoying with Texas lines. The issues I have with automated lane centering include:

– determining what is ‘center’, as in comments above. Texas has very bad lines in some areas, and lanes unexpectedly merge into a single lane as well.
– avoiding potholes in the part of the lane where tires would hit if exactly centered.
– using both lines to ‘center’ seems counter-intuitive. Human drivers tend to hug one side or the other depending on conditions and lane choice. A robot driver would just calculate the exact center and stay there.
– for U.S. cars, I would probably program it to track the line that is NOT the edge of the road and calculate a good margin that may not be exact center.
– other human behavior. There is a piece of road here where the lines don’t match up with the way the road was built. The humans hug the turn lane and pretty much make two marked lanes into three driving lanes. A computer couldn’t do that easily. However, observation of other drivers would make it…

You are thinking exactly like me (comment above), but have organized the thoughts better.

I believe all of the inputs to the Tesla Autopilot system are there to support all of the improvements you have suggested. It will just be a matter of time and priority for Tesla to implement them.

I like this steady drumbeat of good data that Tesla is releasing. This is what needs to be done to get the public and politicians to accept this new technology.

Good data? Or sales spiel? Spin doctors bro’.. get in the car.. let the AP drive.. you’re gonna see it ain’t all that.. I drive my Tesla manually 90% of the time cause that AP is scary sheeet.. only for lonesome stretches of highway.. this information is sales garbage.. take it from an owner/lessee..

Reading the title I thought this article would be about a software update for the autopilot…

So the roads will be worn twice as fast since all cars drive in exactly the same tracks?

Like Doctor Reed Richards, that is quite a stretch.

“So they log more miles per day than the Google program has gathered since inception. ”

Thus the proof that Tesla owners are beta testers… =)

Or they are crowd-sourced testers who pay to be part of the beta-testing program.

My understanding is that they upload new software to cars on a regular basis and then “dry run” the autopilot, allowing them to test the software under real world conditions before rolling it out.

This is clever, but it’s not “real” testing, in that the autopilot does not have control of the car, so you can only exercise various passive code paths. This makes it difficult to tell whether the software would do a good job. If it does a bad job while live, the driver will override it, but when it’s trapped in the box, there’s no such signal.
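Shadow-mode testing as described in these last comments can be sketched like so. This is a hypothetical outline, not Tesla’s actual pipeline: the candidate software computes its decision on live sensor data but never actuates, and only its disagreements with the human driver are logged for later analysis.

```python
# Minimal sketch of "shadow mode" testing as described above: the candidate
# autopilot runs passively on real inputs, and we log where it disagrees
# with the human driver. Hypothetical illustration only.

def shadow_run(frames, candidate_policy, steer_tolerance_deg=2.0):
    """Compare candidate steering output to human steering, frame by frame."""
    disagreements = []
    for frame in frames:
        proposed = candidate_policy(frame["sensors"])  # computed, never actuated
        if abs(proposed - frame["human_steer_deg"]) > steer_tolerance_deg:
            disagreements.append({"t": frame["t"],
                                  "proposed": proposed,
                                  "human": frame["human_steer_deg"]})
    return disagreements

# Toy policy and log: the candidate always steers straight (0 degrees).
frames = [
    {"t": 0.0, "sensors": None, "human_steer_deg": 0.5},
    {"t": 0.1, "sensors": None, "human_steer_deg": 8.0},  # human takes a curve
]
log = shadow_run(frames, candidate_policy=lambda sensors: 0.0)
print(len(log))   # 1 -- only the curve frame is flagged
```

As the comment above points out, this only exercises the passive path: the candidate never experiences the consequences of its own actions, so closed-loop failures stay hidden.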