Cumulative Tesla Autopilot Miles Now Over 222 Million


Tesla software update 8.0 – Autopilot Enhancements

A week or so ago, Tesla CEO Elon Musk posted an Autopilot mileage update on Twitter.

That’s nearly a quarter billion miles, compared to Google’s 1.5 million miles of autonomous driving via its Lidar-equipped cars.

Autopilot 8.0 Sees Ahead

The Autopilot miles driven are growing exponentially, with more and more Teslas joining the fleet around the world every day.

Teslarati reports:

“Tesla vehicles built since October 2014 log data from every mile driven, whether the car is operating in Autopilot mode or not. Even if an owner forgoes the purchase of Autopilot, the car still transmits wireless driving data directly to Tesla and into its cloud based machine learning network.”

Tesla analyzes this data both to train the vehicles to improve Autopilot behavior and to continue updating the operation of the various Autopilot systems.


That’s progress made possible by collection of data from lots of vehicles over time in all sorts of different situations.
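The fleet-learning loop described above can be sketched roughly in code. This is a toy illustration only — every class and field name here is hypothetical, since Tesla's actual telemetry pipeline is not public; the point is just that every car logs data, and the rare, interesting events feed back into training:

```python
from dataclasses import dataclass, field

@dataclass
class DriveEvent:
    """One logged telemetry sample from a car (hypothetical schema)."""
    speed_mph: float
    autopilot_on: bool
    disengaged: bool  # driver took over from Autopilot

@dataclass
class FleetLearner:
    """Toy aggregator: every car reports events — whether Autopilot is
    engaged or not — and disengagements are flagged for retraining,
    mirroring the idea that fleet miles feed back into the software."""
    events: list = field(default_factory=list)

    def ingest(self, event: DriveEvent) -> None:
        self.events.append(event)

    def training_candidates(self) -> list:
        # Disengagements are the interesting "edge cases" worth learning from.
        return [e for e in self.events if e.autopilot_on and e.disengaged]

learner = FleetLearner()
learner.ingest(DriveEvent(65.0, True, False))
learner.ingest(DriveEvent(42.0, True, True))    # driver took over
learner.ingest(DriveEvent(30.0, False, False))  # Autopilot off, still logged
print(len(learner.events), len(learner.training_candidates()))  # 3 1
```

Note that the third event is logged even though Autopilot is off, matching the Teslarati quote above: the car transmits driving data regardless of whether the owner bought Autopilot.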

Source: Teslarati

Categories: Tesla



10 Comments on "Cumulative Tesla Autopilot Miles Now Over 222 Million"


It is hard to directly compare AP miles with Google L4 autonomous miles. The miles are accumulated in different traffic conditions with different sensors and different drivers (assisted vs fully autonomous). It’s like comparing radar cruise control miles to AP miles.

You missed the point.

Tesla is increasing its AP mileage total much faster than Google or any of the other laggard auto OEMs, and the database of those miles is being used to improve the software.

It's a treasure trove that gives Tesla a big competitive advantage going forward.

Once the Model 3 starts delivering in numbers, most likely with AP2, the data will truly grow exponentially, which certainly scares the hell out of shorters/haters and the competition.

I don’t think he missed the point at all.

Autonomous driving doesn’t really get difficult until it sees someone at a crosswalk and tries to figure out from their facial expression and eye movement whether they are waiting for a bus or wanting to cross the road. There are 1000 problems just like this that need to be solved. You do this ten times a day while driving without even noticing.
All of Tesla's public miles with the current AP hardware are useless for helping Tesla learn those. Google's miles are aimed directly at that.

Tesla's miles will make Tesla the best at freeway driving on cheap hardware… but only its internal driving on different hardware will start to contribute meaningfully toward L4.

Wow. Facial recognition and predictive analysis based on micro-expressions and body language? Really?

How about this: a self-driving car needs to be able to recognize the presence of a pedestrian crosswalk and observe traffic signals. It also needs to be able to recognize obstacles, including large moving obstacles such as pedestrians and other vehicles, and use real-time vector analysis to avoid collisions with them. It certainly does not need to be so needlessly complex that it's trying to read someone's mind to figure out whether they plan on stepping out in front of a moving car, or otherwise trying to predict things from such unreliable and difficult-to-perceive indicators.

Now, I can see the utility of a software routine that activates a noise-maker if the car detects a pedestrian crosswalk, and also detects pedestrians (or vaguely pedestrian-shaped objects) near the crosswalk but not actually in it, to warn them of the car's quiet approach.

"You do this ten times a day while driving without even noticing." I may be able to figure out if someone is planning on crossing the street because of their movements, but certainly not because of their expression…
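The "real-time vector analysis" the commenter mentions can be illustrated with a minimal closest-approach check. This is a sketch under simplified assumptions (2D geometry, constant velocity, an assumed 2 m safety radius) — not any production collision-avoidance system:

```python
import math

def time_to_collision(rel_pos, rel_vel):
    """Seconds until closest approach, if the obstacle's path comes
    within an assumed 2 m safety radius of the car. Inputs are the
    obstacle's position and velocity relative to the car (metres,
    metres/second). Returns math.inf if the paths diverge or never
    come close enough."""
    px, py = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    if v2 == 0:
        return math.inf  # no relative motion
    # Time minimizing |p + v*t| — the moment of closest approach.
    t = -(px * vx + py * vy) / v2
    if t <= 0:
        return math.inf  # already moving apart
    # Distance at closest approach.
    cx, cy = px + vx * t, py + vy * t
    if math.hypot(cx, cy) > 2.0:  # assumed 2 m safety radius
        return math.inf
    return t

# Pedestrian 20 m ahead, closing head-on at 10 m/s:
print(time_to_collision((20.0, 0.0), (-10.0, 0.0)))  # 2.0
```

The same call with the pedestrian moving away, `time_to_collision((20.0, 0.0), (10.0, 0.0))`, returns `inf` — which is the commenter's point: this is straightforward geometry on perceived obstacle tracks, with no mind-reading involved.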

Yes… detailed movements and head direction, along with figuring out when someone is waving you through, are the minimum you need for reasonable driving.
Driving safely isn't just about avoiding at-fault accidents but also not-at-fault situations, like a cyclist not giving way at a stop sign.

For an example (from more than a year ago): at that time the Google car could already tell the difference between a cyclist waiting at a stop sign and a cyclist starting to pedal, but a cyclist doing a track stand was classified as cycling, so the car gave way.
I expect the Google car has since gotten better at identifying cyclists doing track stands.
If you want to be able to drive without a human (and more safely than a human), you need to be able to deal with all circumstances.

That is irrelevant if Tesla's Autopilot is used in the same place over and over again by commuters going up and down the same highway every day. That usage doesn't add much to the knowledge base. What Google and every other developer of autonomous driving does is try to use it in as many different environments as possible, which really adds to their knowledge.

That’s a fallacy. It would only be true if the car encountered exactly the same conditions every day, with no low-probability or “edge case” events ever happening.

Near-accidents happen with much greater frequency than actual accidents… fortunately! No matter where a car is driving, it can provide data from near-accidents and unusual events which could potentially help improve the operation of an autonomous car.

I have seen nothing indicating that Google's sensors are reading pedestrians' facial expressions or eye movements.

That would of course involve cameras, and Tesla uses those too.

In any case, Tesla will continue to upgrade its sensor suites and hardware, and its existing database of logged miles will be eminently useful, as will the rapidly growing number of miles being added.

Tesla, unlike Google, much less the laggard auto OEMs, is implementing and innovating at a faster pace than the others, and this is as big an advantage in the march toward eventual full autonomy as it is in the march to fielding and selling compelling BEVs.

I guess having the ability to go fully autonomous from on-ramp to off-ramp is a first major step. I would think the next step is not necessarily dodging kids chasing balls into the street in a residential area, or people crossing the road, but rather being able to handle twisty two-lane roads, farm tractors mixed in with car and truck traffic, and so on. Basically, first you nail full auto on Interstate roads, then state roads, then county roads, and finally city streets!

When a Tesla can drive Pikes Peak in a race, on Autopilot, and make a record time, we will know full autonomy is quite close! It will have to learn about sudden road direction changes, grade changes, surface or traction changes, etc. to navigate such a course, as well as have precisely mapped the route to within 3-4″ of lateral precision, if not even tighter tolerances.

That might sound tight, but it is likely already within the capabilities of Autopark, so it has a baseline. I would like to hear what kind of rim/tire-to-shoulder-curb separation distances are current, and how consistent that ends up, time by time, driver…

Since all of this data is available, how many of those miles were spent driving over 30 mph? I will bet not that many.
Tesla's testing procedures do not represent ANYTHING. Show us ALL of the data.