This Video Reveals What Tesla Autopilot Actually Sees


What exactly is Tesla’s Autopilot system “seeing”?

This is not the first time we’ve seen and shared what Tesla Autopilot cameras “see.” However, this newest video takes it to another level, with interpretations of how the technology processes the data it receives. Additionally, the creator of this video includes some clever assumptions about the system, which help make the visuals easier to understand.

Tesla shared early videos of its recognition feed not long after it debuted its Full Self-Driving technology. However, the automaker still hasn’t carried out its promised coast-to-coast self-driving trip, and the software is not yet available. Since that initial wave of videos, we haven’t received much more information.

Thanks to the hacker community, we now have more detailed insight.

Video Description via greentheonly on YouTube:

Overlaid with some data in addition to the usual circles:


Color of the circle represents the type: green – moving, yellow – stopped, orange – stationary, red – unknown.


Size of the circle represents the distance to the object (NOT THE SIZE OF THE OBJECT). If my assumptions are correct, you should imagine that the circle has about a 1.5~2m diameter in 3D space at the object’s location (but it does not get smaller than 100 pixels).


When a circle is drawn with a thicker line, that means the object has a label and is most likely being tracked. When it fades in and out, that is caused by the changing probability of existence.
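The legend above maps cleanly onto a few simple rendering rules. Here is a minimal sketch of how such an overlay could be computed; the field names, the focal-length constant, and the exact 3D diameter are assumptions for illustration, not values extracted from Autopilot firmware.

```python
# Sketch of the overlay rules described in the video legend.
# ASSUMPTIONS: the 1.75 m diameter, the focal length, and the line
# widths are illustrative guesses, not reverse-engineered values.

OBJECT_COLORS = {
    "moving": "green",
    "stopped": "yellow",
    "stationary": "orange",
    "unknown": "red",
}

ASSUMED_DIAMETER_M = 1.75   # circle drawn as if ~1.5-2 m wide in 3D space
FOCAL_LENGTH_PX = 1000.0    # assumed camera focal length, in pixels
MIN_DIAMETER_PX = 100.0     # floor noted in the video description


def circle_diameter_px(distance_m: float) -> float:
    """Project the assumed 3D diameter to screen pixels.

    Nearer objects draw larger, so circle size encodes distance,
    not object size; the result never drops below 100 px.
    """
    projected = ASSUMED_DIAMETER_M * FOCAL_LENGTH_PX / distance_m
    return max(projected, MIN_DIAMETER_PX)


def circle_style(obj_type: str, tracked: bool, existence_prob: float) -> dict:
    """Combine the legend rules into one style record for a renderer."""
    return {
        "color": OBJECT_COLORS.get(obj_type, "red"),  # red = unknown type
        "line_width": 3 if tracked else 1,            # thicker => labeled/tracked
        "alpha": existence_prob,                      # fades with probability
    }
```

Under these assumptions, a moving car 10 m away would draw as a 175 px green circle, while the same car at 100 m would hit the 100 px floor.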

Source: Electrek




7 Comments on "This Video Reveals What Tesla Autopilot Actually Sees"


If it sees all critical objects, why does it have this crazy tendency to just lock onto the vehicle ahead or the perceived lane markings, allowing a crash into a large solid stationary object at high speed?

I think that can be mostly explained by Elon’s stubborn refusal to use LIDAR tech.

True, but then neither is any other auto maker putting active lidar scanners into its production cars, yet. Cadillac Super Cruise has the same limitations; at highway speeds it won’t brake for stationary obstacles, either.

People expect more from Tesla cars just because they are Tesla cars. In this case, expecting Level 3/4 performance from Tesla Autopilot+AutoSteer, which is an advanced Level 2 system, is expecting far too much.

Uber’s autonomous car was equipped with LIDAR, and of course you know what happened.

An entirely relevant quote:

A natural reaction to these incidents is to assume that there must be something seriously wrong with Tesla’s Autopilot system. After all, you might expect that avoiding collisions with large, stationary objects like fire engines and concrete lane dividers would be one of the most basic functions of a car’s automatic emergency braking technology.

But while there’s obviously room for improvement, the reality is that the behavior of Tesla’s driver assistance technology here isn’t that different from that of competing systems from other carmakers. As surprising as it might seem, most of the driver-assistance systems on the roads today are simply not designed to prevent a crash in this kind of situation.

Sam Abuelsamid, an industry analyst at Navigant and former automotive engineer, tells Ars [Technica] that it’s “pretty much universal” that “vehicles are programmed to ignore stationary objects at higher speeds.”

Cool vids… hope to see it get more sophisticated by the month these days as it begins to teach itself fleetwide what is important and what to ignore.