What does the world look like in Tesla Autopilot Version 9?
We've seen many different videos showing what Tesla Autopilot "sees," thanks to people in the Tesla community with the know-how to hack into the system and capture the camera feeds. This newest video is unique not only because it's the first to show Software Version 9, but also because it lets us see six feeds at the same time.
The new software update finally allows Tesla's neural net to work with all cameras. While we can't say definitively how much Tesla Autopilot has improved as a result of the update, you can clearly see that the cameras are taking in an impressive amount of the environment surrounding the car, and the technology seems to "see" and identify what it's supposed to. Now, it's just a matter of time before all of this valuable data can be processed to a level that assures vastly improved results.
If you've used Tesla Autopilot with the new Software Version 9, we'd love to read your impressions in the comment section.
Video Description via greentheonly on YouTube:
Seeing the world in autopilot v9
This is running 18.39.6 firmware. All autopilot cameras are utilized.
This video does not show "narrow" stream because it's somewhat redundant with main.
You may notice certain choppiness on all the cameras other than "main." This is because while the main camera is captured at 36fps and is therefore very smooth, the "fisheye" is only captured at 6fps and the sides at 9fps (the car itself captures all cameras at 36fps and the backup camera at 30fps). I needed to limit the rate of the intake to not overwhelm my storage device - it is still struggling a bit, and that's why the framerate sometimes drops on some cams.
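The rate limiting described in the video notes boils down to frame decimation: writing only every n-th frame so a 36fps stream lands at roughly 6 or 9fps. A minimal sketch of that idea (a hypothetical illustration, not greentheonly's actual capture tooling):

```python
def decimate(frame_indices, source_fps, target_fps):
    """Keep every n-th frame so a source_fps stream is stored at ~target_fps.

    frame_indices: iterable of integer frame numbers from the camera.
    Returns the subset of frames that would actually be written to storage.
    """
    # e.g. 36fps source, 6fps target -> keep every 6th frame
    step = max(1, round(source_fps / target_fps))
    return [i for i in frame_indices if i % step == 0]

# Fisheye throttled from 36fps to 6fps: 6 of every 36 frames survive
kept = decimate(range(36), 36, 6)
# kept == [0, 6, 12, 18, 24, 30]
```

Dropping whole frames like this keeps the surviving frames intact (no re-encoding needed) at the cost of the visible choppiness the video description mentions.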