Andrej Karpathy, Tesla’s Director of Artificial Intelligence and Autopilot Vision, is one of the chief architects of Tesla’s self-driving vision. In July, he hosted a workshop on Neural Network Multi-Task Learning, where he offered some detailed insights on Tesla’s use of AI in developing its Autopilot features.

  • This article comes to us courtesy of EVANNEX (which also makes aftermarket Tesla accessories). Posted by Charles Morris. The opinions expressed in these articles are not necessarily our own at InsideEVs.

Above: A look at Tesla's Autopilot (Image: Tesla)

Now Karpathy is featured in a new video in which he describes how Tesla is using PyTorch, an open-source machine learning library, to develop full self-driving capabilities for its vehicles, including Navigate on Autopilot and Smart Summon.

Karpathy explains that, unlike other companies working on self-driving, Tesla doesn’t use lidar or high-definition maps, so the Autopilot system relies on AI to parse information from the eight cameras mounted around the vehicle. Tesla is a fairly vertically integrated company, so it has control of the “full stack” when it comes to AI. The machine learning process is built around “hydranets,” so called because each has a shared backbone and multiple heads (like the Hydra of Greek mythology). Karpathy demonstrates how Tesla’s Smart Summon feature uses hydranets to figure out how to negotiate a parking lot.
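To make the "hydranet" idea concrete, here is a minimal PyTorch sketch of a network with one shared backbone feeding several task-specific heads. This is purely illustrative: Tesla's actual architecture is not public, and the layer sizes and head names (lane lines, traffic lights, objects) are assumptions chosen for the example.

```python
# Illustrative sketch only: Tesla's real hydranets are not public.
# The point is the structure Karpathy describes: a shared backbone
# computed once, with multiple task heads branching off it.
import torch
import torch.nn as nn

class HydraNet(nn.Module):
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        # Shared backbone: one feature extractor reused by every task
        self.backbone = nn.Sequential(
            nn.Conv2d(3, feat_dim, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        # Multiple heads, one per task (hypothetical task names)
        self.heads = nn.ModuleDict({
            "lane_lines": nn.Linear(feat_dim, 4),
            "traffic_lights": nn.Linear(feat_dim, 3),
            "objects": nn.Linear(feat_dim, 10),
        })

    def forward(self, x):
        features = self.backbone(x)  # backbone runs once per frame
        # Each head makes its own prediction from the shared features
        return {name: head(features) for name, head in self.heads.items()}

model = HydraNet()
out = model(torch.randn(1, 3, 128, 128))  # one dummy camera frame
print({name: tuple(t.shape) for name, t in out.items()})
```

The payoff of this structure is efficiency: the expensive backbone computation is shared across all tasks, so adding another prediction head costs far less than training a separate network per task.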

Above: Andrej Karpathy discusses the development of Tesla's Autopilot and Smart Summon features (YouTube: PyTorch)

Karpathy’s talk gets very technical very quickly - only those with a background in machine learning are likely to be able to follow the full story here. However, even we laypeople can appreciate the incredible complexity of teaching a computer to drive a car. According to Karpathy, compiling a full build of Autopilot 2.0 involves some 48 different networks, 1,000 distinct predictions and 70,000 GPU hours. And of course, this is no one-time project - the software is continuously being improved, so it must be frequently re-compiled and updated.

This continuous improvement is driven by the massive amounts of data pouring in from the fleet of Teslas on the world’s roads - an asset no other company working on autonomous driving enjoys. Karpathy tells us that the Navigate on Autopilot feature has now accumulated over a billion miles of real-world usage in over 50 countries, including 200,000 automated lane changes. The Smart Summon feature has been used in over 500,000 sessions in the short time since it was introduced. Keep this figure in mind the next time some pundit declares Smart Summon a failure because of a handful of YouTube videos of comical parking lot mishaps.


Written by: Charles Morris; Source: PyTorch

  • InsideEVs Editor’s Note: EVANNEX, which also sells aftermarket gear for Teslas, has kindly allowed us to share some of its content with our readers, free of charge. Our thanks go out to EVANNEX. Check out the site here.