Tesla’s version 12 update for its Full Self-Driving software (FSD V12) might herald a new era for semi-autonomous driving. CEO Elon Musk was live on X, formerly Twitter, a few days ago testing a Model S equipped with FSD V12. What we saw and heard from the head honcho indicated how far the brand has come in developing its artificial intelligence-driven autopilot.
The live stream was pixelated. Musk sat behind the yoke-style steering wheel of the Model S, recording with a phone in his hand, accompanied by Ashok Elluswamy, head of the Autopilot software team. The electric sedan chauffeured the two along Palo Alto’s tree-lined streets on a sunny day during rush hour, around the brand’s engineering headquarters.
Tesla’s FSD V12 works like a human brain, using neural nets and eyes (cameras), said Musk. There’s no line of code instructing the vehicle to slow down for speed bumps, give clearance to cyclists, or stop at a stop sign, he added. Instead, the system trains on videos from the millions of Teslas already plying streets globally and mimics human drivers.
The eyes of the system are eight cameras running at 36 frames per second. The modules can theoretically run at 50 fps but are limited by the cameras, Musk indicated. He also mentioned that test-driving was underway in countries across several continents, including New Zealand, Thailand, Norway, and Japan. Musk also added:
You definitely need a lot of training data to make it work. You need millions of dollars of training hardware. And you need to run the neural net training hardware. It’s not easy. The mind-blowing thing is that there’s no line of code.
At one roundabout, the Model S waited for two approaching cars to pass and resumed driving when the roundabout was clear. It also appeared to stick to lanes, recognize pedestrians and cyclists, and make active decisions. Musk also explained how different V12 was from V11:
There is no line of code that says there is a roundabout, which is what we have in the explicit control stack in version 11. There are over 300,000 lines of C++ in version 11, and there’s basically none of that in version 12.
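Musk’s point is architectural: V11 encodes each scenario as an explicit, hand-written rule in its C++ control stack, while V12 replaces that stack with a single learned mapping from camera pixels to driving controls. A minimal, purely illustrative sketch of the contrast (all names and logic here are hypothetical stand-ins, not Tesla’s actual code):

```python
# Illustrative sketch only -- not Tesla's actual code. It contrasts the two
# approaches described above: explicit per-scenario rules (V11-style) versus
# a single learned function from camera input to controls (V12-style).

# V11-style: a hand-written rule exists for every scenario the car may meet.
def v11_style_control(scene):
    if scene.get("roundabout"):
        return "yield_until_clear"
    if scene.get("stop_sign"):
        return "stop"
    return "cruise"

# V12-style: one learned function maps camera frames to a control output.
# A trivial weighted sum stands in for the real neural net here; in the
# actual system no rule anywhere mentions "roundabout" or "stop sign".
def v12_style_control(camera_features, weights):
    score = sum(w * f for w, f in zip(weights, camera_features))
    return "brake" if score < 0 else "accelerate"

print(v11_style_control({"stop_sign": True}))                # -> stop
print(v12_style_control([0.2, -0.9, 0.4], [1.0, 1.0, 1.0]))  # -> brake
```

The tradeoff the article hints at: explicit rules are auditable but must be written for every case, whereas a learned policy’s behavior depends entirely on the quality of its training data, which is why Musk stresses that mediocre data does not improve driving.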
It wasn’t all perfect. At one intersection, the Model S slowed at a stop sign instead of halting completely and rolled over the painted stop marker before continuing. Only 0.5 percent of humans actually stop at stop signs, Elluswamy said, citing data Tesla had analyzed.
Tesla artificially trained the system to obey stop signs at regulators’ insistence, according to Musk. He also said that data quality was very important and that large amounts of mediocre data do not improve driving.
At one point, the system appeared confused at a traffic light and almost jumped a red before Musk intervened, indicating that FSD V12 still needs more training before it is safe for public release. Musk nevertheless claimed that the system had the ability “to figure it out” and could “understand signs without reading.”
What do you think about FSD V12? Leave your thoughts in the comments.