YouTuber Fools Tesla Autopilot With A Painted Wall (Updated)
Call it the "Road Runner test." And it shows the weaknesses of Tesla's camera-only approach to autonomous driving.
Update: As The Verge pointed out today, Rober's video contains a number of potential issues. Those include the apparent lack of Autopilot being switched on during portions of the test, the use of multiple takes and the fact that Luminar itself promoted the video. We have reached out to Rober for comment and will update if we hear back. Our original story follows below.
Tesla CEO Elon Musk is adamant that the company's future lies not so much in matching competitors like BYD at making electric vehicles as in building cars that can fully drive themselves. Yet Tesla's approach to autonomous driving technology is a unique and controversial one, because it relies solely on cameras and artificial intelligence rather than additional sensors like radar and lidar.
But a new video from YouTuber and engineer Mark Rober is a good illustration of the weaknesses of this approach. On his channel, the former NASA engineer and CrunchLabs founder outlines a test that's lifted straight from an old Looney Tunes short: can a fake, painted wall "fool" a Tesla on Autopilot into crashing?
Unfortunately, since Tesla's engineers haven't yet invented a way to zoom through the fake tunnel like the Road Runner, the car does, in fact, slam right into the wall like Wile E. Coyote would.
The relevant testing begins midway through the video. For comparison's sake, Rober enlists the help of a lidar-equipped Lexus SUV test vehicle from the company Luminar. At first, both vehicles pass the same test: stopping when a simulated child pops up in front of them while driving. Rober then shows the differences between how the two vehicles "see" the world.
But the real moment of truth comes later when Rober and his crew deploy the wall, which is painted up to look just like the road, sky and surrounding area.
The lidar-equipped Lexus detects the wall and stops ahead of it. The Tesla, unfortunately, does not. "I can definitively say for the first time in the history of the world, Tesla's optical camera system would absolutely smash through a fake wall without even a slight tap on the brakes," Rober said.
While Tesla's Autopilot and Full Self-Driving systems have improved dramatically over the years, they remain linked to dozens of crashes, some of them fatal, and have sparked a number of state and federal investigations. Yet many longtime autonomy experts say that Tesla's camera-based approach—which Musk touts because it's similar to how humans drive cars—will always be inadequate if the goal is to create self-driving cars that are safer and better than humans.
It's unlikely that you'll accidentally drive into any fake walls yourself, since you aren't a cartoon character. But the video does make you wonder what else Tesla's camera-only system might miss, especially if future cars don't even come with steering wheels or pedals.
Contact the author: patrick.george@insideevs.com