Despite Adversity And Impeding Issues, Tesla Autopilot Will Prevail


MAY 20 2018 BY EVANNEX


Tesla took an early lead in the race to develop vehicle autonomy, and its Autopilot system remains the state of the art. However, the technology is advancing more slowly than the company predicted – Elon Musk promised a coast-to-coast driverless demo run for 2018, and we’re still waiting. Meanwhile, competitors are hard at work on their own autonomy tech – GM’s Super Cruise is now available on the CT6 luxury sedan.

*This article comes to us courtesy of EVANNEX (which also makes aftermarket Tesla accessories). Authored by Charles Morris. The opinions expressed in these articles are not necessarily our own at InsideEVs.


Is Tesla in danger of falling behind in the self-driving race? Trent Eady, writing in Medium, takes a detailed look at the company’s Autopilot technology, and argues that the California automaker will continue to set the pace.

Every Tesla vehicle produced since October 2016 is equipped with a hardware suite designed for full self-driving (SAE Level 5 autonomy), including cameras, radar, ultrasonic sensors and an upgradable onboard computer. Around 150,000 of these “Hardware 2” Teslas are currently on the road, and could theoretically be upgraded to self-driving vehicles via an over-the-air software update.

Tesla disagrees with most of the other players in the self-driving game on the subject of lidar, a technology that calculates distances using pulses of infrared laser light. Waymo, Uber, and others seem to regard lidar as a necessary component of any self-driving system. However, Tesla’s Hardware 2 sensor suite doesn’t include it, instead relying on radar and optical cameras.

Lidar’s strength is its high spatial precision – it can measure distances much more precisely than current camera technology can (Eady believes that better software could enable cameras to close the gap). Lidar’s weakness is that it functions poorly in bad weather. Heavy rain, snow or fog causes lidar’s laser pulses to refract and scatter. Radar works much better in challenging weather conditions.

According to Eady, the reason that Tesla eschews lidar may be the cost: “Autonomy-grade lidar is prohibitively expensive, so it’s not possible for Tesla to include it in its production cars. As far as I’m aware, no affordable autonomy-grade lidar product has yet been announced. It looks like that is still years away.”

If Elon Musk and his autonomy team are convinced that lidar isn’t necessary, why does everyone else seem so sure that it is? “Lidar has accrued an aura of magic in the popular imagination,” opines Mr. Eady. “It is easier to swallow the new and hard-to-believe idea of self-driving cars if you tell the story that they are largely enabled by a cool, futuristic laser technology…It is harder to swallow the idea that if you plug some regular ol’ cameras into a bunch of deep neural networks, somehow that makes a car capable of driving itself through complicated city streets.”

Those deep neural networks are the real reason that Eady believes Tesla will stay ahead of its competitors in the autonomy field. The flood of data that Tesla is gathering through the sensors of the 150,000 or so existing Hardware 2 vehicles “offers a scale of real-world testing and training that is new in the history of computer science.”

Competitor Waymo has a computer simulation that contains 25,000 virtual cars and generates data from 8 million miles of simulated driving per day. Tesla’s real-world data is of course vastly more valuable than any simulation data could ever be, and the company uses it to feed deep neural networks, allowing it to continuously improve Autopilot’s capabilities.

A deep neural network is a type of computing system that’s loosely based on the way the human brain is organized (sounds like the kind of AI that Elon Musk is worried about, but we’ll have to trust that Tesla has this under control). Deep neural networks are good at modeling complex non-linear relationships. The more data that’s available to train the network, the better its performance will be.
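The idea can be illustrated with a toy example. The sketch below (a generic NumPy illustration, not anything from Tesla’s actual stack) trains a tiny two-layer network by gradient descent to fit XOR – a classic non-linear relationship that no single linear model can capture:

```python
import numpy as np

# Toy illustration: a two-layer network trained by gradient descent
# to fit the XOR function. Layers plus a non-linear activation are
# what let the network model non-linear relationships.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)   # hidden-layer weights
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)   # output-layer weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):
    h = np.tanh(X @ W1 + b1)            # hidden activations (non-linear)
    p = sigmoid(h @ W2 + b2)            # predicted probability
    dlogit = (p - y) / len(X)           # grad of cross-entropy wrt logits
    dW2, db2 = h.T @ dlogit, dlogit.sum(0)
    dh = dlogit @ W2.T * (1 - h ** 2)   # backprop through tanh
    dW1, db1 = X.T @ dh, dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

pred = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
```

The same principle – more training data yields better fits – is what makes Tesla’s fleet data valuable, just at a vastly larger scale.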

“Deep neural networks started to gain popularity in 2012, after a deep neural network won the ImageNet Challenge, a computer vision contest focused on image classification,” Eady explains. “For the first time in 2015, a deep neural network slightly outperformed the human benchmark for the ImageNet Challenge…The fact that computers can outperform humans on even some visual tasks is exciting for anyone who wants computers to do things better than humans can. Things like driving.”


By the way, who was the human benchmark who was bested by a machine in the ImageNet Challenge? Andrej Karpathy, who is now Director of AI at Tesla.


Written by: Charles Morris; Source: Medium

*Editor’s Note: EVANNEX, which also sells aftermarket gear for Teslas, has kindly allowed us to share some of its content with our readers, free of charge. Our thanks go out to EVANNEX. Check out the site here.

Categories: Tesla



17 Comments on "Despite Adversity And Impeding Issues, Tesla Autopilot Will Prevail"


It’s interesting how the article expresses a concern that Tesla is, or may be, falling behind in autonomous driving – considering that everyone who first rides in my AP1 car is still experiencing any level of lane-centering driving for the first time. Point being, it’s still SO new that I’m not sure the differences between current Autopilot, Pro Pilot, and Super Cruise really matter at this point, or will for a number of years. Simply grasping a steering wheel that steers without input, regardless of the brand, seems like something it will take years for the masses to become accustomed to.

Just one man’s opinion..

AP1… is early Mobileye tech. The improved version of Mobileye’s system is now available on many cars (over 200 models, according to Mobileye’s website), including the Jaguar I-Pace and, among EVs, the Leaf.

Super Cruise is really the gold standard at this point, as it is not beta and offers a truly hands-off experience.

“As far as I’m aware, no affordable autonomy-grade lidar product has yet been announced. It looks like that is still years away.”

The Audi A8 comes equipped with lidar and is already on the road today.
Its lidar is made by Valeo.

Audi is indeed first with LIDAR in a consumer car. But the Valeo unit only has 4 lines of resolution, I believe, and is forward-facing. Most self-driving LIDARs have 64 lines with a 360-degree view. Waymo has 5 LIDARs, iirc.

We’re still 18+ months away from fully functional LIDARs that are small and cheap enough for consumer cars, IMHO.

I believe the main problem is the lack of binocular vision. It requires two or more cameras looking at the same scene, using edge-detection software to triangulate and calculate the distance to an object. Cameras are much higher resolution than LIDAR, and therefore, in my opinion, a superior option. But only if Tesla installs more cameras.
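For what it’s worth, the triangulation described above reduces to a one-line formula for a rectified two-camera pair: depth = focal length × baseline / disparity. A minimal sketch – the focal length and baseline below are illustrative numbers, not Tesla camera specs:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a matched point seen by a rectified stereo camera pair.

    focal_px     - focal length of the cameras, in pixels
    baseline_m   - distance between the two camera centers, in meters
    disparity_px - horizontal shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("zero/negative disparity: point at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 1000 px focal length, 12 cm baseline, 10 px
# disparity gives a depth of about 12 m. Halving the disparity doubles
# the estimated distance, which is why wider baselines help at range.
d = stereo_depth(1000.0, 0.12, 10.0)
```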

People – like me – can drive without stereoscopic vision, so it’s not absolutely necessary.

However, I think the future lies not in mimicking human vision, but in using the most suitable technology (which humans do not have).

You can move your head if necessary to get a better idea of distance. You will never convince me that one eye is better than two.

@Magnus, a little experiment: using just one eye, reach out with your hand to a point several inches outside your monitor. Extend one finger towards the monitor. Without moving your head, bring your hand in and touch the top corner of your monitor. When I tried to do this, I missed the first time.

Tesla installs 8 cameras on all of its vehicles. I have my doubts whether that will solve the issue. Don’t be surprised if we hear another “humans are underrated” with regard to Tesla’s current solution to autonomous driving.

The main problem is the software. Tesla is FAR behind there, and their lack of LIDAR means they must solve a much harder problem.

The problem with current camera sensor technology is dynamic range (i.e., the range from dark to bright that the sensor can handle). You can see this for yourself by riding around in a car, either at night or during a sunny day on a tree-covered road or some other very high-contrast scenario, while recording video on the best camera you can get your hands on. When you play the video back, it will be missing either bright objects (such as a white truck against a bright background) or dark ones (a pedestrian in the shadow of a tree). Software can’t process data that isn’t there.

The best sensors available have about 12 stops (i.e., a 2^12 contrast ratio) – far less dynamic range than the human eye, which has about 25 stops but still requires us to wear sunglasses in some cases. And the dynamic range of sensors has only improved by about 2 stops (a factor of 4) over the last 10+ years. There is a solution other than waiting (probably a very long time) for sensor technology to improve: either use paired cameras to cover the range, or increase the sample rate pulled off the sensor and combine images to improve…
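The “stops” arithmetic in the comment above is easy to check – each stop doubles the representable contrast. A quick sketch using the commenter’s approximate figures (not measured sensor specs):

```python
def stops_to_ratio(stops: int) -> int:
    """'Stops' are a base-2 measure: n stops = a 2**n contrast ratio."""
    return 2 ** stops

sensor = stops_to_ratio(12)   # best current sensors: 4,096:1 contrast
eye = stops_to_ratio(25)      # human eye (approx.): ~33.5 million:1
shortfall = eye // sensor     # the eye spans 2**13 (~8,192x) more contrast

# The "paired cameras" workaround: two 12-stop exposures offset by
# 6 stops overlap in the middle and together span 12 + 6 = 18 stops.
paired_stops = 12 + 6
```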

Trent Eady writes on Seeking Alpha. He is a fervent Tesla bull, but neither an expert in autonomous driving nor in AI.

It’s a myth that Tesla is in a leading position with regard to Autopilot.
They are not even among the top 10, and they are falling further and further behind (not least because they are constantly losing key personnel in this field).

They stick to the story of lidar not being necessary because, if they conceded that lidar is necessary, they would face a huge problem with all the cars sold as allegedly equipped with the hardware necessary for full autonomy. The rest of the world believes lidar will be key. There is no doubt that lidar provides much more safety, and the technology is becoming cheaper very fast (time-of-flight cameras combining optical images with lidar).

I don’t agree with the requirement for LIDAR, but I certainly agree about the bias in the article. He provides no proof or solution, only a clear faith that Tesla will solve the problem.

Tesla’s expectation that drivers will use “autopilot” with hands on the wheel is basically absurd – it goes against everything we know of human nature.

A wonderful company deploying wonderful technology but this denial of human nature is a puzzling aberration from an outfit otherwise driven by facts.

“Is Tesla in danger of falling behind in the self-driving race?”

No. Tesla is already FAR, FAR behind. Waymo is moving passengers around Phoenix with nobody in the driver’s seat. GM/Cruise is about to follow suit. Meanwhile, Tesla blindly follows lane lines into stationary objects such as road dividers and parked fire trucks.

Tesla has even fallen behind in consumer vehicles. The Audi A8 mentioned above does not require any driver involvement in stop-and-go highway traffic. Tesla can also handle that situation pretty well, but not reliably enough to allow the driver to disengage.

So I work at a medical imaging company using deep learning for image segmentation for diagnostics. I am constantly amazed at how deep neural networks learn and make predictions that are already more accurate (higher sensitivity and specificity) than even expert clinicians. Humans drive visually, and computers are rapidly approaching the point where they can see as well as, or better than, humans.

I have a 3 year old son, and he will never need to learn how to drive.

According to several sources (google it), the cost of lidar has come down from $70,000 to $90 and continues to improve.
Lidar seems less suitable for depth detection in rain/snow. I’m curious whether a combination of sensors is a better way forward, or whether it won’t matter as software and cameras improve.