When you buy a Tesla Model 3 today, it comes with the Autopilot feature; you can pay $6,000 for the Full Self-Driving package. Let’s start with Autopilot. Tesla has another, more appropriate name for it – assisted steering. It keeps the car in its lane and maintains a safe distance (through automatic braking and acceleration) from the car in front of you. Even when Autopilot is not engaged, it works in the background, nudging the car to stay in its lane and automatically braking if it detects that you are about to collide with another vehicle.
I have mixed feelings about Autopilot. It can be liberating. On our trip from Denver to Santa Fe, we used Autopilot 80% of the time, as most of the road was a clearly marked highway.
However, when you drive on a road that doesn’t have a clearly marked median, Autopilot is not to be relied upon and, in fact, can be very dangerous. On two occasions when the road curved and the lane markings were interrupted by an intersection, Autopilot almost took me into oncoming traffic; I had to intervene.
Autopilot is almost ready for prime time
The problem is that the word “almost” should never be used in the same sentence with “Autopilot.” An almost-working Autopilot is like an almost-working airplane – it might lead to deaths. This is why self-driving reliability standards are measured four or five digits to the right of the decimal point. Autopilot has to be reliable 99.99999% of the time. Achieving the 9’s to the right of the decimal point is much more difficult than to the left of it.
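To make those nines concrete, here is a back-of-the-envelope sketch. The fleet size and the notion of a “failed mile” are my own illustrative assumptions, not Tesla figures:

```python
# Illustrative: how many failures a fleet sees at different reliability levels.
# A "failed mile" here means any mile in which a human has to intervene.
FLEET_MILES = 1_000_000_000  # ~1 billion miles, roughly Tesla's scale (assumption)

for reliability in (0.99, 0.9999, 0.9999999):
    failures = FLEET_MILES * (1 - reliability)
    print(f"{reliability:.7%} reliable -> {failures:,.0f} failed miles per billion")
```

Even at 99.9999900% reliability – seven nines – a billion-mile fleet still accumulates about 100 failed miles, which is why every additional nine matters so much.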
In a few years, Autopilot will be very reliable and will save lives. But we are not there yet. Until then, its presence may give people false confidence in the system and may cost lives. This is why Tesla wants you to keep your hands on the steering wheel even when Autopilot is engaged. If the car detects that you have not had your hands on the wheel for 30 seconds, it will start beeping. If you ignore the warning, the car will slow down and stop.
My advice to you: Listen to the car, not the people who created it. Elon Musk has on many occasions (here is one example) driven the car in media interviews without holding his hands on the wheel, implying that Autopilot is infallible. It is not – at least, not yet.
I use Autopilot only in bright daylight, on clearly marked roads and highways, and I keep my hands on the wheel. I use the hands-off approach only when I am stuck in stop-and-go traffic and moving at very slow speeds.
With full self-driving (FSD), the car should function just like an Uber driver, taking you to the destination you enter in the GPS. Well, today’s FSD is anything but that. The car will change lanes on its own if you turn on a blinker. It will exit the highway for you if you have put your destination address in the GPS. Aside from a few other gimmicks that are not yet fully functional, that’s about it for now.
The car industry breaks down the levels of self-driving like this: Level 0 (L0) – the car is completely operated by the driver; L1 – the car assists with a single task, such as adaptive cruise control; L2 – the car partially drives itself, but the driver is there as a fail-safe (this is basically Tesla’s Autopilot, or assisted steering). The scale tops out at L5, full self-driving, in which responsibility is completely removed from the driver.
Today Tesla and Waymo (Google’s autonomous driving unit) are the leaders in self-driving and have taken very different paths. Waymo has several hundred fully autonomous cars roaming the country collecting data so that software and hardware can be improved. So far, Waymo has collected data for 10 million FSD miles, which is an incredible achievement.
Compare this with Tesla’s approach. First, unlike Waymo, which uses LIDAR as the main sensor (think of it as radar shooting out light beams), Tesla uses eight cameras. LIDAR is extremely accurate, but it is bulky and very expensive. Elon Musk is not a big fan of LIDAR; he calls it “a fool’s errand.” Tesla’s cameras are very cheap, and they have long range and high resolution (great for machine learning), but they are not as dependable as LIDAR and don’t do well in extreme weather. This is why Tesla is supplementing the cameras with radar and ultrasonic sensors.
Tesla has over 400,000 vehicles on the road today that are collecting data on semiautonomous driving; thus far, it has compiled data for 1 billion miles. Like Waymo, Tesla is using machine learning to train its self-driving system.
I think FSD is a long-term goal and that we are going to have to settle on semiautonomous driving for a long while before FSD gets good enough to be trusted by society and regulators.
Let me tell you a story
I have a friend who is a radiologist. He works from the basement of his house, where he reads X-rays and provides diagnosis eight hours a day. Each day, he reads about 100 X-rays. About one to two of his daily X-rays are randomly selected and reviewed by his peers to catch mistakes. If you think about it, X-rays are a perfect medium to be run through machine learning. You have a very discrete dataset – humans are usually made up of the same organs. If you take a few million X-rays and mark them up (identifying tumors, etc.), machine learning should be able to do a great job of spotting cancer and other anomalies.
My friend, who is only human, is prone to making mistakes early in the morning, before he has had a few cups of coffee, or at the very end of the day, when he is tired. Computers don’t need coffee, nor do they get tired, so in theory, armed with a proper algorithm, they should make fewer mistakes than humans.
However, our society may not be ready for computers to handle this life-and-death task. At least, not yet. And that is OK, but it doesn’t mean our advances in machine learning should go to waste. Instead of computers replacing radiologists, they should assist them. First, they should peer-review all X-rays, since the incremental cost of this review is zero. They may catch radiologists’ mistakes, and any false positives they produce will generate more data points that make the AI smarter.
Second, computers can assist radiologists in real time by highlighting areas of possible concern. Instead of completely replacing radiologists, they can reduce errors and help radiologists do more. In the process of assisting, they’ll get better, and at some point (years and years in the future) they’ll be able to replace radiologists.
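What “assist, don’t replace” might look like in code is a simple triage loop: the model never diagnoses on its own; it only decides which scans a radiologist should look at twice. The function, threshold, and scores below are all hypothetical:

```python
# Hypothetical decision-support loop. The model only routes scans;
# every scan still ends up in front of a human radiologist.
def triage(scans, model_score, concern_threshold=0.3):
    """Split scans into (flagged, routine) using a model's anomaly score."""
    flagged, routine = [], []
    for scan in scans:
        score = model_score(scan)  # e.g. estimated probability of a tumor, 0..1
        (flagged if score >= concern_threshold else routine).append(scan)
    return flagged, routine

# Toy stand-in for a trained model: the score is carried with the scan.
scans = [("patient_a", 0.05), ("patient_b", 0.72), ("patient_c", 0.31)]
flagged, routine = triage(scans, model_score=lambda s: s[1])
print([name for name, _ in flagged])  # scans the radiologist reviews first
```

The threshold is the interesting design knob: set it low and the computer flags too much, set it high and it stops being a useful second pair of eyes.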
This is the route Tesla is taking. Despite Elon Musk’s promises, FSD and robo-taxis may be quite far in our future. But that doesn’t mean Tesla’s Autopilot cannot help drivers stay in their lanes and maintain a proper distance from the vehicles in front of them. In the meantime, Tesla is collecting millions of miles of data every single day from its growing fleet of cars. For instance, the Autopilot of my Model 3 will not stop the car at a red light or a stop sign, but it will alert me if I am about to run a red light.
However, Tesla is learning those stop signs and red lights in what Musk calls shadow mode: it identifies the good drivers among the 400,000 people who drive its cars (I am pretty sure yours truly did not make the cut) and then compares the decisions the cars would have made with the decisions those good drivers actually made at intersections. Nobody else can do this kind of analysis today, especially at this scale.
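Shadow mode boils down to a disagreement log: the software’s hypothetical decision is computed but never executed, and is only compared against what the good human driver actually did. The sketch below is my illustrative reconstruction of that idea, not Tesla’s actual pipeline:

```python
# Each record: (situation, what the software *would have* done, what the driver did).
events = [
    ("stop_sign_1", "stop",    "stop"),
    ("red_light_2", "proceed", "stop"),  # software would have run the light
    ("stop_sign_3", "stop",    "stop"),
]

# Disagreements are the valuable part: each one is a labeled training example.
disagreements = [(sit, sw, hu) for sit, sw, hu in events if sw != hu]
agreement_rate = 1 - len(disagreements) / len(events)

print(f"agreement: {agreement_rate:.0%}")
for situation, software, human in disagreements:
    print(f"{situation}: software={software}, human={human}  <- training signal")
```

Multiply this loop by 400,000 cars and millions of intersections, and you can see why the fleet itself is the asset.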
Last year, close to 30,000 people died in auto accidents in the U.S., and three or four died while Tesla’s Autopilot mode was engaged. We are used to forgiving human failings, but we don’t cut our computers that sort of slack. This gap between humans and computers means that for FSD to be approved by regulators it has to be an order of magnitude better than humans in any weather and any road conditions. Despite Elon Musk’s optimism, we are far from that goal.
In the meantime, Tesla’s large and growing fleet of cars is constantly collecting data, and with every mile the cars get slightly better. This data and the algorithms that come out of it may turn into Tesla’s largest competitive advantage.
This is just one out of 11 parts of my analysis of Tesla, Elon Musk, and the EV revolution. You can get it as an email series, PDF, EPUB or Kindle ebook.