Tesla announced its long-awaited Autopilot v8.0 update today - a system that will now rely significantly more on radar (rather than cameras) than its predecessor (full Tesla press release/statement below).
The radar sensors required for this upgrade are, for the most part, already installed on Tesla Model S and Model X Autopilot-enabled vehicles - at least those with a manufacture date of October 2014 or later. Until now, the sensors included in the Autopilot package played second fiddle to the onboard cameras.
Autopilot In Trickier Locations Like The Aberdeen Tunnel Will Now Gain Knowledge From The Entire Tesla Fleet's Historical Use Of The Location
With the Autopilot update, radar becomes the primary control sensor, which the company says will better enable the car to travel in bad weather conditions than a camera alone could achieve.
During the Q&A, Tesla CEO Elon Musk said of the feature:
"Even if you're driving down the road and the visibility is very low and there's a multi-car pile up, the camera can't see it, but the radar would and apply the brakes."
Previously, the company relied less on radar because of the frequency of false positives it detected - something that happened far less often with the camera-based system.
The new radar system has a more detailed point cloud, which Tesla says unlocks access to six times as many radar objects, with a lot more information per object as well. Additionally, the system will now take a radar snapshot every tenth of a second to identify objects, determine whether they are stationary or in motion, and exclude more "spurious reflections".
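To make the snapshot logic concrete, here is a rough sketch of how consecutive radar frames could separate stationary objects from moving ones. This is not Tesla's code - the 0.1-second interval comes from Tesla's description, but the range-based data layout and the tolerance value are our own assumptions:

```python
SNAPSHOT_INTERVAL_S = 0.1  # radar snapshot every tenth of a second (per Tesla)

def classify_object(ranges, ego_speed_mps, tolerance_mps=1.0):
    """Classify a tracked radar object as 'stationary' or 'moving'.

    ranges: distances (meters) to the object from consecutive snapshots.
    A stationary object should close at roughly the car's own speed;
    anything else is moving (or possibly a spurious reflection).
    """
    if len(ranges) < 2:
        return "unknown"  # a single frame can't distinguish motion
    # Rate at which the gap to the object shrinks, in m/s, per frame pair
    closing_speeds = [
        (a - b) / SNAPSHOT_INTERVAL_S
        for a, b in zip(ranges, ranges[1:])
    ]
    avg_closing = sum(closing_speeds) / len(closing_speeds)
    if abs(avg_closing - ego_speed_mps) <= tolerance_mps:
        return "stationary"
    return "moving"

# Example: at 20 m/s, a fixed road sign closes ~2 m per snapshot
print(classify_object([50.0, 48.0, 46.1, 44.0], ego_speed_mps=20.0))
```

Comparing several contiguous frames this way is also what lets the system discard one-off spurious echoes: a real object produces a consistent closing rate across frames, while noise does not.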
The Tesla fleet of cars will also now build a database of non-threatening objects and landmarks - road signs, bridge underpasses and other obstacles that might appear to Autopilot as potential collision threats, but that drew no response from the human driver when encountered - so that Autopilot can respond more appropriately to them in the future.
"Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car computer will then silently compare when it would have braked to the driver action and upload that to the Tesla database. If several cars drive safely past a given radar object, whether Autopilot is turned on or off, then that object is added to the geocoded whitelist."
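The whitelist-building process Tesla describes can be illustrated with a toy model. This is purely our own sketch - the geocode grid size and the "several cars" threshold are assumptions, not Tesla's values:

```python
from collections import defaultdict

SAFE_PASS_THRESHOLD = 5  # assumed: "several cars" before whitelisting

class RadarWhitelist:
    """Toy model of the geocoded whitelist described in Tesla's release.

    Each stationary radar object is keyed by a geocode (here, simply a
    rounded lat/lon pair). Cars report safe pass-bys; once enough cars
    have driven past without incident - Autopilot on or off - the object
    is whitelisted and future braking decisions can ignore it.
    """

    def __init__(self):
        self.safe_passes = defaultdict(int)

    def _key(self, lat, lon):
        # ~11 m grid at 4 decimal places - an assumption for illustration
        return (round(lat, 4), round(lon, 4))

    def report_safe_pass(self, lat, lon):
        self.safe_passes[self._key(lat, lon)] += 1

    def is_whitelisted(self, lat, lon):
        return self.safe_passes[self._key(lat, lon)] >= SAFE_PASS_THRESHOLD

wl = RadarWhitelist()
for _ in range(5):  # five cars drive safely past an overhead road sign
    wl.report_safe_pass(37.4275, -122.1697)
print(wl.is_whitelisted(37.4275, -122.1697))  # True once threshold is met
```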
Updated Autopilot functionality will now be able to achieve "car ahead", two-car tracking
Tesla has also stepped up the punitive action for drivers who do not follow the terms and conditions of operating Autopilot, after some high-profile accidents (sometimes fatal) occurred while the driver was not properly supervising the EV - something Elon Musk said on the call happens more frequently with "veteran" users.
With the Autopilot version 8 release, the auto-steering system will now disengage if the driver ignores repeated warnings (three within an hour) to keep contact with the steering wheel. To re-engage the software, the car must first be pulled off the road and put into park to be reset.
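The new enforcement behavior amounts to a simple lockout state machine. Here is a rough illustration - our own sketch, not Tesla's implementation; only the three-warnings-within-an-hour rule and the park-to-reset requirement come from the release:

```python
from dataclasses import dataclass, field

WARNING_LIMIT = 3   # three ignored warnings...
WINDOW_S = 3600.0   # ...within one hour, per the v8.0 rules

@dataclass
class AutosteerLockout:
    """Toy sketch of the v8.0 hands-on-wheel enforcement: three ignored
    warnings within an hour disable Autosteer until the car is parked."""
    locked: bool = False
    ignored_at: list = field(default_factory=list)  # timestamps, seconds

    def warning_ignored(self, now_s):
        # Keep only warnings that fall inside the rolling one-hour window
        self.ignored_at = [t for t in self.ignored_at if now_s - t < WINDOW_S]
        self.ignored_at.append(now_s)
        if len(self.ignored_at) >= WARNING_LIMIT:
            self.locked = True

    def can_engage(self):
        return not self.locked

    def shifted_to_park(self):
        self.locked = False  # parking the car resets the lockout
        self.ignored_at.clear()

ctrl = AutosteerLockout()
for t in (0, 600, 1200):   # three ignored warnings in 20 minutes
    ctrl.warning_ignored(t)
print(ctrl.can_engage())   # False: Autosteer refuses to re-engage
ctrl.shifted_to_park()
print(ctrl.can_engage())   # True again after the car is put in park
```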
“We're making much more effective use of radar. I am highly confident this will be a substantial improvement,” Musk said on the conference call, adding, "It is quite unequivocal that Autopilot improves safety, and with this update, it improves it even more."
During the call, the Tesla CEO was asked whether the recent fatal crash involving Josh Brown's Model S and a semi-truck, which occurred with Autopilot enabled, would have been prevented by the new radar and warning systems. Musk stated that "we believe it would have".
As always, Musk was careful to state that Autopilot is not a guarantee of 100% safe transportation, 100% of the time.
“I do want to emphasize that this does not mean perfect safety. Perfect safety is really an impossible goal. It is about improving the probability of safety. That’s really all you can accomplish.”
No hard timeline was given for Autopilot 8's roll-out, other than "in the next week or two".
--- Full Tesla Press Release on Autopilot v8.0
Upgrading Autopilot: Seeing the World in Radar
Autopilot 8.0 arrives shortly this September
While there are dozens of small refinements with Version 8 of our software, described in addendum below, the most significant upgrade to Autopilot will be the use of more advanced signal processing to create a picture of the world using the onboard radar. The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but was only meant to be a supplementary sensor to the primary camera and image processing system.
After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar. Photons of that wavelength travel easily through fog, dust, rain and snow, but anything metallic looks like a mirror. The radar can see people, but they appear partially translucent. Something made of wood or painted plastic, though opaque to a person, is almost as transparent as glass to radar.
On the other hand, any metal surface with a dish shape is not only reflective, but also amplifies the reflected signal to many times its actual size. A discarded soda can on the road, with its concave bottom facing towards you can appear to be a large and dangerous obstacle, but you would definitely not want to slam on the brakes to avoid it.
Tesla Model S w/ Autopilot
Therefore, the big problem in using radar to stop the car is avoiding false alarms. Slamming on the brakes is critical if you are about to hit something large and solid, but not if you are merely about to run over a soda can. Having lots of unnecessary braking events would at best be very annoying and at worst cause injury.
The first part of solving that problem is having a more detailed point cloud. Software 8.0 unlocks access to six times as many radar objects with the same hardware with a lot more information per object.
The second part consists of assembling those radar snapshots, which take place every tenth of a second, into a 3D "picture" of the world. It is hard to tell from a single frame whether an object is moving or stationary or to distinguish spurious reflections. By comparing several contiguous frames against vehicle velocity and expected path, the car can tell if something is real and assess the probability of collision.
The third part is a lot more difficult. When the car is approaching an overhead highway road sign positioned on a rise in the road or a bridge where the road dips underneath, this often looks like a collision course. The navigation data and height accuracy of the GPS are not enough to know whether the car will pass under the object or not. By the time the car is close and the road pitch changes, it is too late to brake.
This is where fleet learning comes in handy. Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car computer will then silently compare when it would have braked to the driver action and upload that to the Tesla database. If several cars drive safely past a given radar object, whether Autopilot is turned on or off, then that object is added to the geocoded whitelist.
When the data shows that false braking events would be rare, the car will begin mild braking using radar, even if the camera doesn't notice the object ahead. As the system confidence level rises, the braking force will gradually increase to full strength when it is approximately 99.99% certain of a collision. This may not always prevent a collision entirely, but the impact speed will be dramatically reduced to the point where there are unlikely to be serious injuries to the vehicle occupants.
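(Editor's note: the graduated braking Tesla describes can be sketched as a simple confidence-to-force mapping. This is entirely our own illustration - the force values, the lower threshold and the linear ramp shape are assumptions; only the ~99.99% full-braking figure comes from Tesla.)

```python
MILD_BRAKE_G = 0.1        # assumed mild deceleration at low confidence
FULL_BRAKE_G = 1.0        # assumed full-strength braking
FULL_CONFIDENCE = 0.9999  # "approximately 99.99% certain of a collision"
MIN_CONFIDENCE = 0.9      # assumed floor below which radar doesn't brake

def brake_command(collision_confidence):
    """Map collision confidence to braking force, ramping from mild
    braking up to full strength near 99.99% certainty."""
    if collision_confidence < MIN_CONFIDENCE:
        return 0.0  # not confident enough: leave braking to the camera
    if collision_confidence >= FULL_CONFIDENCE:
        return FULL_BRAKE_G
    # Linear ramp between the two thresholds (the real curve is unknown)
    frac = (collision_confidence - MIN_CONFIDENCE) / (FULL_CONFIDENCE - MIN_CONFIDENCE)
    return MILD_BRAKE_G + frac * (FULL_BRAKE_G - MILD_BRAKE_G)

print(brake_command(0.5))     # 0.0 - below the floor, no radar braking
print(brake_command(0.9999))  # 1.0 - full braking near certainty
```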
The net effect of this, combined with the fact that radar sees through most visual obscuration, is that the car should almost always hit the brakes correctly even if a UFO were to land on the freeway in zero visibility conditions.
Taking this one step further, a Tesla will also be able to bounce the radar signal under a vehicle in front - using the radar pulse signature and photon time of flight to distinguish the signal - and still brake even when trailing a car that is opaque to both vision and radar. The car in front might hit the UFO in dense fog, but the Tesla will not.
Additional Autopilot Release Notes:
- TACC braking max ramp rate increased and latency reduced by a factor of five
- Now controls for two cars ahead using radar echo, improving cut-out response and reaction time to otherwise-invisible heavy braking events
- Will take highway exit if indicator on (8.0) or if nav system active (8.1). Available in the United States initially
- Car offsets in lane when overtaking a slower vehicle driving close to its lane edge
- Interface alerts are much more prominent, including flashing white border on instrument panel
- Improved cut-in detection using blinker on vehicle ahead
- Reduced likelihood of overtaking in right lane in Europe
- Improved auto lane change availability
- Car will not allow reengagement of Autosteer until parked if user ignores repeated warnings
- Automatic braking will now amplify user braking in emergencies
- In manual mode, alerts driver if about to leave the road and no torque on steering wheel has been detected since Autosteer was deactivated
- With further data gathering, car will activate Autosteer to avoid collision when probability ~100%
- Curve speed adaptation now uses fleet-learned roadway curvature
- Approximately 200 small enhancements that aren't worth a bullet point
Video Bonus: Chris/KmanAuto takes the time to break down the entire Tesla release notes/blog and explain it all in simpler terms