Tesla Autopilot, one of the brand’s most revered features, which lets the vehicle drive with supervised autonomy under certain conditions, may be responsible for far more road accidents than previously estimated. New data reveals disturbing numbers – Tesla Autopilot has been involved in 736 crashes in the US since 2019, 17 of them fatal, with 11 of those deaths occurring since May 2022.
The shocking results were revealed in The Washington Post’s analysis of National Highway Traffic Safety Administration (NHTSA) data. Even though the data doesn’t indicate how many accidents Tesla’s driver-assistance features may have averted, the new crash figures reveal possible pitfalls of autonomous driving, at least at its current stage of development.
The report also suggests that the uptick in accidents could be due to the removal of radar – radio detection and ranging – from Teslas. In 2021, the brand announced that it would rely solely on camera-based vision processing; every Tesla comes with eight external cameras to map its surroundings.
Due to the recent spike in crashes, there are multiple ongoing investigations involving the technology, according to NHTSA. Meanwhile, CEO Elon Musk has repeatedly emphasized the benefits of Autopilot.
The NHTSA data doesn’t capture the exact crash details, however; in some incidents, it is unknown whether Autopilot or FSD was engaged. There are reportedly 800,000 Teslas on US roads equipped with Autopilot, and Tesla is pushing ahead with its further development and wider rollout.
Every Tesla gets standard Autopilot features such as adaptive cruise control, wherein the vehicle matches the speed of the traffic ahead and accelerates or brakes depending on conditions. Standard equipment also includes Autosteer, wherein the vehicle assists with steering within clearly marked lanes.
On top of that, Teslas can be equipped with Enhanced Autopilot, wherein the vehicle navigates roads and changes lanes on its own, among other functions. Buyers can also purchase the Full Self-Driving (FSD) suite, wherein the vehicle can make driving decisions based on traffic sign recognition.
Tesla clearly mentions on its website that the aforementioned features do not make its vehicles completely autonomous:
“The currently enabled Autopilot, Enhanced Autopilot, and Full Self-Driving features require active driver supervision and do not make the vehicle autonomous. Full autonomy will be dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience.”
In February 2023, NHTSA announced that Tesla would recall 360,000 vehicles equipped with the FSD beta due to an increased risk of crashes. There are, however, conflicting reports of Autopilot’s effectiveness: Tesla’s Vehicle Safety Report for Q4 2022 revealed that 35 percent of all Autopilot crashes occur when the vehicle is rear-ended by another vehicle. Moreover, there’s one Autopilot accident every 4.8 million miles driven, as per Tesla.
However, until Tesla releases the data it possesses, it is impossible to verify these claims. For now, the NHTSA data shows that the vast majority of the 807 driver-assistance-related accidents reported since 2021 involved Tesla cars.
What do you think of autonomous driving? Should Tesla incorporate other technologies like radar and lidar instead of solely relying on cameras? Leave your thoughts in the comments.