Autopilot isn't perfect.

David Aylor has a job with some pretty interesting perks. Recently, the Insurance Institute for Highway Safety's (IIHS) manager of active safety testing has been putting miles on a Tesla Model S in an attempt to gauge the effectiveness of its Autopilot advanced driver-assistance system (ADAS). He's found that the system has some flaws, though if you've been following coverage of a handful of high-profile crashes involving Tesla vehicles operating on Autopilot, you likely already knew that.

In particular, Aylor points to the system's occasional failure to properly handle road splits. There can be some confusion as to which lane lines to follow, and if drivers aren't paying attention, it can lead to a collision. This appears to have been the case in one of the best-known incidents: the crash that claimed the life of Walter Huang. That incident is outlined in a recent report about autonomous vehicles. Huang's Model X, with no hands detected on the wheel, seems to have steered itself into a highway divider on Highway 101 in Mountain View, California.

Aylor also brings up a similar incident, filmed by a driver in Chicago testing for this very situation not long after the Huang crash. In that video, which we've embedded below, the car doesn't seem to know which set of lines to follow, and the driver has to intervene, braking just in front of the gore point.

These incidents, IIHS says, are evidence of the risk that partially autonomous systems can pose. Although the institute found that Tesla Autopilot can reduce injuries and damage claims, it's not a perfect system, and drivers need to stay alert and be ready to take control if it runs into trouble. And the problem isn't limited to Tesla.

The report notes that vehicles from other automakers equipped with Level 2 driver-assistance systems have also been involved in crashes. Those occurrences, however, haven't made headlines like those involving the Silicon Valley company. For whatever reason, none of those incidents made it into this particular report either.

The report doesn't offer much in the way of analysis, but the reason for the danger seems clear. Drivers, used to a system that works flawlessly a very high percentage of the time, can be caught off guard when it suddenly experiences difficulty. Tesla has addressed this by making the system give more frequent reminders if it detects that a driver's hands aren't on the wheel.

While Tesla owners using Autopilot will always have to pay attention, the system is getting better. As more improvements that will eventually become part of its "full self-driving" (FSD) feature roll out, it remains imperative, perhaps even more so, that drivers keep their eyes on the road and hands on the wheel. It seems reasonable to assume that as situations requiring driver intervention become rarer, drivers will be less prepared for them.

Source: IIHS