Is Tesla Autopilot technology the real problem, or is it driver complacency, which is exacerbated by semi-autonomous systems?

Tesla has been in the news a few times lately for its Autopilot system's failure to "see" and stop for stationary cars ... well, fire trucks, actually. Some assert that this is a problem specific to Tesla's semi-autonomous driving system and automatic emergency braking features. Others argue that the system - like comparable technology from just about every other automaker - is not designed to stop for stationary objects. Still others believe that the semi-autonomous tech gives drivers a false sense of security.

If the car you're following suddenly veers out of your lane, chances are most braking systems are not going to stop if a parked car is revealed immediately in your path. The same may be true of a human driver. While a computer may have a far quicker response time, these systems simply aren't ready to handle this type of situation.
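For a rough sense of why a stopped car revealed at close range can be unavoidable for human and machine alike, here's a back-of-the-envelope sketch. The speeds, reaction times, and deceleration figures below are illustrative assumptions, not numbers from the Thatcham test:

```python
# Rough stopping-distance sketch: how far a vehicle travels between the moment
# a stopped car is revealed and the moment it comes to rest.
# All figures are illustrative assumptions, not data from the Thatcham study.

def stopping_distance_m(speed_mph, reaction_s, decel_ms2):
    """Reaction-phase distance plus braking distance (v^2 / 2a), in meters."""
    v = speed_mph * 0.44704          # mph -> m/s
    reaction_dist = v * reaction_s   # distance covered before braking begins
    braking_dist = v ** 2 / (2 * decel_ms2)
    return reaction_dist + braking_dist

# Assumed scenario: 70 mph on dry pavement (~0.8 g of braking available)
human = stopping_distance_m(70, reaction_s=1.5, decel_ms2=7.8)      # typical human reaction
computer = stopping_distance_m(70, reaction_s=0.2, decel_ms2=7.8)   # near-instant reaction

print(f"Human driver:     ~{human:.0f} m to stop")     # roughly 110 m
print(f"Automated braking: ~{computer:.0f} m to stop")  # roughly 69 m
```

Even with a near-instant reaction, the car still needs tens of meters of clear road to come to a halt, so a stationary vehicle revealed only a couple of car lengths ahead is out of reach for either driver.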

One could argue that a human driver might have noticed the stopped car before the lead car swerved away. One could also argue that if Autopilot (or any similar automated system) weren't engaged, the driver might have been naturally more attentive.

According to the study, the problem with Tesla Autopilot - and virtually all current semi-autonomous driving technology - is that people become too comfortable with it. The test shown in the video is fairly basic: the test car follows the lead car as expected, the lead car veers out of the lane to avoid a stopped car, and the Tesla slams into the stopped vehicle.

Sadly, the study fails to clarify details about the technology itself. How does it work? Why doesn't it "see" the stopped vehicle? How does Tesla Autopilot compare to other similar systems on the market? Instead, the test merely reiterates that this technology is not to be trusted and that drivers must remain aware and in control of their vehicles.

Jalopnik shared reporter Edward Niedermeyer's similar assessment of the situation:

Jalopnik also reached out to Tesla, and, of course, the automaker agreed with the study. The company spokesperson went so far as to say that Tesla users are aware that if they don't pay attention, they will crash. Apparently, not all Tesla owners got the memo, but the point is that they are reminded repeatedly. The spokesperson concluded (via Jalopnik):

Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents and the issues described by Thatcham won’t be a problem for drivers using Autopilot correctly.

The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of. When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times. This is designed to prevent driver misuse, and is among the strongest driver-misuse safeguards of any kind on the road today.

Video Description via Thatcham Research on YouTube:

Leading car safety expert Matthew Avery from Thatcham Research demonstrates what can happen when a driver becomes convinced that a current road car is capable of driving autonomously.

Let us know what you think in the comment section below or by starting a new thread on our Forum.

Source: Jalopnik