On April 29, 2018, motorcyclist Yoshihiro Umeda was killed by a Tesla Model X with its Autopilot feature engaged. Exactly two years later, on April 29, 2020, Umeda’s widow and daughter filed a lawsuit alleging that the crash came as a direct result of flaws in the Autopilot system’s design. 

To be clear, Umeda was not on his motorcycle when he was killed. He had been riding with a group of other motorcyclists when one of them collided with a van. The riders and the van pulled over to the side of the expressway to assess the damage and exchange information. When the Model X struck him, Umeda was a pedestrian, standing among a group of pedestrians and stationary vehicles pulled over at the side of the road.

According to dashcam footage, the vehicle ahead of the Model X suddenly changed lanes to avoid the stopped vehicles and pedestrians. The Model X, rather than detecting the obstacles ahead and likewise avoiding them, accelerated back up to the cruising speed set in its Autopilot settings. It drove straight into the crash scene, killing Umeda.

The driver of the Model X had his hands on the steering wheel, but he was also dozing off behind it. In similar situations in the past, Tesla has been quick to blame drivers for its own system's failures, and the lawsuit specifically states that the plaintiffs expect the same to happen here.

Driver drowsiness and inattentiveness are certainly not new problems, but they're ones that nearly everyone hopes a well-calibrated autonomous vehicle system would address. What makes this case even more troubling is that stopped vehicles and pedestrians at the side of a road are an entirely common occurrence. This wasn't some one-in-a-million situation that Autopilot failed to account for. It's the kind of scene you or I might pass so often that we wouldn't even mention it in casual conversation, unless something about it was remarkable.

Tire blowouts, stalled vehicles, fender benders, and countless other annoying but largely non-injurious vehicular mishaps happen on major roadways every single day. They inevitably result in people pulling over and, usually, getting out of or off of their vehicles. Pedestrians are not motorcycles or other vehicles, but if Tesla's Autopilot programming can't recognize that it needs to avoid hitting them, that's an inexcusable problem.

As the introduction to the official complaint reads, "This case concerns the first Tesla Autopilot-related death involving a pedestrian – Mr. Yoshihiro Umeda, a Japanese citizen and 44-year-old husband and father – and Tesla's accountability for introducing its vehicles and allowing the use of its automated driving technologies that are still in the 'beta-testing' stage of development. However noble the pursuit of increasing driver safety for all may ultimately be, such pursuit cannot continue to be left unchecked and without modern regulations that adequately monitor and ensure the overall safety of automated driver assistance systems. This is especially true where the price to be paid for any technological defects and failures of these systems in real-world driving situations comes at the cost of severe harm, danger, and even death. By not holding developers, like Tesla, who are at the helm of developing such cutting-edge technologies such as Tesla's Autopilot system, it is inevitable that without action, the first Tesla Autopilot-related death involving a pedestrian certainly will not be the last."

Sources: The Register, Bloomberg
