Tesla's vehicles have reportedly crashed into parked emergency vehicles on several occasions. In at least some of those instances, it appears the Tesla vehicles had Autopilot engaged. The automaker is currently under investigation by NHTSA to figure out what may be going wrong. In the video above, we see how a Tesla with Full Self-Driving (FSD) Beta Version 10 deals with a similar situation.
The toughest challenge for self-driving cars is handling edge cases. Some folks have asked, "What is an edge case?" Basically, it's a situation that's not typical of everyday driving. Such situations can confuse advanced driver-assist systems because there's a good chance the technology hasn't been tested in that exact scenario and thus hasn't learned how to handle it.
According to some beta testers, Tesla's FSD Beta Version 10 can drive the car with no interventions at times. However, at other times, the driver needs to intervene. This is why beta testers, as well as Tesla owners using Autopilot, must remain aware and ready to take control at a moment's notice.
Thankfully, so far, it seems there have been no accidents related to beta testers trying out Tesla's FSD technology on public roads. However, CEO Elon Musk appears a bit concerned this may change going forward. As Tesla rolls the tech out to a much larger group of owners, the chances of an incident rise considerably. For this reason, drivers will first need to prove they're safe enough behind the wheel before getting access to FSD.
Back to the short video clip above. It shows exactly why you can't trust FSD Beta and must remain in control at all times. Weirdly, the technology seems to completely ignore the fact that it's about to drive into the back of a stopped bus, and the driver has to take over. Then, moments later, it avoids the stopped tow truck without issue. Talk about inconsistency. Let us know your thoughts in the comment section below.