Watch Tesla AP2 Attempt Simulated Recreation Of Uber Crash

MAR 28 2018 BY DOMENICK YONEY 39

Don’t try this at home. Or anywhere.

The recent tragedy in Tempe, Arizona, where an “autonomous” Uber vehicle — a Volvo XC90 PHEV whose native Aptiv ADAS system was allegedly disabled — struck and killed a pedestrian walking their bicycle across a street, has sparked questions for at least one Tesla owner. How would his car react in a similar situation? The video you see above, along with several similar ones, ensued.

Now, what one does with their car is typically their own business, but from our perspective this “experiment” is wrong on many levels. First, the experimenters are basically playing in traffic on public roads. We probably don’t need to tell you how bad an idea this is. Even in a controlled environment, things can go wrong, and accidents and injuries can happen. Staging this stunt on a public road is irresponsible, at best.

Read Also: Watch Updated Autopilot On Tesla Model S Handle Construction Zone

White Tesla hits a box in the road

We probably also don’t need to remind you that Autopilot is not an autonomous system. It’s an advanced driver assistance system (ADAS) with a suite of features meant to aid the driver, not replace them. It is not designed to stop for people standing in the roadway, and most likely will not. As CEO Elon Musk reportedly said during the conference call when version 8.0 software was introduced in September of 2016, “Actually, it should work for something like a moose – because it is quite a big mass, but it may not work for say a small deer. A small deer probably would not trigger braking, but a moose I think would. I’m not 100% sure of that, but I think it would trigger on a moose.”

Sure, it’s nice to know what the system will detect, and what level of “intelligence” owners can expect, but there are other, safer ways to go about it. For instance, one could check out the post published by the company that discusses how the radar, cameras, and fleet learning work together to understand the environment the car is traveling through. It’s a pretty informative read.
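For the curious, one piece of what that post describes — fleet learning used to whitelist locations that produce false radar braking events, like an overhead sign at a dip in the road — can be sketched loosely in a few lines. This is an illustration only; the data structures, names, and thresholds below are invented, not Tesla’s:

```python
# Loose sketch of the fleet-learning whitelist idea: when many cars pass
# a spot where radar reports a stationary return but drivers almost never
# brake, that geocoded location gets whitelisted as a known false positive.
# All names and thresholds here are invented for illustration.

whitelist = set()   # geocoded locations of known harmless radar returns
sightings = {}      # location -> (times_seen, times_driver_braked)

def report(location, driver_braked, min_seen=50, max_brake_ratio=0.01):
    seen, braked = sightings.get(location, (0, 0))
    seen, braked = seen + 1, braked + (1 if driver_braked else 0)
    sightings[location] = (seen, braked)
    if seen >= min_seen and braked / seen <= max_brake_ratio:
        whitelist.add(location)  # fleet consensus: harmless stationary return
```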

So, if you’ve watched the video, you can see that the system will generally not detect a floor lamp with a bit of fabric hanging off of it. If you have read about how the system functions, then you will understand why this is the case.

For its part, Tesla released a statement to Electrek back when a similar experiment was performed with an actual person standing in the road. We’ll leave it below, along with a few pertinent warnings and notifications about the safety systems that can be found in the owner’s manual, for your edification.

Safety is a top priority at Tesla, and anyone attempting to purposefully strike another person or object with their Tesla is misusing the vehicle. It is paramount that our customers exercise safe behavior when using our vehicles, including remaining alert and ready to resume control at all times when using the car’s autonomous features, and braking to avoid a collision.

 More information on Automatic Emergency Braking:

 Model S and Model X are equipped with Automatic Emergency Braking (AEB), which is designed to engage the brakes at the last possible moment to avoid or mitigate a collision. AEB does not engage when an alternative collision avoidance strategy (e.g., driver steering) remains viable. Instead, when a collision threat is detected, forward collision warning alerts the driver to encourage them to take appropriate evasive action. AEB is a fallback safety feature that operates by design only at high levels of severity and should not be tested with live subjects.

  • Traffic-Aware Cruise Control does not eliminate the need to watch the road in front of you and to apply the brakes when needed.
  • Warning: Traffic-Aware Cruise Control is designed for your driving comfort and convenience and is not a collision warning or avoidance system. It is your responsibility to stay alert, drive safely, and be in control of the vehicle at all times. Never depend on Traffic-Aware Cruise Control to adequately slow down Model S. Always watch the road in front of you and be prepared to take corrective action at all times. Failure to do so can result in serious injury or death.
  • Warning: Forward Collision Warning is for guidance purposes only and is not a substitute for attentive driving and sound judgment. Keep your eyes on the road when driving and never depend on Forward Collision Warning to warn you of a potential collision…
  • Warning: Automatic Emergency Braking is designed to support the driver in emergency situations only. Several factors can affect the performance of Automatic Emergency Braking, causing either no braking or inappropriate or untimely braking. It is the driver’s responsibility to drive safely and remain in control of the vehicle at all times. Never depend on Automatic Emergency Braking to avoid or reduce the impact of a collision.
  • Warning: Automatic Emergency Braking is not designed to prevent a collision. At best, it can minimize the impact of a frontal collision by attempting to reduce your driving speed. Depending on Automatic Emergency Braking to avoid a collision can result in serious injury or death.
  • Warning: Automatic Emergency Braking is designed to reduce the severity of an impact. It is not designed to avoid a collision.
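As a rough illustration of the last-resort AEB decision flow Tesla’s statement describes — warn first while steering is still viable, brake only at the last possible moment — here is a minimal sketch. Every name and threshold is hypothetical; this is not Tesla’s actual logic:

```python
# Hypothetical sketch of a last-resort AEB decision flow.
# Thresholds and names are invented for illustration only.

def aeb_decision(time_to_collision_s, steering_escape_viable,
                 fcw_threshold_s=2.5, aeb_threshold_s=0.8):
    """Pick an action for a detected forward collision threat."""
    if time_to_collision_s > fcw_threshold_s:
        return "no_action"                    # threat still distant
    if steering_escape_viable:
        return "forward_collision_warning"    # driver can still steer clear
    if time_to_collision_s <= aeb_threshold_s:
        return "automatic_emergency_braking"  # last possible moment
    return "forward_collision_warning"        # warn first, brake later
```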

Source: YouTube, Tesla, Electrek


39 Comments on "Watch Tesla AP2 Attempt Simulated Recreation Of Uber Crash"


If you could make money sitting at home on your azz publishing “articles” online, would you not? I mean, yeah, the entire internet sucks now, and is likely irrevocably broken, but c’mon, people gotta get paid.

This “recreation” is scientifically flawed.

I think these systems use infrared (if not, they should), so they should be able to see a person, whereas they might not easily see an object that is probably at ambient temp.

Also, it is static, and the Uber accident involved a moving person.

The Chevy Bolt AV by GM slowed down for a raccoon for goodness sake. Something serious went wrong to miss a person. And Musk thinks his system might stop for a moose? Does not instill confidence.

(⌐■_■) Trollnonymous

“And Musk thinks his system might stop for a moose? Does not instill confidence.”

So you’re comparing AP with an autonomous system???
Not even close!! AP is definitely NOT autonomous and nobody has ever said it was.

And yet, it’s Musk who was claiming a Tesla would make a cross-country drive without human assistance in 2017. And who is still claiming this will happen in 2018 or early 2019, last I saw.

And it’s Tesla who calls it “Autopilot,” which is HIGHLY misleading. And it’s Tesla who has refused to rename it despite Consumer Reports’ call to do so in the name of public safety.

Less hype and more focus on real world results would be a far better thing for Tesla’s credibility over time.

So maybe a little less faux outrage over the comparison to autonomous is in order, since it’s Musk who, by implication and showmanship, has been doing exactly that.

There is no ACC or TACC out there that will detect and stop for a stationary object. They are specifically programmed to ignore stationary objects in order to mitigate false positives.
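If that claim holds, the trade-off is easy to picture with a toy filter over Doppler radar returns. This is a hedged sketch — real ACC stacks are far more involved, and every name below is invented:

```python
# Toy illustration of why ACC/TACC systems might ignore stationary
# returns: a Doppler radar measures speed relative to the host car, so
# a target whose computed ground speed is ~0 (an overhead sign, a
# parked car, a floor lamp) looks just like roadside clutter and gets
# dropped to avoid constant false-positive braking.

def targets_to_track(radar_returns, ego_speed_mps, min_ground_speed_mps=1.0):
    """Keep only returns that appear to be genuinely moving objects."""
    tracked = []
    for r in radar_returns:  # each r is a dict with 'relative_speed_mps'
        # convention here: relative speed = target ground speed - ego speed
        ground_speed = ego_speed_mps + r["relative_speed_mps"]
        if abs(ground_speed) >= min_ground_speed_mps:
            tracked.append(r)  # moving object: track it
        # else: stationary return discarded as probable clutter
    return tracked
```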

Look, another new username troll to join with the Neo-luddites.

Poor Get FUD and his eternal world of paranoia.

Yea. You tell ’em! Stock shorts don’t really exist.

Poor mental MadBro (a fudster troll known to frequently carpet-bomb Tesla threads with his FUD) once again lamely trying to “flip the script”.

Meanwhile, many new usernames keep popping up and repeating the same tired Seeking Shorters serial anti-Tesla FUD that MadBro and other trolls are so fond of here.

The answer to that is simple: it has to know when to stop and when not to stop. There are many cases where poor automatic braking systems stop for a floating bag and end up in a crash from behind.

Something like a raccoon should not trigger the system. Neither should the lamp in the video. The object needs to be big enough and positioned like a human.

> The object needs to be big enough and positioned like a human.

Hilarious. Keep on changing the object till AP discovers it, and then call it a success.

It’s a really twisted mentality if people think it’s acceptable that the system cannot see an object like this on the road and stop.

Governments have to get involved and mandate a series of tests that cars have to pass before they are certified for self-driving.

You need a license to drive, and you have to pass a test, but right now no car company has to meet any qualifications before putting self-driving vehicles on the road, and so far it has resulted in the death of a person.

> big enough and positioned like a human

Like a small, skinny child? Really? The forms a human can take are so varied.

A car or a wall or a firetruck or a big orange street cleaner, etc. are likely at ambient temp, and yet we’d want a Tesla not to crash into them, right?

Or how about any other object in the road?

Until these systems are highly reliable, they shouldn’t be in the hands of customers (or paid minders who pay no attention) to be beta tested on public roads.

(They should be tested on private tracks or roads until they are safe enough to at least see significant objects and attempt to stop or avoid, very consistently).

From what I remember of the firetruck incident, the driver was following another truck. The truck he was following was blocking the view and dodged at the last second. So even though he was paying attention, he couldn’t have dodged in time even if he had been driving. Now, the issue there may have been tailgating, but in that scenario I don’t think anything could have been done without V2V. The issue was human error, not the AP.

Remember, the AP can do stuff, but that doesn’t mean it can perform magic and stop a car from over 60 mph to 0 in a fraction of a second.

There is nothing wrong with driver assist systems

or fully self driving cars which follow proper procedures.

So far, both have proven to be more reliable than humans.

Who has fully self driving cars?

That thing WAS A POLE. A LAMP POLE. LAME TEST.

(⌐■_■) Trollnonymous

Now THIS is something Mythbusters can test!!!!!!

Someone let them borrow your Tesla.
٩(- ̮̮̃-̃)۶

(⌐■_■) Trollnonymous

That was meant for scott.

Says someone who clicks on it!

This is all very disappointing. I thought these automatic emergency braking systems actually worked. Apparently I was wrong. No excuses: if they are serious about designing such a system, it should stop no matter what is placed in front of it.

Autopilot works primarily via radar, so it sees solid objects like cars and trucks, no problem. A stick with a shirt hanging off of it is a little harder.

The point is, though, it’s not yet designed to detect humans.

> The point is, though, it’s not yet designed to detect humans.

That’s fair, and it has to be emphasized to those who enable AP, so that they remain vigilant with it on, which I know Tesla already states.

Those 2 idiots are competing for a Darwin Award.

Another Euro point of view

+1000

Why?

A floor lamp is not a person. Doing that on public roads is insane!! There are likely laws against it.

Who in hell decided that we car drivers need self-driving cars that automatically perform our driving?! I’ve been driving for over 50 yrs & I never ever crashed my car into anything or anyone. It’s not miraculous, it’s simply PAYING ATTENTION!! Stop talking & texting on the damn phone & doing other stupid distracting stuff when driving a car & you won’t need no damn autopilot or any other gizmo to drive safe. Come on, people: turn off your phones & put your dog in the back seat (not on your lap), stop shaving & putting on makeup at 50 mph on surface streets & don’t drink & smoke dope.

Another Euro point of view

Do you often drive in traffic jams? I find such a system useful in traffic jams or long-distance highway driving. For the rest, I don’t need it.

In Carl’s world, nothing should change.

A lot of people in our society cannot or don’t want to drive. Too old, cannot afford a car, disability, etc.

Companies like Waymo, with self-driving vehicles, can offer an affordable door-to-door means of transport to those people, much cheaper than taxis and more readily available.

Looks to me like the car performed exactly as promised: it allowed the HUMAN to take corrective action to avoid the collision. Better still, the headlights made Mr. Lampy visible in time to allow the HUMAN to do his job, which is to act as the commanding PILOT of the car.

I counted about a dozen driving offences in that 5 minute clip. Police should track them down and take away their driving licences. They should also be advised to go and see a shrink.

Hazard warning lights being on might have an influence in this test…

So many differences that this “simulation” is not even worth looking at.
– WAY slower speed
– Two-lane, two-way roadway with parking instead of one-way multilane roadway no parking
– “Pedestrian” is not moving

But it failed. Your comments would make sense if the test had succeeded.

A person died, yet this couple seems to be having a lot of fun running this experiment…

So what? If people had run these types of tests in the first place, a person would not be dead.

These people are bringing awareness to Tesla owners that you have to be vigilant when AP is on. Even though Tesla already states that, this is just extra proof of the limits of the AP system right now.

TheWay said:

“Something like a raccoon should not trigger the system.”

A human baby crawling in the road may well be smaller than a raccoon. I’d say fully autonomous cars need to avoid colliding with anything larger than, say, an average-sized cat.

“The object needs to be big enough and positioned like a human.”

Oh, good luck with that! I can just see software programmers trying to include cases of:

1. A child on a tricycle

2. An amputee walking on crutches

3. Someone who has fallen down unconscious in the road

4. Someone wearing a costume which changes the outlines of his/her body

5. A baby inside a stroller

6. Recognizing that a manikin isn’t a human

Yeah, good luck with trying to design software to reliably recognize what is, or isn’t, a human.

Nope, speaking as a computer programmer, it is only reasonable to avoid the entire question of “What does a human being look like?”, and simply require the car to avoid colliding with anything larger than a cat.

As the article explains, we are very far away from cars being designed to detect and stop for, or swerve away from, human-sized objects in the roadway.
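The commenter’s cat-sized rule is trivial to state in code; the hard part is the perception underneath it. A minimal sketch, assuming the perception layer can already report an object’s dimensions (the cutoff value is arbitrary, and every name here is invented):

```python
# Sketch of the rule proposed above: skip "is it a human?" entirely and
# refuse to collide with anything larger than an average-sized cat.
# Assumes the perception layer can report object dimensions, which is
# the genuinely hard part.

CAT_SIZE_M = 0.25  # rough height/length of an average cat; arbitrary cutoff

def must_avoid(obstacle_height_m, obstacle_width_m):
    """True if the detected object is big enough to demand avoidance."""
    return max(obstacle_height_m, obstacle_width_m) >= CAT_SIZE_M
```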

Roger Geyer said:

“A car or a wall or a firetruck or a big orange street cleaner, etc. are likely at ambient temp, and yet we’d want a Tesla not to crash into them, right?”

Right. Trying to use infrared cameras would be foolish. It would also be a complex and expensive system, since the infrared camera itself has to be cooled down to a lower temperature than ambient temperature.

What is needed is active, 360° scanning using high-resolution radar and/or lidar. The reason Musk was talking about something the size of a moose is that current sensors are low-resolution Doppler radars. Those have some utility in detecting large objects which are moving in relation to their surroundings, but nothing else. That is ridiculously far away from the amount of detail that a fully autonomous car will need to build up a reliable, real-time, 3D “map” of its surroundings.

That sort of detailed map, or SLAM*, is what’s going to be needed to prevent autonomous cars from hitting things — including pedestrians. We’re a long, long way from semi-autonomous cars having a good SLAM system. To really understand just how poorly the limited-function radars in current…
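For a sense of what building that kind of real-time map (SLAM: simultaneous localization and mapping) means computationally, here is a minimal 2D occupancy-grid sketch. It is illustrative only; a production SLAM system also estimates the car’s own pose and fuses many sensors over time, and all values below are invented:

```python
# Minimal 2D occupancy-grid sketch of the real-time "map" described
# above. A higher-resolution sensor yields more returns per object, so
# a pedestrian occupies several confident cells instead of one noisy blip.
import math

GRID_SIZE, CELL_M = 200, 0.5   # 100 m x 100 m grid with 0.5 m cells
grid = [[0.0] * GRID_SIZE for _ in range(GRID_SIZE)]  # occupancy belief

def mark_return(range_m, bearing_rad, car_x_m=50.0, car_y_m=50.0):
    """Mark the cell containing a single lidar/radar return as occupied."""
    x = car_x_m + range_m * math.cos(bearing_rad)
    y = car_y_m + range_m * math.sin(bearing_rad)
    col, row = int(x / CELL_M), int(y / CELL_M)
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        grid[row][col] = min(1.0, grid[row][col] + 0.3)  # grow confidence
```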