Test Shows Why A Tesla On Autopilot Might Crash Into Parked Car

JUN 18 2018 BY STEVEN LOVEDAY

Is Tesla Autopilot technology the real problem or is it driver complacency, which is accelerated by semi-autonomous systems?

Tesla has been in the news a few times lately for its Autopilot system’s failure to “see” and stop for stationary cars … well, fire trucks, actually. Some assert that this is a problem specific to Tesla’s semi-autonomous driving system and automatic emergency braking features. Others argue that the system – like similar technology from just about any other automaker – is not designed to stop for stationary objects. Finally, others believe that the semi-autonomous tech gives drivers a false sense of security.

If the car you’re following suddenly veers out of your lane, chances are most braking systems are not going to stop if a parked car is revealed immediately in your path. The same may also be true of a human driver. While a computer may have an exponentially quicker response time, these systems just aren’t ready to handle this type of situation.

One could argue that a human driver may have noticed the stopped car prior to the lead car diverting. One could also argue that if Autopilot (or any automatic system) wasn’t engaged, the driver may have been inherently more attentive.

According to the study, the problem with Tesla Autopilot – and virtually all current semi-autonomous driving technology – is that people become too comfortable with it. The test shown above is fairly basic. The test car appropriately follows the lead car, the lead car veers out of the lane to avoid a stopped car, and the Tesla slams into the stopped vehicle.

Sadly, the study fails to clarify details about the technology itself. How does it work? Why doesn’t it “see” the stopped vehicle? How does Tesla Autopilot compare to other similar systems on the market? Instead, the test is merely shown to reiterate the fact that this technology is not to be trusted and that drivers must remain aware and in control of their vehicles.

Jalopnik shared reporter Edward Niedermeyer’s similar assessment of the situation:

Jalopnik also reached out to Tesla, and, of course, the automaker agrees with the study. The company spokesperson went so far as to say that Tesla drivers know that if they don’t pay attention, they will crash. Apparently, not all Tesla owners got the memo, but the point is that they are made aware repeatedly. The spokesperson concluded (via Jalopnik):

Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents and the issues described by Thatcham won’t be a problem for drivers using Autopilot correctly.

 

The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of. When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times. This is designed to prevent driver misuse, and is among the strongest driver-misuse safeguards of any kind on the road today.

Video Description via Thatcham Research on YouTube:

Leading car safety expert Matthew Avery from Thatcham Research demonstrates what can happen when a driver becomes convinced that a current road car is capable of driving autonomously.

Let us know what you think in the comment section below or by starting a new thread on our Forum.

Source: Jalopnik

Categories: Crashed EVs, Tesla, Videos



80 Comments on "Test Shows Why A Tesla On Autopilot Might Crash Into Parked Car"

Al

If Tesla agrees with the findings, then it should stop advertising and selling the “Full Self Driving” option!

Mark.ca

“Full Self Driving”
What’s up with the quotes, man? You use them when you’re quoting someone else’s words. When did Tesla say any of their cars was fully autonomous?

Pushmi-Pullyu

That’s not what he said. Tesla is indeed selling a “Full Self Driving” option, altho it’s not claiming that it can be implemented yet. And he is right to question this, because I don’t think the hardware installed in current Tesla cars is adequate for Full Self Driving. I think that will require the addition of lidar scanners, or high-res radar arrays, or both.

scottf200

Do you realize that the “Full Self Driving” option is not active yet? Tesla site: “It is not possible to know exactly when each element of the functionality described above will be available”. It’s cheaper if you buy it in advance. Looking forward to a small subset of its features being activated this fall. Perhaps traffic lights. With the latest firmware, stopping for cars that are *already* stopped at stoplights has gotten dramatically better. Same with vehicles (pedestrians) crossing your path at 90 degrees or so.

Scott Franco

“If Tesla agrees with the findings, then it should stop advertising and selling “Full Self Driving” option!”

1. This “test” is an unscientific test performed by a TV production outfit.
2. Tesla NEVER sold a “Full Self Driving” option, nor made it available to the public.
3. You’re an idiot, but I suppose that goes without saying.

Another Euro point of view

-1 for ad hominem attack.

Will

No, Thatcham Research are a non-profit organisation that specialise in vehicle safety, security and crash protection. When I was in the market for security tagging for my motorcycle the police even recommended Thatcham Approved products which carry great weight in the UK for automotive security.

They are not a TV production outfit so please refrain from making up gibberish about that which you haven’t the faintest clue. It’s ironic you then call someone an idiot in that same reply as you spout this tripe.

Pushmi-Pullyu

“Tesla NEVER sold a “Full Self Driving” option, nor made it available to the public.”

This is factually incorrect. Tesla does sell a “Full Self Driving” option for $3000. Tesla is not claiming it can be implemented at present. Buying the option is basically paying now for a promised development later.

https://forums.tesla.com/forum/forums/full-self-driving-option-what-does-it-do-now

wavelet
Scott, it’s not like you to be unaware of easily-checkable facts. Like Pushmi says above me, Tesla is most certainly selling FSD as an extra-cost option, with the caution that it has no idea when it’ll be available. Tesla is also on record saying that it will upgrade the HW needed if necessary. However: 1. There’s no reason to think full L5 autonomy will be available in any developed jurisdiction on planet Earth anytime in the next 20-30 years. Systems that can do it pretty much need to solve the “hard full AI” problem, and as a former researcher in the field in academia, I can state with confidence no one has any idea what will be needed to do so from the SW PoV. And simply using neural networks won’t do it. 2. The consensus view of everyone working on autonomous driving outside Tesla (which is doing little actual work) is that it requires LIDAR sensors, which currently cost multi-$10K per car. Of course, there are several companies working on reducing costs, but there’s no reason to think it’ll get to the multi-$100/car point anytime in the next decade. 3. As a result, from my PoV, Tesla’s selling the… Read more »
Will

Tesla have never advertised their vehicles as fully self driving – and particularly not with the quotes as you suggest. Making stuff up as you go along doesn’t make you right.

S'toon

You are aware that the full self driving option is a future capability, not something that’s currently implemented, aren’t you?

Rafael Sabatini

https://youtu.be/JT5-Z2MswgU?t=1m45s

Yep, everybody knows that

Unplugged

You are conflating two different options. The Autopilot option is currently enabled and ready for use, while the “Full Self Driving” option is not enabled. Perhaps you have a distorted view of what “advertising” means. Selling a product for future use is not “advertising.” So Tesla is not “advertising” the Full Self Driving option. It will eventually be available, but is NOT yet enabled.

wavelet

What do you mean? Tesla is offering FSD for people to buy as an extra-cost option on top of AP, although it has no idea when the SW will be working or when legal jurisdictions will allow it (I’m certain no developed country will allow it in the next 20 years).

dan

It’s too bad that Tesla is not incorporating research from a vast field of biologically inspired computer vision that is many decades old. Humans have built-in reflexes like the looming reflex that automatically makes us close our eyes, look away and crouch when an object suddenly increases in size (for example, a fist coming towards our face). If the body had to wait for the higher-level brain to figure out what it was, there is a good chance that our brain would already be eaten or turned to pulp. Most human drivers’ brains have found ways to link those reflexes to swerve/brake circuits – we will likely do that while driving even before our higher-level cortex recognizes the shape as a parked car.

While such reflexes are not always accurate, they err towards safety so that the animal can live another day. That is something that the AI researchers, with their heritage in statistical thinking, don’t seem to get. This is one field where the rare, one-off statistical outlier scenario cannot be ignored.
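Purely as an illustration of the looming idea described above (this is not Tesla’s code; the frame interval, threshold and function names are made up), a reflex-style check can be driven by how fast an object’s apparent size grows, before anything recognizes what the object is:

```python
# Minimal sketch of a "looming" (time-to-contact) check, as described above.
# Not Tesla's implementation; names, frame rate and threshold are illustrative.

def time_to_contact(apparent_size_prev, apparent_size_now, dt):
    """Estimate time-to-contact (seconds) from the growth of an object's
    apparent size between two frames: tau ~= size / (d size / dt)."""
    growth_rate = (apparent_size_now - apparent_size_prev) / dt
    if growth_rate <= 0:
        return float("inf")  # not looming: object is shrinking or constant
    return apparent_size_now / growth_rate

def looming_brake_request(sizes, dt=1 / 30, tau_threshold=2.0):
    """Return True if any consecutive pair of frames implies a
    time-to-contact below the threshold (a reflex-style trigger)."""
    for prev, now in zip(sizes, sizes[1:]):
        if time_to_contact(prev, now, dt) < tau_threshold:
            return True
    return False

# Example: an object whose apparent width keeps growing frame over frame
# produces a short time-to-contact and triggers the "reflex".
print(looming_brake_request([40, 44, 50, 58, 70]))  # True
```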

Scott Franco

Tesla does not use computer vision (at this time) for EAP.

Rafael Sabatini

And, that’s the question: “Why not?”

Because if you want to be better than a human driver you need to handle this scenario. As thousands of humans successfully do every day.

Pushmi-Pullyu

First of all, what is described above as “biologically inspired computer vision that is many decades old” is hopelessly inadequate for the task. Optically-based object recognition software has indeed been in development for decades, and the lack of dependability shows how inadequate current hardware and software is, as compared to the highly complex and highly sophisticated “wetware” in the human brain’s visual cortex. It’s much too unreliable for human lives to depend on its operation.

Secondly, it would be a mistake for developers of self-driving cars to attempt to slavishly re-create how humans see and interact with the environment. For example, our eyes can’t see in the dark — and neither can the video cameras installed in cars. Why should self-driving cars be limited to only being able to “see” in well-lit conditions, when lidar and high-res radar are available, are much more accurate and much simpler for “painting” a picture of the environment around the car, and don’t need daylight to “see”?

dan

Concepts like looming and optic flow are just as applicable to Lidar, Radar, or FLIR. The only difference is the spectrum of incoming radiation; the algorithms used for signal processing are nearly identical.

Also, just because a field has many decades of work behind it doesn’t mean it is dead. It means that cutting edge research is building on preexisting advances.

Another Euro point of view

The thing is, this kind of video should have come from Tesla a long time ago, as part of an effort to make Tesla owners aware of the system’s limits and not over-rely on it in the wrong situations. The fact that an outside source has to do this job is sad and worrying.

Also, the problem is a system that works in 99% of situations and can kill you in the remaining 1%. Asking people to keep full attention at all times in such circumstances is unrealistic and thus hypocritical.

bro1999

“Also, the problem is a system that works in 99% of situations and can kill you in the remaining 1%.”
Yep, which makes Tesla’s (and their diehards’) constant harping about the NHTSA saying cars with AP were 40% safer than non-AP cars total crap. Luckily the NHTSA finally came out with a statement backing off the 40% claim.
If a car’s tech reduces accidents to some degree, but potentially INCREASES the odds of a fatal accident, is it really safer than a car without said tech?
Tesla said they would be publishing quarterly AP reports. Hopefully they don’t just include info that casts AP in the best light.

Rafael Sabatini

It’s weird that Tesla continue to say that, too.
A bunch of cars with far larger installed bases in the US (this is according to ACTUAL IIHS data) have had ZERO fatalities in the time period in which Teslas have had over a dozen.

So, Tesla is easily provably NOT safer than, say, an Audi A4 or a Lexus RX350.
Neither of which have AutoPilot.

u_serious?

Tesla has dumber drivers.

Pushmi-Pullyu

Hey, nice use of the “bait and switch” false argument, Mr. Troll. Statistics show an overall lower accident rate for using Autopilot+AutoSteer, and you’re focusing only on the very much lower fatality rate.

And of course you can find models of cars which are made in such small numbers that no fatal accident has yet occurred in them. This is an example of why statistics based on small sample sizes are not reliable.

Will

While I don’t disagree with you, I think Americans put too much weight in the NHTSA. It’s nowhere near as in-depth as some other safety/testing bodies like Euro NCAP. Way, way less thorough.

Pushmi-Pullyu

“…Tesla’s (and their diehards) constant harping about the NHTSA saying cars with AP were 40% safer than non-AP cars was total crap. Luckily the NHTSA finally came out with a statement backing off the 40% claim.”

It’s your statement which is “total crap”.

The statistics which the NHTSA reported actually show that Tesla Autopilot + AutoSteer reduces the accident rate in Tesla cars by significantly more than 40%.

There has been an absolute failure of critical thinking by several reporters writing on the subject, all repeating the same mistake. And of course serial Tesla Hater cultists like you keep gleefully repeating that brain-dead failed analysis!

If you apply logic, it’s not that difficult to follow: If, as the NHTSA reported, merely installing AutoSteer in Tesla cars reduced the accident rate of Tesla cars by ~40%, whether or not AutoSteer was actually turned on at the time, then AutoSteer must actually reduce the accident rate by significantly more than 40% when it is turned on, to counterbalance those accidents which occur with AutoSteer turned off!

http://bgr.com/2017/01/19/tesla-autopilot-crash-safety-statistics-report-nhtsa/
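To make the arithmetic behind that argument concrete, here is a small illustrative calculation; the share of miles driven with AutoSteer engaged is an assumption, not a figure from the NHTSA report:

```python
# Illustrative arithmetic for the argument above. The engaged-mileage fraction
# is an assumption; only the ~40% fleet-wide drop comes from the report being
# discussed, and the off-rate is assumed unchanged from baseline.

baseline_rate = 1.0          # accidents per unit distance before AutoSteer (normalized)
overall_rate_after = 0.6     # ~40% lower after installation (reported figure)
engaged_fraction = 0.5       # assumed share of miles driven with AutoSteer on

# overall_rate_after = engaged_fraction * rate_on + (1 - engaged_fraction) * rate_off,
# with rate_off assumed equal to baseline_rate. Solve for rate_on:
rate_on = (overall_rate_after - (1 - engaged_fraction) * baseline_rate) / engaged_fraction

reduction_while_on = 1 - rate_on / baseline_rate
print(f"Implied reduction while engaged: {reduction_while_on:.0%}")  # 80% under these assumptions
```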

S'toon

The lady who crashed her car into the back of the firetruck was texting at the time. What is unreasonable about asking people to not text and drive? Hello? They have ads about it on TV.

Pushmi-Pullyu

“…the problem is a system that works in 99% of situations and can kill you in the remaining 1%.”

Again: No, that’s not at all the problem. It’s not a problem of the systems malfunctioning. The problem is that current semi-self-driving systems, such as Cadillac Super Cruise and Tesla Autopilot/AutoSteer, are not designed to do things that people think they are. Specifically, they are not designed to do things like detecting large stationary obstacles in the path of a semi-self-driving car.

But I absolutely do agree that Tesla (and Cadillac, and other companies) should make a much greater effort to educate the public about the limitations of such systems.

https://arstechnica.com/cars/2018/06/why-emergency-braking-systems-sometimes-hit-parked-cars-and-lane-dividers/

Mark.ca

“this kind of videos should have come from Tesla long time ago”
Absolutely!
They would have saved themselves the headache of having to endlessly explain themselves. They never had full autonomy and never claimed to but they did little to address the confusion among the general public.

Jason

I doubt anything a company does can negate all those youtube videos showing people driving their Tesla inappropriately. If you spent $80k+ on your car and didn’t even check out what it can do properly, then maybe that is survival of the fittest at work?

DL

I would have sworn that a couple of years ago there was a series of articles demonstrating how Tesla’s driving systems could see “around and through” the vehicles in front of you and properly react to the actions of cars in front of them that you couldn’t directly see, but I’ll be damned if I can find those articles again. Am I remembering that wrong? It seemed pretty amazing at the time. In fact, I seem to recall someone claiming that the system detected the exact condition being described here and DID stop the car in time to avoid the accident.

Another Euro point of view

I don’t remember this exactly, but indeed at that time the message going out was certainly not emphasizing the system limitations, and that arguably caused deaths. Now regarding the objects hit while on Autopilot (3 trucks and one concrete barrier, if I’m not mistaken), Tesla got very lucky that those were not compact cars with the back seat full of kids under 10. If that had been the case, the absence of loud and clear communication regarding the system limitations could have put the company in a very bad spot.

Pushmi-Pullyu

Ummm… Tesla not making compact cars with inadequate rear seat air bags and inadequate front crumple zones is about as far away from “got very lucky” as it’s possible to get! Superior design and build, not “luck”, is what has made the fatality rate in Tesla cars so very low.

Brett

He was referring to the objects that the Tesla ran into.

Jason

Good point. That is a feature of the radar. Haven’t checked, but I think it was an AP2.0 feature. When these accidents are reported, I can’t recall them saying what version of AP was being used; maybe that is also a factor.

menorman

I remember seeing them too and just today, another site published a story about what Autopilot “sees”.

https://electrek.co/2018/06/18/what-tesla-autopilot-see-understand/

Will

I’ve seen some Tesla Autopilot videos posted by owners where the car does indeed see ‘through’ vehicles in front. It’s on YouTube, one of the clips. Can’t remember the name. Autopilot compilation or something.

Pushmi-Pullyu

There were reports of Tesla bragging that it could bounce radar beams underneath a car and detect another beyond that; see link below.

However, please note that all cars involved were moving. As I’ve said in earlier comments, the problem is that people don’t understand that automatic braking systems, including Tesla’s, are not designed to react to stationary objects. Not even stopped cars (or fire trucks) in the lane in front of the semi-self-driving car.

The test shown in the video could be a “teachable moment”, and it’s too bad that so few articles — unfortunately including this one — address the actual underlying problem.

https://electrek.co/2016/09/11/elon-musk-autopilot-update-can-now-sees-ahead-of-the-car-in-front-of-you/

DL
Thank you, that led me to the article I was thinking of, partially quoted below. You’ll note that he does specifically refer to a stationary object (the UFO that lands on the highway) and how the radar would be able to see that UFO by looking under and around the car in front of you. It would be interesting to know if this “learned braking” actually occurred in any of the newsworthy crash events that have happened. From Musk’s blog post of 9/11/2016: “…This is where fleet learning comes in handy. Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car computer will then silently compare when it would have braked to the driver action and upload that to the Tesla database. If several cars drive safely past a given radar object, whether Autopilot is turned on or off, then that object is added to the geocoded whitelist. When the data shows that false braking events would be rare, the car will begin mild braking using radar, even if the camera doesn’t notice the object ahead. As the system confidence level rises,… Read more »
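As a rough editorial sketch of the “geocoded whitelist” idea quoted above (this is not Tesla’s implementation; the keying, threshold and numbers are assumptions), the decision might look something like this:

```python
# Rough sketch of the fleet-learned "geocoded whitelist" idea quoted above.
# Not Tesla's implementation; keying, threshold and counts are assumptions.

# Stationary radar returns (overpasses, signs, etc.) that the fleet has
# repeatedly driven past without incident, keyed by rounded GPS coordinates.
whitelist = {
    "37.4275,-122.0697": 57,   # hypothetical overpass location: passed safely 57 times
}

SAFE_PASS_THRESHOLD = 10  # assumed number of safe passes before an entry is trusted

def should_brake_for_stationary_return(lat, lon):
    """Brake (mildly at first) for a stationary radar return unless the fleet
    has already whitelisted that location as a known harmless object."""
    key = f"{lat:.4f},{lon:.4f}"
    return whitelist.get(key, 0) < SAFE_PASS_THRESHOLD

print(should_brake_for_stationary_return(37.4275, -122.0697))  # False: known object, ignored
print(should_brake_for_stationary_return(40.7128, -74.0060))   # True: unknown, brake mildly
```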
darth

Based on the video, it looks like the AP tries to brake as soon as the stopped car is visible to it, which seems like a good thing. They just need to add an “avoid if can’t stop in time and there is space to the right or left to move into” condition.

The fact that the systems are good enough that we are testing edge cases like this shows how far this technology has come in the past 10 years.

dan

I wouldn’t call an obstacle on the road an edge case. That’s the very definition of driving.

Rafael Sabatini

Asimov’s First Law of Robotic Driving: “Thou Shall Not Plunge Headlong Into a Stationary Object Directly In Your Path”

Yeah, it’s why the good lord gave you four-wheel disc brakes. And a pedal. To stop.

Will

The car is hardly braking at all. At that speed it could come to a complete stop in the 2 or so seconds after it detects the obstruction. That’s an emergency stop situation, but it appeared to be braking quite gently.

Pushmi-Pullyu

That’s my assessment, too. The Tesla car isn’t actually trying to stop, because just like every (or almost every) production car with automatic emergency braking, it’s not designed to detect or react to stationary objects. If it’s braking gently, it’s probably because the car in front of it was braking before it suddenly changed lanes.

John

While I understand the critics who believe Tesla should be more accountable for accidents that occur on Autopilot, I believe there is a shared responsibility for operators of their vehicles, too. Tesla should do a better job understanding their own product, and better educate folks using that product. And drivers need to pay attention when using Autopilot. And I’ve heard the argument, “why have Autopilot if you have to constantly pay attention?” I can respect that question, to which I’d say: whether you drive on Autopilot or not, you have to pay attention. Driving using Autopilot requires a different kind of attention, and frankly, it’s not the kind that many folks are used to or necessarily even understand.

Jason

Do people use cruise control (not the adaptive kind)? So do you turn your brain off? You can drive down the freeway at 60 mph on cruise, but as soon as anything gets in the way, or conditions change, you take control. AP should be basically the same situation. Sure, you will get used to it, but you’re still paying attention.
Not sure what it’s like other places, but usually if there is a vehicle stopped in the lane, there are a few other indications before the car in front suddenly changes lanes. Not all the time, but most of the time.

vvk

This test makes my blood boil. Would a human driver be able to stop in this situation? I REALLY doubt it. People get into this exact type of accident ALL THE TIME. Just watch dashcam videos on youtube.

Another Euro point of view

I kind of agree with you, but then again: 1/ a safe distance should be kept between cars; 2/ this video did not address the most problematic aspect of the Autopilot system, which seems to be the false-positive issue that, as I understood it, was responsible for Teslas crashing into trucks (the system considers a false positive more dangerous than disregarding stationary objects altogether).
What I don’t understand is that, while the best experts around all agree that full self-driving cars are far from ready yet and won’t be for at least another 10 years, Elon Musk still insists on declaring every now and then that full self-driving is just around the corner. Being an incorrigible optimist and attention addict is OK as long as it doesn’t kill people.

Pushmi-Pullyu

It’s the same attitude from Musk which has achieved SpaceX booster rockets that can land safely back on their tail after launch. It’s also the same attitude which has steered Tesla from a company that nearly all financial analysts said had very little chance of surviving into the fastest-growing automaker outside China. Those are just two of the things that Elon Musk has accomplished which many people said were impossible, or at least couldn’t be practically achieved. Elon Musk is used to proving he can do what others declare is impossible!

Now in this particular case, I do agree with you: Fully developed self-driving cars are not going to arrive in 5 years or less. I think Elon has been far too optimistic about that. But technological breakthroughs are often made by people who either don’t fully understand the difficulties involved, or else do understand them but refuse to let that stop them!

Who knows? Maybe Elon will prove us both wrong. I hope so! 🙂

Scott Franco

Yes, it’s a “discovered check” type of thing. I have almost been had in a similar situation. However, observing the rule that you never follow more closely than a distance at which you are prepared to come to a full stop will fix this. I also tend to prefer the leftmost lane simply because it gives you an extra “out” in situations like this. You can dive onto the shoulder if need be.

Rafael Sabatini

Humans safely negotiate this scenario thousands of times a day. And, they certainly don’t ACCELERATE into the crash, like Tesla will do as it resumes back up to cruise control speed.

The car could be declared LEGALLY BLIND…

Will

A human driver who is ready for it could probably stop in this situation, yes. The time from when the obstruction became visible to the time of impact was just over 2 seconds – and in that time the car made almost no effort to brake. There are videos online of Autopilot braking and preventing an accident with FAR less time to react than in this video experiment.

I mean, comments like yours annoy me because you keep comparing the car to what a human driver is capable of. The whole bloody reason we invented automated braking systems is PRECISELY because a computer can EASILY react faster than a human can. A person has what, at BEST, a 0.5 s thinking-and-reaction delay, whereas a computer can react in what, 0.001 s or less?

Would a human driver be able to stop in this situation… sheesh. We aren’t adding emergency braking to cars so that they can stop in the same time a human can. We’re adding it so they can stop QUICKER.
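For a rough sense of the numbers in this exchange (the speed, braking level and reaction times below are assumptions, not figures from the Thatcham test), a back-of-envelope check suggests that both a prompt driver and a computer have room to stop:

```python
# Back-of-envelope stopping check for the scenario discussed above.
# Speed, deceleration and reaction times are assumptions, not test data.

speed = 17.8          # m/s, roughly 40 mph (assumed test speed)
decel = 7.8           # m/s^2, roughly 0.8 g of hard braking (assumed)
gap = speed * 2.0     # obstruction visible ~2 s before impact -> ~36 m gap

def stopping_distance(reaction_time):
    """Distance covered during the reaction delay plus the braking distance."""
    return speed * reaction_time + speed ** 2 / (2 * decel)

for label, reaction in [("human (~0.5 s)", 0.5), ("computer (~0.01 s)", 0.01)]:
    d = stopping_distance(reaction)
    print(f"{label}: needs {d:.1f} m, gap is {gap:.1f} m -> {'stops' if d < gap else 'hits'}")
```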

Rafael Sabatini

Agree. The statistics here (and Waymo is working toward this, as is GM) say you need to be at about 8-sigma on anomaly rates. Humans are about 7.

But, to be convincing, any autonomous system needs to be another order of magnitude better statistically. The reason people freak out at this bug is obvious – generally humans will react and brake here. 2 seconds is an eternity. So for Tesla AP to BOTH miss the reaction-time window AND actually accelerate toward the crash is a bad combo.

vvk

> A human driver who is ready for it could probably stop in this situation, yes.

100% of the time?

> I mean, comments like yours annoy me because you keep comparing the car to
> what a human driver is capable of.

I am not even looking for that from Autopilot. When I use Autopilot in my Tesla, I treat it as a dumb system that makes it possible for me to focus while I watch the road. Routine tasks that require no intelligence, like keeping an equal distance between the lane lines and a set distance from the car in front, are handled by Autopilot. It pisses me off that I paid a lot of money for my car and you people are trying to take it away from me.

Edinho

So, why don’t they add to Autopilot a vehicle slowdown when the vehicle ahead changes lanes?

Pushmi-Pullyu

If semi-self-driving cars were programmed to respond to every situation which might possibly be a harbinger of an accident, then they would constantly be slowing drastically or stopping, and people would just shut off the system in frustration.

Remember all the jokes about stopping for a taco truck?

Anyway, that’s not at all the problem here. The problem is that most people — unfortunately including the writer of this article — don’t understand that automatic braking systems — even including Tesla’s — are not designed to detect or react to stationary obstacles. Stationary obstacles including vehicles stopped in the lane the car is driving in!

https://arstechnica.com/cars/2018/06/why-emergency-braking-systems-sometimes-hit-parked-cars-and-lane-dividers/

CU

How is Automatic Emergency Braking tested by test centers and institutes?

This kind of automatic emergency braking implementation is a major failure!

Are other similar implementations this bad?

A well-working AEB is what I would prioritize in a new car, not more or less worthless gimmicks.
Luckily, I have not crashed into the rear of a standstill car on a highway, but it has been close many, many times!

Edinho

I see. But I think that, if the car could slow a bit when the car ahead changes lanes, it would call the driver’s attention and give him more time to brake.

Jason

Makes sense, or at least a delay before accelerating back to the set speed.
Now, what I want to see are AP videos showing what the system is sensing along the sides of the road. People say it can’t sense stopped vehicles, but it sure senses vehicles around it, so there must be some delta in speed that it can sense.

EVShopper

Apparently Tesla doesn’t watch YouTube. Lots of examples of people using their systems incorrectly.

viriato

Tesla plays with the ambiguous meaning of “autopilot”. I don’t like that because the line gets blurred, but they do warn seriously that the system doesn’t allow the car to be driven by itself without human supervision, so every accident is a misuse of the system, and the driver’s fault.

We are at autonomous driving level 2… level 3 in the most advanced cars. This is not enough to let the car go without human supervision; level 4 or 5 is needed for that.

US authorities must require the makers to fit a system like in Europe: if you take your hands off the steering wheel for a few seconds, the car alerts the driver, and if he doesn’t put his hands back on, it disconnects the system and reverts to manual driving.

Pushmi-Pullyu

Ridiculous. That would just mean that everybody would shut off Tesla AutoSteer, and it would no longer be reducing the accident rate by significantly more than 40%, and would no longer be saving lives.

“The thing to keep in mind is that self-driving cars don’t have to be perfect to change the world. They just have to be better than human beings.” — Deepak Ahuja, CFO of Tesla Inc.

Scott Franco

This test is flawed. I looked through the material online and saw no mention of the issue. They didn’t show verification that the Tesla would stop for the “faux” car without the other car suddenly moving out of the way. I.e., if the faux car is stopped in the lane and the Tesla simply encounters it there, would it stop?

Why? Simple. Fabric is transparent to radar. It’s a common issue with fabric-and-wood airplanes.
And Tesla uses radar to establish ranging with other cars in front. The ultrasonic detectors, also on the front, have limited range.

Rafael Sabatini

Is it transparent to “visual spectrum light”, too?

Cause, dude, most people (virtually ALL) will see the car stopped ahead and decelerate. NOT go faster toward the crash like a moth to flame.

menorman

Pretty sure Elon has spent a lot of time touting how cameras will be “enough” for both Autopilot and Full Self-Driving, so this should be nearly irrelevant.

Pushmi-Pullyu

Even if Elon is eventually proven correct to say that relying on cameras is adequate, that doesn’t address the situation under discussion here. Maybe someday Tesla’s self-driving or semi-self-driving cars will rely on camera images to “see” stationary obstacles (altho I doubt it), but that’s certainly not true at this time.

Will

Scott, are you daft? The test revealed a situation where the car failed to react. How is the test flawed when it successfully revealed a safety concern?

This test was specifically conducted to try and create situations and see how autopilot reacted. Sure, they could have made a nice lovely situation the car can easily react to, but what does that prove?

If people only tested things in ‘ideal world’ situations, like you suggest and seem to imply would not be a flawed way to test things, we’d never find deeper faults in stuff, because we’d always be putting stuff through its regular, humdrum, everyday paces.

Pushmi-Pullyu

Scott is absolutely correct to question whether the radar used for automatic emergency braking in pretty much every production car, including Tesla cars, will react to a lightly built mockup of fabric around a wooden frame in the same way it would react to an actual car.

Now, you are also correct to say that this test does point to a flaw in automatic emergency braking systems — not just Tesla’s — but you being right doesn’t mean Scott is wrong.

Will

PS: About fabric or wood not being detected by the sensors: sorry, but that’s a concern, and if they want these things to eventually be autonomous, they need to address that. It’s rare, but I have on one occasion seen a massive slab of wood (which would have been BAD to hit) right in the middle of a 70 mph carriageway. The vehicle in front lane-changed at the last second, so I reacted by going onto the hard shoulder instead of braking or forcing my way into the next lane over.

If we want full autonomy, every single object blocking the road needs to be detected, no matter what it’s made of, how dense it is, etc.

Pushmi-Pullyu

I applaud you for actually using critical thinking, which sadly almost nobody else commenting on this issue is doing. I too wondered if the radar that automatic emergency braking systems use to “see” cars in front of them could detect a lightly built structure of fabric around a wooden frame, or if the radar beams would just pass right through, as they do with styrofoam.

But that’s not actually why the braking system failed in this test. It failed because, just like all or nearly all automatic emergency braking systems in production cars, it’s not designed to detect or react to stationary obstacles. That’s why Tesla cars sometimes run into parked fire trucks and other stationary obstacles.

https://arstechnica.com/cars/2018/06/why-emergency-braking-systems-sometimes-hit-parked-cars-and-lane-dividers/

u_serious?

Articles like these on Tesla AP should really post the instructions on where and when to use AP.

Most people that don’t own one have no clue on how/where/when it’s supposed to be used.

dan

From the looks of it, most people who DO own one seem to have no clue on how it’s supposed to be used. The ones who DON’T own one don’t care.

Pushmi-Pullyu

“From the looks of it, most people who DO own one seem to have no clue on how it’s supposed to be used.”

Sadly, that is true. 🙁

“The ones who DON’T own one don’t care.”

Speak for yourself. I think that every sane adult capable of critical thinking would and should be concerned about semi-self-driving tech, since they will be driving or riding on roads shared with cars which do have that tech!

Jason
One use case is when you’re driving along a highway. That’s this case. The car in front moved to the other lane because of the stationary object. This is a common situation, so this is within the use case of AP and the reason the driver is still the driver and supposed to be paying attention. Now if Tesla can’t see stationary objects (especially if they are the size of a car, or bigger), then wouldn’t you think the logic of the program should be: 1) I’m following this object in front of me; 2) the object in front of me has moved out of this lane; 3) why would the object move out of the lane? 3a) the map shows what lanes there are, I know what lane I’m in, the object moved to an off-ramp lane, that’s OK; 3b) the map shows what lanes there are, I know what lane I’m in, the object moved to a faster lane, that indicates another object in the way of the object I was following; 3c) the map shows what lanes there are, I know what lane I’m in, the object moved off the road, that’s a concern as it indicates there is… Read more »
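As an editorial sketch only (this is not how Autopilot is actually programmed; the lane categories and function name are assumptions), the inference described in that comment could be written roughly as:

```python
# Sketch of the lane-change-inference logic described in the comment above.
# Purely illustrative; the lane categories and naming are assumptions.

def assess_lead_car_lane_change(new_lane_type):
    """Guess why the car being followed left our lane (points 3a-3c above)."""
    if new_lane_type == "off_ramp":
        return "ok"                       # 3a: it simply took an exit
    if new_lane_type == "faster_lane":
        return "possible_obstacle_ahead"  # 3b: it may be passing something stopped ahead
    if new_lane_type == "off_road":
        return "concern"                  # 3c: it left the road entirely
    return "unknown"

# e.g. the lead car darts into the faster lane, as in the Thatcham test:
print(assess_lead_car_lane_change("faster_lane"))  # possible_obstacle_ahead
```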
Pushmi-Pullyu

“According to the study, the problem with Tesla Autopilot – and virtually all current semi-autonomous driving technology – is that people become too comfortable with it.”

This is only a symptom of the problem, not the actual cause. The problem is that most people, even those reading and posting to InsideEVs, don’t understand how limited automated lane-keeping systems are, and what they are not designed to do.

A relevant quote:

A natural reaction to these incidents is to assume that there must be something seriously wrong with Tesla’s Autopilot system. After all, you might expect that avoiding collisions with large, stationary objects like fire engines and concrete lane dividers would be one of the most basic functions of a car’s automatic emergency braking technology…

As surprising as it might seem, most of the driver-assistance systems on the roads today are simply not designed to prevent a crash in this kind of situation.

Sam Abuelsamid, an industry analyst at Navigant and former automotive engineer, tells Ars [Technica] that it’s “pretty much universal” that “vehicles are programmed to ignore stationary objects at higher speeds.”

Full article: “Why emergency braking systems sometimes hit parked cars and lane dividers”

https://arstechnica.com/cars/2018/06/why-emergency-braking-systems-sometimes-hit-parked-cars-and-lane-dividers/
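A minimal sketch of the kind of filtering the quote describes (a generic illustration, not any manufacturer’s code; the thresholds are assumptions): at speed, a radar return that is stationary relative to the world is commonly discarded as probable roadside clutter rather than treated as a braking target.

```python
# Generic illustration of why radar-based systems ignore stationary returns at
# speed, per the quote above. Not any manufacturer's code; thresholds assumed.

def is_braking_target(ego_speed, closing_speed, high_speed_threshold=20.0):
    """Radar reports closing speed; a return closing at exactly our own speed
    is stationary in the world (overpass, sign, parked car). At highway speed
    such returns are commonly filtered out to avoid constant false braking."""
    object_speed = ego_speed - closing_speed  # stationary object -> ~0 m/s
    if ego_speed > high_speed_threshold and abs(object_speed) < 1.0:
        return False  # discarded as probable roadside clutter
    return True

print(is_braking_target(ego_speed=31.0, closing_speed=31.0))  # False: the parked car is ignored
print(is_braking_target(ego_speed=31.0, closing_speed=12.0))  # True: a slower moving car ahead
```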

Ambulator

Everyone agrees it is happening. The question is whether it is acceptable.

Gordian Hense

The “dummy” is from or in Germany! It is clear with what intention this “TEST” was made. So silly, LOL! The “dummy” is made out of cardboard! So silly! While Mercedes only uses video cams, Tesla uses infrared and video cams. If there is no mass of metal or flesh to recognize, then it is clear why the computer didn’t react. No realistic environment! So this is a fake test to show how good Mercedes is and how bad Tesla is. LOL. That is so silly. LOL.

Jason

They show a BMW that can’t keep within the lanes around a slight deviation. They do not show any other make or model of vehicle doing this test. The Mercedes is just one of the supporting vehicles the Tesla is following; it is not part of the test, and nothing indicates it is part of the test.

Jason

This would have been a great demonstration if they did two things:
1) describe how the dummy car was designed to react the same as a normal car, e.g., using radar-reflective paint or other features;
2) do the same test with all the lane-keep-assist vehicles available.
Can a GM Super Cruise-enabled vehicle manage this test? Can Subaru with its system? Even the recent accident with Uber hitting a cyclist would indicate one of the renowned systems might fail this test.
It would be very interesting to see the AP screen during the test as well. Does the vehicle appear on the dash screen at any point at all?
Of course there are a lot of Tesla AP vehicles out there; how many of the other manufacturers’ vehicles are there with this suite of systems? If it is similarly 200,000+, then that tells us something, compared to if it is only a few thousand. As a population increases, the incidence of anything happening increases.
Of course there are a lot of Tesla AP vehicles out there, how many of the other manufacturers vehicles are there with this suit of system? If it is similar 200,000+ then that tells us something compared to if it is only a few thousand. As a population increases the incident of anything happening increases.