Tesla Autopilot And Other Braking Systems Are Blind To Parked Fire Trucks

JAN 26 2018 BY STEVEN LOVEDAY

Tesla Model S crashes into a parked fire truck. The driver says Tesla Autopilot was engaged. (Image Credit: Culver City firefighters via Twitter)

It’s not just Tesla Autopilot that fails to see stationary objects.

As we previously reported, a Tesla Model S crashed into the rear of a parked fire truck near Los Angeles this week. According to the driver, the vehicle was in Autopilot mode, a semi-autonomous driving feature that assists the driver under certain conditions (but requires continuous driver engagement and hands on the wheel).

Of course, people were shocked that an advanced vehicle would just fail to notice a huge red truck in its path. Even if Autopilot wasn’t engaged, shouldn’t the car’s standard Automatic Emergency Braking (AEB) kick in? The answer is … not necessarily.

Tesla Autopilot 2.0 is designed to track at least one car ahead, which may lessen this problem. If what's in your path is simply a stationary object, however, you may be out of luck.

The AEB may well have assisted here: the driver was said to have been traveling 65 mph, yet there was no substantial damage and no injuries. AEB's job is not to bring the vehicle to a complete stop prior to a crash, but to slow it down enough to lessen the impact.

The automaker has yet to confirm whether or not Tesla Autopilot was engaged. Tesla also hasn’t specified if AEB kicked in or not. The company does offer the following warning in its manual (via Wired):

“Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”

Let’s think this over for just a minute. If your car hits the brakes every time there’s a piece of debris on the road, or it sees any type of stationary object, this could actually cause an accident, or give the driver whiplash. False positives can be a real problem for such technology. All current automatic emergency braking and adaptive cruise control systems are designed to be “blind” to various fixed objects. If not, the system wouldn’t be able to do its job.

According to Wired, Volvo's Pilot Assist system is much the same. The vehicles' manual explains that not only will the car fail to brake for a stationary object that suddenly appears in its path, it may actually accelerate toward it to regain its set speed:

“Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed. The driver must then intervene and apply the brakes.”

Radar detects the speed of objects. It also returns echoes from medians, road signs, traffic lights, and the like. While it's good at detecting something that's moving, it's poor at telling a parked truck apart from harmless roadside clutter, so programmers deliberately make sure it doesn't react to stationary returns. On balance this is a good thing: it would cause significant havoc if cars on the freeway were constantly slamming on their brakes for no reason.
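To make that concrete, here is a minimal sketch, in Python, of the kind of filtering logic involved. Every number and name in it is invented for illustration; no automaker's production code is anywhere near this simple:

    # Hypothetical sketch: why ACC/AEB radar logic discards stationary returns.
    # A fixed object's Doppler range rate is equal and opposite to our own
    # speed, so it looks exactly like a road sign, median, or overpass.

    EGO_SPEED_MPS = 29.0  # our speed, roughly 65 mph

    def classify_radar_return(range_rate_mps, tolerance_mps=1.0):
        """Label a radar return by its speed over the ground.

        range_rate_mps is negative when the object is closing on us.
        """
        ground_speed = EGO_SPEED_MPS + range_rate_mps
        if abs(ground_speed) < tolerance_mps:
            # Same signature as roadside clutter: filtered out to avoid
            # constant false-positive braking.
            return "stationary: ignored"
        return "moving: tracked and followed"

    # A lead car doing 25 m/s closes on us at 4 m/s -> tracked.
    print(classify_radar_return(-4.0))   # moving: tracked and followed
    # A parked fire truck closes at our full 29 m/s -> filtered.
    print(classify_radar_return(-29.0))  # stationary: ignored

The point is the last line: at highway speed, a parked fire truck and a speed-limit sign produce the same Doppler signature, and a filter like this cannot tell them apart.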

Cruise Automation Chevrolet Bolt EV with lidar

What’s the solution?

Wired explains:

“The long term solution is to combine several sensors, with different abilities, with more computing power. Key amongst them is lidar. These sensors use lasers to build a precise, detailed map of the world around the car, and can easily distinguish between a hub cap and a cop car. The problem is that compared to radar, lidar is a young technology. It’s still very expensive, and isn’t robust enough to survive a life of hitting potholes and getting pelted with rain and snow. Just about everybody working on a fully self-driving system—the kind that doesn’t depend on lazy, inattentive humans for support—plans to use lidar, along with radar and cameras.”
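As a toy illustration of the fusion idea in that quote: any one modality can miss (or deliberately ignore) a stationary truck, but agreement between independent sensors is strong evidence to brake. The sketch below is deliberately oversimplified and is not any vendor's algorithm; real systems fuse probabilistic tracks rather than counting votes:

    # Hypothetical voting sketch: radar alone is filtered for stationary
    # targets (see above), but two independent modalities agreeing can
    # override that filter with far fewer false positives.

    def should_brake_for_object(radar_track, camera_detection, lidar_return):
        """Each argument is True if that sensor reports an obstacle ahead."""
        votes = sum([radar_track, camera_detection, lidar_return])
        return votes >= 2  # require corroboration to keep false alarms low

    print(should_brake_for_object(False, True, True))   # True: camera + lidar agree
    print(should_brake_for_object(False, False, True))  # False: one sensor isn't enough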

“Everybody” doesn’t include Tesla CEO Elon Musk, who has repeatedly made clear his lack of commitment to lidar. Is this because he’s well aware of the drawbacks above?

Lidar is new to automotive use, it doesn't yet hold up well to weather, and it's currently pricey. One would think that once the tech is more thoroughly tested, improved, made more durable, and brought down in cost, Musk would have no grounds to argue against it. For now, though, Tesla won't be moving to lidar. Nonetheless, Tesla vehicles have been spotted out testing with lidar rigs in place.

Lidar or not, you also won't easily find a system in any of today's cars that is sure to stop for a parked fire truck, or even a taco truck. Cruise Automation's autonomous Chevrolet Bolt uses lidar, and while it didn't actually crash into a parked taco truck (albeit at a very slow speed), it was clearly confused by the presence of the stationary obstacle.

Source: Wired

Categories: Crashed EVs, Tesla

60 Comments on "Tesla Autopilot And Other Braking Systems Are Blind To Parked Fire Trucks"

(⌐■_■) Trollnonymous

Meh, I don’t care for any auto pilot anything.
In all cases you still have to be just as alert as if it was not on.

Why the efff even bother unless you're trying to have sex while driving…….lol and that's not SAFE!

TwoVolts

“In all cases you still have to be just as alert as if it was not on.”

Yes – exactly. The driver needs to stay alert and not ‘check out’, and is ultimately responsible for controlling the vehicle. The problem with Autopilot – or any system that is not fully autonomous – is that it encourages the driver to check out to some degree, as drivers learn to trust the car more than they should. I’m convinced that AP is a waste in its current state – other than as a toy to ‘show off’ to your friends. Either the human or the car should have full autonomy – not a flawed mix of human and car.

Steve

Ok, so what's the point of it then? How do you know when it doesn't see something and you need to take over? And when it does see something, what is the reaction time of the car?

People using this are taking a real risk, and I believe a risk beyond that of a person who's aware and driving properly.

Steve

Sorry – Responded to the wrong post…

(⌐■_■) Trollnonymous

IMHO, it gives a false sense of security.
My coworker disables his AEB because of too many false positives.

Pushmi-Pullyu

“I’m convinced that AP is a waste in its current state – other than as a toy to ‘show off’ to your friends. Either the human or the car should have full autonomy – not a flawed mix of human and car.”

While your argument is logically valid, it fails a reality check. The NHTSA says that Tesla cars with Autopilot + AutoSteer installed have a 40% lower accident rate than Tesla cars without those installed.

I think what your argument fails to take into account is that the average driver doesn’t maintain full awareness of the road at all times. People get distracted, and not just by texting on a cell phone. I once rear-ended another car because I was looking for a street I was not familiar with, and I spent too many seconds trying to read a street sign instead of noticing that a car had stopped in my lane to make a left turn.

Computers don’t get distracted.

TwoVolts

I don’t dispute the NHTSA report. However, there are very effective solutions to prevent rear-ending someone that don’t require AP. Until we reach higher levels of and ultimately full autonomy, I maintain that the best combination of driver and car to prevent accidents is one in which the driver is in full control, and the car plays a secondary role by assisting the driver with warnings, automatic braking, etc. Tesla AP flips those roles so that the car is in control and the human becomes the backup in assisting AP. That also can be effective. But when human drivers start to believe that AP can solely handle the driving duties and they check out mentally, redundancy is lost and safety suffers. My worry is that more Tesla drivers are becoming too comfortable with AP, and are not playing the necessary role of vigilant backup.

ModernMarvelFan

Pushmi-Pullyu wrote:”The NHTSA says that Tesla cars with Autopilot + AutoSteer installed have a 40% lower accident rate than Tesla cars without those installed.”

That is not what NHTSA said.

Here is what NHTSA said:

“The data (supplied by Tesla based on airbag deployment) show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation. ”

“Approximately one-third of the subject vehicles accumulated mileage prior to Autopilot installation. The crash rates are for all miles travelled before and after Autopilot installation and are not limited to actual Autopilot use. ”

The actual rates are 1.3 crashes per million miles before Autosteer and 0.8 crashes per million miles after Autosteer.

Based on :
(Figure 11. Crash Rates in MY 2014-16 Tesla Model S and 2016 Model X
vehicles Before and After Autosteer Installation.)
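For reference, the arithmetic behind that headline figure, using the rates above: (1.3 − 0.8) / 1.3 ≈ 0.385, i.e. roughly a 38.5 percent reduction, which is evidently what gets rounded to "almost 40 percent."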

There is NO DIRECT comparison of vehicles before and after Autopilot, only a total/diluted crash rate per miles traveled that mixes both groups of cars.

That rate is ONLY based on airbag deployments, which AEB can reduce by lowering crash speeds.

But it doesn't separate the two groups of vehicles out into a direct comparison.

Here is the official document: https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF

Nix

“The driver needs to stay alert and not ‘check out’”

Too late for that. Every time I drive I’m surrounded by drivers who have ‘checked out’ and they don’t even have auto-pilot. They have cell phones — no autopilot.

Auto-pilot exists to save us from drivers who have already checked out long before there was ever autopilot.

TwoVolts

Is the ‘checking out’ you have seen this bad?

https://youtu.be/qnZHRupjl5E

bro1999

One more thing where GM seems to have the upper hand on Tesla in addition to autonomous driving tech. The Bolt/Ampera-E looks like it would have no issues detecting a stopped car (or firetruck) with its AEB system:

https://www.youtube.com/watch?v=rXLsb54QN5c&feature=youtu.be

Last 30 or so seconds of the video contain the AEB/pedestrian braking tests.

DonC

A low speed system is not the same as a high speed one. The Bolt EV doesn't even have adaptive cruise control. If it did, it would not have stopped for the truck, since ACC only detects changes in speed of moving vehicles. It's counterintuitive, but the form of radar used in ACC doesn't detect stopped objects.

The better contrast would be to Super Cruise used on the Cadillac CT6. AFAIK that’s the most advanced cruise available in a production car. While I’m not sure, my guess is it would have handled this situation and stopped. But that’s a different and more advanced technology.

Steve

I mean, the Tesla did have ACC and it didn’t stop either, kind of a strange comment considering the context.

DonC

Strange? Not really. The point is that the Bolt EV wouldn’t have stopped even if it had Adaptive Cruise Control, which isn’t an option.

As noted below, the question is why the camera based system in the Model S missed the fire engine. This system is supposed to “see” things that the radar doesn’t, including the lines of the road.

So yeah, AP isn’t working.

Pushmi-Pullyu

“One more thing where GM seems to have the upper hand on Tesla in addition to autonomous driving tech. The Bolt/Ampera-E looks like it would have no issues detecting a stopped car (or firetruck) with its AEB system:”

Excuse me, but are you seriously trying to tell us that the Bolt EV will never (or even almost never) get into an accident involving it hitting a parked car or, in fact, any stationary solid object?

Because that’s what you’re claiming.

Hopefully it’s not necessary to point out it’s ridiculously easy to prove this is factually incorrect.

Nix
bro1999, you are so horribly wrong it isn’t even funny. This accident happened at 65 mph. Not a single Bolt has high-speed AEB that operates at that speed. The Bolt only has what GM calls “low speed AEB”. Here is what GM says about the Bolt’s slow-speed AEB: “The system works when driving in a forward gear between 8 km/h (5 mph) and 80 km/h (50 mph). It can detect vehicles up to approximately 60 m (197 ft).”

Read that again slowly. 50 mph. 50 mph max. Anything faster and it simply isn’t even trying.

Even worse, that low-speed AEB isn’t even standard like in the Tesla. It is only available as an extra cost option. And then only in the highest trim. And then only after you buy two additional packages. So the vast majority of Bolts don’t even have ANY AEB AT ALL, and the Bolts that do have AEB wouldn’t have been active at that speed.

You are trying to say that a system that either wouldn’t exist or wouldn’t have been active in the Bolt, is better than a system that very well may have applied the brakes and slowed the vehicle enough that the accident…
Bunny

When the stopped object becomes a school bus filled with children, I don’t think they will call it autopilot anymore.

It's not a matter of if, it's a matter of when.

SansIce

True, but most people would argue that obstruction sensing tech is an overall net positive for safety. Crashes like this get all kinds of press, but what about the crashes where there was no such technology and said technology would have prevented them? What about all of the accidents that were avoided because the technology was in place? That never makes the news. In short, we are seeing a skewed view of how this innovative and soon-to-be-ubiquitous tech fails instead of the countless lives it has saved and will save. It is more fun and satisfying for the haters to say see… it sucks, Tesla sucks, it's not ready for prime time, instead of wow, it isn't perfect, but we are going to be safer due to all of these innovations. I can't wait until all cars have that technology.

Bunny

I agree with you on welcoming the technology.
I just think naming it Autopilot was way over the top and a really stupid decision. It just begs for bad press when something goes amiss, even when it's working as designed. That hurts the whole progress of the industry.

Pushmi-Pullyu

“…we are seeing a skewed view of how this innovative and soon to be ubiquitous tech fails instead of the countless lives it will and has saved.”

Exactly, and thank you.

This is the same stupid argument that opponents of mandatory seat belt laws used, in the days before air bags. “Well, what if you’re trapped in a burning car by a jammed seat belt, and you can’t escape? Wearing a seat belt might kill you!”

Perhaps there have been very rare cases where someone was in an accident severe enough to jam his seat belt but leave him conscious and able to climb out of the car, but the odds are far, far greater that (in a car without air bags) you’ll simply be killed if you’re in a severe accident and you’re not wearing a seat belt.

Wearing seat belts saves lives. So do driver assist safety systems such as ABS and Tesla Autopilot + AutoSteer.

If someone wants to argue that we shouldn’t use these systems despite the fact that they save lives, then they should at least be honest and admit that’s what they are arguing.

David Murray

I disagree. Auto-pilot was perfectly named, considering its origins in helping pilots fly aircraft over long distances. And, just as airbags can sometimes cause unwanted injuries yet the overall benefit of putting them in cars is clearly obvious, the same goes here.

Bunny

But with airplanes you also have TCAS to avoid collisions.

Chacama

Precisely: TCAS warns pilots to take corrective action rather than taking control of the situation itself.

Asak

To be fair, the mass difference is so great that I doubt anyone on the school bus would be seriously hurt. The car is going to end up suffering the vast majority of the forces from the crash.

Six Electrics

A known flaw that Tesla chose to ship with, causing at least one AP death in China.

It's a Known and Publicly Warned about Situation: but if you loan your Tesla to another person, it's on you to tell them.

It is part of the Training program that I have suggested Tesla Could and Should teach in Classes, Night School, etc: "How to use a Tesla, Safely!"

Pushmi-Pullyu

“A known flaw that Tesla chose to ship with, causing least one AP death in China.”

As usual with posts from “Six (Pretend) Electrics”, this is a lie or at best a half-truth.

A family in China sued Tesla, claiming — without evidence — that a driver was killed operating a car under Autopilot (or more precisely, AutoSteer). The family refused to allow Tesla to examine the car to find out if AutoSteer was actually engaged at the time of the accident.

And even if AutoSteer was activated, so what? The driver is warned repeatedly to observe the road and be ready to take over at any time. In fact, Autopilot occasionally tests to see if the driver is paying attention, and if he’s not, it pulls over to the side of the road and stops.

Use of AutoSteer does not remove the responsibility of driving from the driver.

The Woodster

As usual, Six Electrics' brain can ONLY function in autopilot and has no human backup.

DonC

The radar used for adaptive cruise control is not terribly robust. It's a crippled form of Doppler that detects only relative speeds. Hence it won't "see" a stationary object. Contrary to what the article implies, this is a limitation, not a feature. Were it even half functional, there shouldn't be a problem distinguishing between a piece of road debris and a fire truck.

The bigger problem is that AP apparently doesn't work. AP is supposedly a camera based system. Given that viewing conditions were good, if the system worked then the car would have literally "seen" the truck. Note this is not the same as the tractor trailer crash in Florida, where light reflecting off the trailer may have confused the camera. No excuses here.

Pushmi-Pullyu
“The bigger problem is that AP apparently doesn’t work. AP is supposedly a camera based system. Given viewing conditions were good, if the system worked then the car would have literally ‘seen’ the truck.”

Wow! That has so many errors of fact and logic that I hardly know where to start.

1. Cameras don’t literally “see” anything. Nor does the software optical object recognition system used by Autopilot to (unreliably) detect objects. Cameras may work like eyes, but it’s not your eye that “sees” something; it’s your brain with its highly evolved and very complex visual cortex, which interprets visual images. A cortex that is utterly lacking in computers.

2. This has little or nothing to do with “viewing conditions”. This has to do with the fact that current cars, even ones with driver assist features such as “Autopilot”, don’t have the kind of situational awareness that will be needed for full autonomy.

3. Proper situational awareness in a self-driving car will require a LiDAR based SLAM* system. No production car, including Tesla cars, currently uses LiDAR.

*SLAM stands for Simultaneous Localization And Mapping technology, a process whereby a robot or a device can create a map of its surroundings…
TwoVolts

Pu-Pu,

You stated: “The problem is that a lot of people can’t understand that there are varying degrees of autonomy; it’s not an all-or-nothing thing. Just because a Tesla car equipped with Autopilot + AutoSteer can perform lane-keeping reasonably well under most circumstances, and also does reasonably well at following the car ahead, that doesn’t mean it can “see” anything in the way humans (and other animals) see things. Its sensors can detect certain things, and it can (apparently reliably) read traffic signs. But it does not actually “see” the world around it, and potentially that includes a fire truck stopped in your lane.”

You are making my case – although I would dispute that it does these tasks reasonably well. You are perfectly describing a system that is not ready for use by the general public. In other words, it is not ready for prime time. See the link for a compilation of problems – from a single driver – with the system Tesla calls ‘Autopilot’.

https://youtu.be/QyMBTXpNOhg

TwoVolts

Pu-Pu,

We are currently in that uncomfortable middle ground where cars have achieved some autonomy, but full autonomy has not arrived. I believe the experience of the airline industry – which has already been on this path far longer – is most instructive.

https://youtu.be/llEJQguw2Zs

John Doe

Lidar is not a new tech. The military has used it for decades. When I served, it was used in airborne laser scanning of the ground, for mapping and strategic reasons. It was also used in automatic fire stations/sentinels that scanned set sectors for enemy soldiers or vehicles. At the same time, it has to see the difference between a soldier and a moose, so it does not slay all the wildlife passing it in the woods.
Not just to save the wildlife, but to save the ammo for the enemy.

It is also used in some missiles, troop transporters and so on.

New in civilian cars yes, but the technology has been used for decades.

Pushmi-Pullyu

LiDAR is currently being used in test vehicles and prototype self-driving cars, but not — so far as I know — in any production cars.

I think we’ll start seeing the first production cars equipped with LiDAR within the next couple of years or so. Tesla is currently resisting use of LiDAR in their production cars, but I expect that to change.

fasterthanonecanimagine

How about ultrasound? Works for bats.

Pushmi-Pullyu

Tesla cars equipped with Autopilot do use ultrasound detectors, but they are only effective at very short ranges; as I recall, less than 20′.

More sophisticated forms of radar are available, such as high-resolution radar, although I don't think that tech is being used in cars. All around, LiDAR is best, and prices for solid-state LiDAR scanners have recently dropped to where it's reasonable to think automakers can start putting them into mass-produced cars.

Another Euro point of view

This has killed and will kill more people; it is only a question of time. And it is not only Tesla. I recently watched a new Audi A4 test drive on YouTube; the car was equipped with a similar autopilot system along with road-sign reading. It drove on the highway at 130 km/h, then suddenly the silly road-sign detection system spotted a 70 km/h speed limit sign on a small road running parallel to the highway. The result was rather hard braking, and since no one expects a car to drop from 130 km/h to 70 km/h for no reason on the highway, the stupid autopilot could very well have caused an accident. I guess a few casualties are needed "for science". As long as those victims are single male nerds, I take it society will tolerate this rather well; a mother with 2 kids would be another matter as far as public perception is concerned.

TwoVolts

Can you include a link to the Audi test drive video?

terminaltrip421

If it's anything like gun deaths in the U.S., no amount or type of deaths will end its proliferation.

Pushmi-Pullyu

“This has/will kill more people. It is only a question of time.”

So will air bags. Should we stop using air bags?

It’s notable that the only people here actually suggesting that Tesla should stop using Autopilot are those who have established reputations as Tesla Hater cultists: Steve, bro1999, Six Electrics, and you, Another Euro.

Coincidence? Hmmmm… no.

darth

Humans do this with only 2 visible light sensors (cameras). Therefore, it must be possible to do it with only that and the correct software.

That said, it seems like the software has a ways to go yet.

justanotherguy50

Humans are pretty bad drivers, though. That is the whole point of self-driving cars — to be safer than humans.

terminaltrip421

Distracted, angry, etc. humans are bad drivers to be sure, but humans as a general rule, I would say, are capable of exceptional driving. Ever seen stunt car driving? Or people with a spotless accident record despite sharing the road with idiots?

Asak
The thing about humans is we’re capable of driving well (at least some of us), but even as a good driver it’s hard not to get distracted or make a rare mistake, especially after driving for long periods of time, or if you’re tired.

I’m generally a good driver, but I had an incident in a mall last year where I came up to a stop sign, looked both ways, and then hesitated because I couldn’t decide which way I should go. Then when I had decided, I almost drove out in front of an oncoming car. The trouble was, in my mind I had already looked, so I didn’t think to look again. But a few seconds had passed as I hesitated, more time than I realized, and things had changed.

The moral of the story is that everyone makes mistakes, even the best of us. I have no real doubt I am more aware of the road than a computer can be: what’s going on in multiple lanes, whether another car is acting erratically or looks like it’s “thinking of changing lanes”. I’ve had situations where I was able to anticipate a driver doing something weird and avoid danger just by…
Tim Miser

That’s exactly right. The software is nowhere near where it needs to be. There should be no need for radar, Lidar, or more than 2 forward cameras.

Pushmi-Pullyu
Nope, not at all. The human eye can’t see in the dark, can be blinded by glare reflecting off polished surfaces, and is subject to optical illusions. Furthermore, it’s not the human eye that gives humans superior vision; it’s the highly developed visual cortex in the human brain.

It’s absurd to talk about making self-driving cars that perceive the environment as humans do. Computers aren’t equipped with our “wetware”, and the goal of those designing self-driving cars isn’t to mimic human driving — it’s to produce cars which drive more safely than humans do. Far more safely.

Part of that safety is going to come from using LiDAR-based SLAM* systems to perceive the environment, vastly superior to the use of stereo camera images and unreliable optical object recognition software. Such software is a rather poor substitute for the visual cortex in the human brain, and anyway active scanning with LiDAR or radar is far preferable to, and much simpler and more reliable than, passive scanning using video cameras, which requires use of software to (unreliably) interpret visual images. Active scanning has the added advantage that it works just as well in the dark, unlike the human eye.

*SLAM stands for…
DonC

A camera is no more a human eye than a prop plane is a fighter jet, and an Nvidia chip is not a human brain.

Also there is no reason to limit yourself to human senses if the goal is to be better than a human driver.

???

Actually, Subaru’s system of dual cameras works very well.

Some cars use 4-6 forward-facing cameras (eyes) and still go bump in the daylight! With 3 very sophisticated computers (brains) inside the vehicle!

However, I have yet to see either Tesla or its new GPU supplier share a video demoing obstacle avoidance at 65 mph!

Nelson

I don't understand why camera data combined with radar data can't validate that a large stationary object is getting closer "directly in front of" the fast-moving vehicle. Radar should detect the distance between car and object getting smaller while the camera sees the object getting larger; apply brakes, warn driver, something.
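Roughly, I'm imagining a cross-check like the sketch below (in Python; the field names and the 5% growth threshold are made up purely to illustrate the idea):

    # Hypothetical cross-check: radar says the gap is shrinking while the
    # camera sees the same object growing in the image ("looming").

    def stationary_obstacle_confirmed(range_m, prev_range_m,
                                      box_height_px, prev_box_height_px):
        closing = range_m < prev_range_m                      # gap shrinking
        looming = box_height_px > prev_box_height_px * 1.05   # >5% growth per frame
        return closing and looming

    # Gap closed from 80 m to 75 m while the truck grew from 40 px to 48 px tall.
    if stationary_obstacle_confirmed(75.0, 80.0, 48, 40):
        print("warn driver, pre-charge brakes")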

NPNS! SBF!
Volt#671 + BoltEV

Pushmi-Pullyu

Then you need to read DonC’s posts here. If he’s right, then radars used in current cars are not designed to detect stationary objects at all.

I think that many or most people who are not computer programmers (as I am) fail to understand that computers, even when connected to cameras, simply don't have any situational awareness. They don't "see" anything. The radar systems in cars equipped with AEB can only detect certain things; the things that they are designed and built to detect.

Apparently that does not include fire trucks or other large obstacles stopped in the lane of traffic where you are driving.

Nix

“directly in front of”

Because all roads aren’t straight. If the road curves, then everything that is directly in front of a car is on the side of the road until the car completes the turn.

A stationary object being directly in front of a car at a distance isn’t sufficient information to trigger the AEB.
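To illustrate the point (all numbers and names here are invented, and a real tracker is far more involved), a predicted-path check might look something like this:

    import math

    def on_predicted_path(obj_x_m, obj_y_m, speed_mps, yaw_rate_rps,
                          lane_half_width_m=1.8):
        """x is forward, y is left; predict a circular arc from yaw rate."""
        if abs(yaw_rate_rps) < 1e-3:
            return abs(obj_y_m) < lane_half_width_m  # straight road: stay in lane
        radius = speed_mps / yaw_rate_rps            # turn radius from yaw rate
        # Lateral position of the predicted arc at the object's forward distance.
        arc_y = radius - math.copysign(
            math.sqrt(max(radius**2 - obj_x_m**2, 0.0)), radius)
        return abs(obj_y_m - arc_y) < lane_half_width_m

    # 60 m ahead, dead center, but we're in a gentle left curve at 29 m/s:
    print(on_predicted_path(60.0, 0.0, 29.0, 0.1))  # False: our lane bends away

An object "directly in front" on a curve fails this check, while something off to the side of a curving road can pass it.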

Priusmaniac

But a car ahead that would require too sharp a turn to avoid should still trigger the AEB.

EVShopper

Seems like a protocol is needed for emergency vehicles to broadcast some kind of signal to autonomous/semi-autonomous vehicles to alert them of their presence. Could also be a version engaged for stalled vehicles on the side of the road. Like emergency blinkers warn humans something is amiss and they should drive by with caution. When you engage the emergency blinkers, a signal could be recognized by autonomous systems in passing cars.

V2V communication will be needed to make these systems safe.
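Hypothetically, something as simple as the beacon sketched below could work. The fields are invented for illustration; real V2V efforts (e.g. the SAE J2735 basic safety message used with DSRC) define their own formats:

    import json
    import time

    def emergency_vehicle_beacon(lat, lon, stopped):
        """Encode a periodic 'I am here and stationary' alert for nearby cars."""
        msg = {
            "type": "EMERGENCY_VEHICLE_ALERT",
            "timestamp": time.time(),
            "position": {"lat": lat, "lon": lon},
            "stationary": stopped,  # e.g. a fire truck blocking a lane
        }
        return json.dumps(msg).encode("utf-8")

    # Broadcast once a second over whatever V2V radio link the vehicles share.
    packet = emergency_vehicle_beacon(34.0211, -118.3965, stopped=True)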

(⌐■_■) Trollnonymous

……or the “Driver” needs to friggin pay attention!

Pushmi-Pullyu
While V2V communication certainly will be needed in the future, when semi-self-driving cars achieve true Level 3 or better autonomy, at the current state of development — Tesla Autopilot is Level 2 with some aspects of Level 3 — it would be like equipping a Roomba with radar. The radar isn’t going to be of any benefit in a robot that’s designed and built to only stop or change directions when it bumps into things, and cars with mere Level 2 autonomy aren’t equipped to detect stationary obstacles and steer around them.

* * * * *

Level 0 _ No Automation
System capability: None. • Driver involvement: The human at the wheel steers, brakes, accelerates, and negotiates traffic. • Examples: A 1967 Porsche 911, a 2018 Kia Rio.

Level 1 _ Driver Assistance
System capability: Under certain conditions, the car controls either the steering or the vehicle speed, but not both simultaneously. • Driver involvement: The driver performs all other aspects of driving and has full responsibility for monitoring the road and taking over if the assistance system fails to act appropriately. • Example: Adaptive cruise control.

Level 2 _ Partial Automation
System capability: The car can steer,…
Pushmi-Pullyu
The only thing that should be surprising is that anyone would be surprised about this. If AEB really did eliminate nearly all traffic accidents involving a moving car hitting a stationary object, then everybody would already know that. AEB systems certainly help reduce accidents, but anyone who thinks they will stop the car in most cases of potential collision is pretty clueless.

* * * * *

The article says: “If your car hits the brakes every time there’s a piece of debris on the road, or it sees any type of stationary object, this could actually cause an accident, or give the driver whiplash. False positives can be a real problem for such technology. All current automatic emergency braking and adaptive cruise control systems are designed to be ‘blind’ to various fixed objects. If not, the system wouldn’t be able to do its job.”

That explains pretty well why the AEB system has to be “blind” to certain obstacles, but the real problem with false positives isn’t a danger of whiplash or that it could cause an accident by the car stopping unexpectedly. The real problem is that if the system is so sensitive that it results in many…
Pushmi-Pullyu

If DonC is right about the limitations of the radar detectors connected to current AEB systems, then perhaps Autopilot can detect a moving car two cars ahead, but not a stationary one.

One thing I haven't seen discussed on EV forums is just how much (or how little) data/resolution is available from the type of radar scanner used for AEB systems. If the image return is as ill-defined as the example linked below, then the wonder isn't that a car equipped with AEB hit a stationary truck; the wonder is that it will ever detect the need to brake at all!

[linked image: an example of a low-resolution radar return]

zzzzzzzzzz

AEB is getting better in recent cars, even cheap ones. They don't just stop from high speeds; some also automatically steer around a stopped car or truck.