Tesla Told To Improve Autopilot, Release Claimed “World’s Safest” Data

APR 16 2018 BY STEVEN LOVEDAY

Consumer Reports’ Advocacy division is making new demands related to Tesla’s Autopilot system.

The Consumers Union (CU) group (a division of Consumer Reports) has called Tesla out over its Autopilot system, prompted by the recent fatal Model X crash and related media coverage. Tesla has been asked to improve the system, as well as to release a new statement explaining its claims that Autopilot is the “world’s safest” system. The Union wants more public data supporting such claims.

Related: Tesla Fires Back: NTSB Removes Tesla From Investigation Into Deadly Model X Crash

Tesla has admitted that Autopilot was engaged during the deadly crash, and this was also the case in an earlier fatal incident in Florida. However, the automaker believes that the drivers should have been paying attention.

According to CU, Autopilot should limit its use to areas in which it can be used successfully. The group believes the system can be activated in situations where it’s not necessarily safe to use. Additionally, it’s concerned that Tesla’s “hands-on” warning isn’t enough. David Friedman, Director of Cars and Product Policy and Analysis for Consumers Union, explained:

After another tragedy involving Autopilot, Tesla should commit to put safety first—and to stop using consumers as beta testers for unproven technology. While the results of the crash investigations will be critical to understanding all that contributed to this tragedy, previous NTSB findings already showed that Autopilot should do more to protect consumers. We see no excuse: Tesla should improve the safety of Autopilot without delay.

Tesla markets itself as an innovator. It should not put lives at risk, damage its reputation, or risk the success of its systems—or driver assist technology as a whole—by failing to take steps that would better protect consumers’ safety. Further, the company should not make either specific or broad safety claims without providing the detailed data to back them up. They should show, not just tell, us how safe their system is.

Instead of issuing a defensive Friday evening blog post or statements blaming the victim, Tesla should fix Autopilot’s design and be transparent about their safety claims. The company should publicly provide detailed data to demonstrate conditions for which its Autopilot system can safely operate. It should limit Autopilot’s operation only to those conditions, and have a far more effective system to sense, verify, and safely react when the human driver’s level of engagement in the driving task is insufficient or when the driver fails to react to warnings. If other companies can do it, Tesla should as well. Further, this would fulfill the NTSB recommendations made more than six months ago.

Consumer Reports and Consumers Union have asked automakers to do a better job of making sure drivers are aware of each system’s limits, as well as assuring that there is some backup in place in case a driver overestimates the technology’s capabilities. Regarding Tesla, the organizations have already requested that the Autopilot system cease to operate in certain situations.

Consumers Union’s recent article explains:

In addition, Consumers Union urged the U.S. Senate and NHTSA to take action in response to the NTSB’s September 2017 recommendations and require critical safeguards in vehicles with partially or conditionally automated driving technologies. The NTSB’s recommendations included that the Department of Transportation and NHTSA should develop and issue mandatory performance standards for these systems and ensure better collection of crash data. The NTSB also recommended that manufacturers should limit (and NHTSA should verify that they have limited) the use of automated driving systems to appropriate circumstances and develop systems to more effectively sense a human driver’s level of engagement and alert the driver when automated driving systems are in use and the driver is inattentive.

Source: Consumers Union

Categories: Crashed EVs, Tesla


129 Comments on "Tesla Told To Improve Autopilot, Release Claimed “World’s Safest” Data"

Bobish

Well… The same concerns go for driving a car in the first place.
The interesting question is whether Autopilot is better than no Autopilot.

“The interesting question is whether Autopilot is better than no Autopilot.”
Tesla claims that is the case. So why don’t they release all the data that led them to that claim for all of us to see?

Nix

Actually, NHTSA has already released all of that information. I have posted the link for you many, many times. You have steadfastly refused to read it.

Terawatt
Please post it more times. Since there’s no way to know if you replied, and I may not remember to come back here, it can take a few tries before I see it. And I certainly haven’t seen it so far.

I’ve seen Tesla’s claim that Autopilot is much safer. But my understanding is that this is based on miles driven per fatality in Teslas versus other cars, which doesn’t necessarily say anything at all about Autopilot. In fact, it isn’t trivial to interpret such data. The best would be if you could compare Teslas with AP engaged to Teslas with AP disengaged, over the same stretches of road at the same time (same conditions, as near as possible, making whether AP is engaged or not the only difference), but that’s probably a tiny data set. For rare events you need a big data set.

It’s easy to choose data that leads to whatever conclusion one may want. For example, isn’t it correct that 100% of fatalities in a Tesla happened with AP engaged? If not, what is the percentage? Thankfully there have been very few fatalities. Which means such a percentage is extremely sensitive to randomness. But I think it…
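To make Terawatt’s rare-events point concrete: an exact Poisson confidence interval on a fatality rate estimated from a handful of events is enormously wide. A minimal sketch, with all counts and mileages hypothetical rather than real Tesla data:

```python
# Exact (Garwood) confidence interval for a Poisson rate, illustrating
# how little a handful of rare events can tell you. Numbers are hypothetical.
from scipy.stats import chi2

def poisson_rate_ci(events, exposure_miles, alpha=0.05):
    """95% CI for a rate, returned per million miles."""
    lo = chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
    per_million = 1e6 / exposure_miles
    return lo * per_million, hi * per_million

# Hypothetical: 3 fatalities over 1 billion AP-engaged miles.
lo, hi = poisson_rate_ci(events=3, exposure_miles=1e9)
print(f"95% CI: [{lo:.5f}, {hi:.5f}] fatalities per million miles")
# The upper bound is roughly 14x the lower bound -- far too wide to
# distinguish "much safer than manual" from "about the same".
```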
Nix

In the time you spent typing your diatribe, you could have done a very simple Google search a dozen times over and found multiple paths to the report.

Sadly, even when fed the report, you nutters STILL never accept the results, and never accept that NHTSA has SPECIFICALLY accounted for autosteer alone. Even in cars that previously had AEB, and had autosteer added via OTA update, the collision rate went down:

“5.4 Crash Rates: ODI analyzed mileage and airbag deployment data supplied by Tesla for all MY 2014 through 2016 Model S and 2016 Model X vehicles equipped with the Autopilot Technology Package, either installed in the vehicle when sold or through an OTA update, to calculate crash rates by miles travelled prior to and after Autopilot installation. Figure 11 shows the rates calculated by ODI for airbag deployment crashes in the subject Tesla vehicles before and after Autosteer installation. The data show that the Tesla vehicles’ crash rate dropped by almost 40 percent after Autosteer installation.”

HH

I think a bit of this is confusion about what people mean by Autopilot.

Usually when people talk Autopilot, they think of the system that adjusts speed and changes lanes while driving.

The NHTSA investigation, however, also counted emergency braking as part of Autopilot, and that is by now a standard feature on most mid-range cars. When you read the entire report, it only ever talks about the entire package. The core of what people perceive as “Autopilot”, i.e. the lane keeping and lane changing, is never separated out.

In addition, the report is naturally about Autopilot version 1, whereas Autopilot version 2 is a completely different system. So again, the report does not speak to the Autopilot systems being produced today.

Please don’t accuse other people of writing diatribes if all they do is ask reasonable questions. It is impolite.

AP2 is completely different (AP1 used Mobileye hardware) AND completely inferior to AP1. Just browse the Tesla owner forums for the AP1 vs AP2 discussion.

Tesla’s continued touting of AP as 40% safer is based on data supposedly collected from cars running AP1. Since AP1 doesn’t exist anymore, claiming AP2 is also 40% safer is a complete crock. But Tesla does whatever it takes to spin the narrative in their favor, so they’ll continue to propagate the “AP is 40% safer” line while AP2 cars steer into concrete barriers/fire trucks and kill their owners and/or innocent bystanders.

pjwood1

NHTSA reiterated Tesla’s “40% safer than non-AP” claim, but I am unaware of the backup data being released. As CR says, “The company should publicly provide detailed data to demonstrate conditions for which its Autopilot system can safely operate. It should limit Autopilot’s operation only to those conditions,”

I haven’t seen any data beyond the basis of the claim: that all AP vs. non-AP miles show 40% fewer airbag deployments, plus whatever miles accumulated under each condition. I don’t like this data point, for one, because the chump who tries to recover from a problem that AP got them into, and fails, gets chalked up as the fallible human. Did any airbags deploy within a second of someone taking the wheel, and where would Tesla assign such a data point?

Nix

Here are the actual numbers. You won’t accept them anyways.

[chart image]

Some amateur Excel graph with no supporting data or sourcing provided? Lol, what a joke. Did you make that impressive graphic yourself?

By definition, an autopilot shouldn’t be branded Autopilot if it requires driver engagement.

DT

In all cases (airplane, boat, car), Autopilot requires constant human supervision.

Autopilot on an airplane: a system used to control the trajectory of an aircraft without constant ‘hands-on’ control by a human operator being required. Autopilots do not replace human operators, but instead they assist them in controlling the aircraft. This allows them to focus on broader aspects of operations.

Autopilot on a boat: equipment used on ships and boats to maintain a chosen course without constant human action.

Tom Dually

This is not actually true, but please continue with your script

Nix

You better tell the FAA, because they REPEATEDLY tell pilots using autopilot that THEY are ALWAYS 100% responsible for the flight of the airplane.

“The competent pilot is ready and prepared to make a transition to aircraft piloting at any time.”

“Be ready to fly the aircraft manually to ensure proper course/clearance tracking in case of autopilot failure or misprogramming.”

“As with all automated systems, you must remain aware of the overall situation. Never assume that flight director cues are following a route or course that is free from error. Rather, be sure to include navigation instruments and sources in your scan”

“You must be very careful to specify an appropriate vertical speed, as the aircraft will fly itself into a stall if you command the autopilot to climb at a rate greater than the aircraft’s powerplant(s) is/are capable of supporting.”

“You must remember that the altitude alerting system is designed as a backup, and be careful not to let the alerting system become the primary means of monitoring altitude”

etc etc..

https://www.faa.gov/regulations_policies/handbooks_manuals/aviation/advanced_avionics_handbook/media/aah_ch04.pdf

Licensed pilots go through thousands of hours of training to become certified in operating an aircraft (and REAL autopilot systems).

How much training do Tesla owners have to go through before being able to use AP? A handful of finger taps to quickly dismiss warnings that AP is still beta?

Hunter

There’s a very big difference between autopilots used on aircraft and Tesla’s Autopilot, namely the operating environment. IFR flight means filing a flight plan on where you’re going, communicating with ground control, and maintaining minimum distances from other traffic. That allows the kind of time needed to recognize and react to a problem. Compare that to two cars approaching each other on a 2-lane road, separated by a matter of inches. If Autopilot craps out and you drift into the other lane, your reaction time could be almost zero.

Nix

Are you under the impression that the only source of head-on collisions is autopilot malfunction?

The reality is that people have been getting into head on collisions for decades and decades. The reality is that autopilot SAVES drivers from some head-on collisions, where there are no known failures of autopilot CAUSING a collision.

Yet again the airbag conundrum. Airbags kill people. That is a fact. But they save WAY more lives than they take. Thus safety is improved by airbags, despite deaths that likely would not have happened without airbag deployment. Same for seat belts, ABS, etc, etc.

Do you believe that there should be a different standard for autopilot than for any of these other VERY effective safety systems? If so, please justify why.

MDEV

Sure, driver assist is better. The problem is assuming that my assistant does it all for me; hence I can get killed for being too dumb to understand what the word means. When you activate Autopilot it explains that you have to stay aware the whole time; only a goldfish with a one-minute memory can be excused.

mx

Also, the more advanced cruise controls, too.
ACC: active cruise control. Sometimes it turns itself off under a bridge, or when the sky is too white.

You can probably find more accidents with ACC than with Tesla’s system. But we’re not tracking them.

Driver = Responsible = 100% of the time,
especially if the system is flashing and beeping at you.

What Tesla needs in the next update is:
Pilot Monitor.
When the pilot falls asleep, Autopilot takes the next exit and PARKS.
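mx’s “Pilot Monitor” idea could be sketched as a simple escalation state machine. This is a purely hypothetical illustration of the concept; the states, thresholds, and function names here are invented and do not describe Tesla’s actual logic:

```python
# Hypothetical escalation logic for an inattentive-driver monitor.
# Invented for illustration; not Tesla's real implementation.
from enum import Enum, auto

class MonitorState(Enum):
    ATTENTIVE = auto()    # hands on wheel, no action needed
    WARNING = auto()      # flash and beep at the driver
    TAKING_EXIT = auto()  # unresponsive: navigate to a safe stop
    PARKED = auto()       # stopped, hazards on, operator notified

def next_state(hands_on: bool, seconds_unresponsive: float,
               vehicle_stopped: bool) -> MonitorState:
    if hands_on:
        return MonitorState.ATTENTIVE
    if seconds_unresponsive < 10.0:   # hypothetical grace period
        return MonitorState.WARNING
    if not vehicle_stopped:
        return MonitorState.TAKING_EXIT
    return MonitorState.PARKED
```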

Nix

Autopilot somewhat does this already, but yes, it should evolve to do even better. A drunk driver was found passed out with his Tesla safely stopped; the car stopped itself without the driver. A car without Autopilot would have crashed.

Anti-Lord Kelvin
“Seriously thinking”: one day you will see CU asking for consumers to be able to disengage all autonomous systems in all cars (when such systems exist), because having these systems respect the speed limits and maneuver as the rules dictate will be boring for humans used to being lazy about the rules and eager to text on their phones or have fun speeding at 100 mph or more (distraction and speed being the causes of the majority of deaths on the road).

The same thing happened some decades ago when the seat belt was introduced. Every time someone died in his car despite wearing a seatbelt (because the car burned, or some other reason), the accident made headlines in the news, and there are still people who think that wearing seatbelts is more dangerous than not wearing them!

As for systems introduced to improve safety in cars, we had ABS and active stability control in curves, for example. Both gave a sense of security that leads some to go faster than they would if they didn’t have them. I knew a guy whose cousin died because he used to go at high…
Nix

The first thing CU should do is demand all cars have autopilot, because it is already documented to save lives.

Then work with Tesla to improve it to save even more lives. Like seat belts that started as just lap belts, but then greatly improved when they became lap/shoulder belts.

hansolo

So true about accidents from speeding.
But why the distinction between autonomous and ordinary cars? Why doesn’t CU require that every car slow down and deactivate if you don’t have your hands on the wheel?
I respect CR for their pursuit of quality/reliability, but on autonomous cars they are acting against public safety.
There is NO manufacturer responsibility (moral or legal) if there is a driver in the car and there are no malfunctions in the car’s subsystems. Only for a Level 5 autonomous car with no driver CAN a manufacturer be asked to accept legal responsibility.

TeslaPlease

Thank You Consumers Union for ‘speaking truth to power’ on this topic in a clear and unvarnished fashion.

I agree 100% with their sentiment and challenge Tesla to do better.

‘The same concerns go for driving a car in the first place’ is a false narrative. From inception to date, Tesla has marketed AutoPilot as a ‘smart’ technology.

Before the latest AutoPilot update, the software was not smart enough to recognize double yellow lines and keep the vehicle in its lane. In fact, AutoPilot almost caused a head-on crash with a semi before the driver intervened.

‘The interesting question is whether Autopilot is better than no Autopilot’.

The answer is already known… NO, Autopilot is not better than no AutoPilot if the software in its current iteration can CONTRIBUTE to an accident avoidable by a reasonably competent driver.

‘Tesla’s Autopilot | Why It’s Still In Beta | Edmunds’

(⌐■_■) Trollnonymous

Isn’t that necessary all the time?

Hell, cars have fuel gauges yet I still see dumbasses on the side of the freeway walking with a gas can back to their ride.

You can have many things prompt a driver and yet some will still get into trouble no matter what you put in as a failsafe.

NoDistractions

Two things come to mind:
The joke about being too stupid to own a computer: http://funehumor.com/fun_doc2/fun_0251.shtml

And the saying about how whenever you design something to be idiot-proof, the universe delivers a better idiot.

marshall

I fail to see how Autopilot could not be better than no Autopilot.

Two days ago, I saw another person drive through a red light and this morning apparently a stop sign doesn’t mean you make even a California stop!

I’ve gotten to the point where I look both ways before driving through an intersection. Hopefully, that will preserve my life and those of my passengers.

With sleepy, DUI, distracted and aggressive drivers, fully automated cars can’t come soon enough.

Vexar

The rolling California stop is actually configurable in your Tesla autopilot, depending on your geography. It is akin to the Boston “yield to pedestrians… or not” capability, and “Turn signal via my front fender into your rear quarter-panel” for taxi drivers in Washington DC.

The expert system behind Tesla’s Autopilot clearly hasn’t worked out what to do when there’s a lane split/merge, and it is highly reliant on data pattern consistency.

One of the ongoing gags in Minnesota is “why is my autopilot disengaging” with photos of their Tesla vehicles caked in snow or in blizzard conditions.

Elon admitted the current Autopilot is too brittle to drive coast to coast hands-free. His approach to the AI is all wrong. Sensors… whatever, fight over those if you must. But his approach to the AI is the same as everyone else’s: expert systems. Wonder what that is? Dial up Moviefone, if that number still works. Prompted responses. Just… bigger.

scott franco

Excuses. What do they teach you in Driving 101? Never drive faster than you can stop if you see something in the road. These computer-controlled cars can see a freeway barrier coming and make an exact calculation of the force and time needed to come to a stop.

However, the AP is not programmed to do that; it is programmed to beep, then cancel control and leave it up to the driver to avoid the object.

It IS in fact possible to design an autopilot that can avoid any stationary object, just not moving ones like cars coming at you.

NoDistractions

Exactly. The problem is that people don’t seem to understand the limits of AP or driver assist systems in general and are lulled into thinking they can be even more distracted and inattentive.

AP doesn’t stop at red lights or stop signs either, BTW.

mx

Sounds like you’re the kind of guy who falls asleep with AutoPilot on. Do us all a favor: turn it off and don’t use it.

Anti-Lord Kelvin

Autopilot (and other systems from other companies) is already better. Why? Because it’s humans who are in “beta”. Proof? Someone who knew the system did not work well at this exact location kept using it there. Driving on a beautiful day and not watching as the car went straight into the concrete divider?
CU should also ask to see the “logs” of the “autopilot” of the bus which had an accident at the very same place some weeks before and which had destroyed the protection there. Maybe that driver was not a “reasonable” one either? Or, more seriously, is this particular point of the road so badly designed and maintained that both reasonable drivers and safety systems can’t avoid serious accidents there?
Now, I would like to see more about accidents caused by BMW, Mercedes, and Audi’s simpler cruise control systems, but those must not be interesting enough for CU to make headlines with…

mx

CU has been consistent in bashing EVs.
Which, as anyone with real-world experience would tell you, are an impressive driving experience compared to ICE.
This is just more bashing.

TeslaPlease

Inferiority Complex Sufferer?

1) Proof of CU bashing – dare you – you have nothing
2) Criticism of a poorly engineered AutoPilot is not a criticism of EVs, but of semi-autonomous technology that is not working well
3) CU has spoken well of Teslas when the praise was earned
4) If you are looking for a safe zone for Tesla worship consider YouTube. Plenty of toadies there – enjoy.

mx

I watch CU’s reviews of all EV and Hybrid models, suggest you do too.

Anyway, today’s a good time to buy Tesla stock, at 292,
with Tesla hitting 3,000 Model 3s per week.

https://www.teslarati.com/tesla-model-3-production-ramp-3k-week/

scott franco

It’s because Elon is there sleeping on the line. The workers know that if they increase production to 5,000, Elon will go away…

Pushmi-Pullyu

TeslaPlease said:

“Inferiority Complex Sufferer?”

You must be describing yourself, troll. Seriously, you must have a very low opinion of yourself if you can’t find anything better to do than troll InsideEVs comments.

The facts of the case:

One only has to look at the two articles where Consumer Reports first dis-recommended buying the Tesla Model S, because Tesla had delayed the over-the-air upgrade of its AEB (Automatic Emergency Braking) system to work at faster speeds, and then a few months later re-recommended it, to see how obvious it is that CR uses Tesla’s name and popularity to get undeserved attention for its magazine and sell more subscriptions.

Here is InsideEVs’ coverage of the way CR repeatedly flip-flopped on that:

https://insideevs.com/tesla-model-s-regains-top-rating-consumer-reports/

That is only the most obvious example of CR making a mountain out of a molehill about Tesla’s cars; this latest case of foot-in-mouth from them is another.

If CR really cared about saving lives and safety, instead of caring only about pushing up their subscription rate, then they would be shouting from the hilltops that Tesla is leading the way in saving lives right now with its Autopilot+AutoSteer!

Go Tesla! Go Autopilot!

Get Real

LMAO, self-interested serial anti-Tesla trolls Mental MadBro and TeslaPiss are once again starting their carpet-bombing attacks and FUD against any and all things Tesla.

Obviously Tesla needs to, and is in fact constantly improving AutoPilot and all of their software for their cars.

But equally obvious is that safety assists like AutoPilot/AEB/AutoSteer are improvements, albeit incomplete and not perfect ones, for the overall driving population.

Not a day goes by where I don’t see many drivers using their phones for texting/not handsfree phone calls or putting on makeup/eating, etc.

Let’s not make the perfect the enemy of the good here. Traffic accidents and deaths caused by human drivers are still a very large problem in society, and if all the systems out there like AP help prevent ever larger numbers of these tragedies, then we all win… except for the trolls who have financial reasons to troll here.

NoDistractions

AP is not going to help with distracted drivers using their phones and texting. It’s only going to increase the problem, because people think they can get away with it.

Nix

AP/AEB/etc protects you against distracted drivers because it spots them before you do and reacts before you can.

Like stopping your Tesla from hitting a bright red firetruck stopped on the highway because you’re Facebooking on your phone and can’t be bothered to look through the windshield because AP is driving the car?

(⌐■_■) Trollnonymous

Personally I hope this prompts Tesla to allow ordering a TM3 WITHOUT any AP hardware or software with reduced cost.
😛

Just my selfish want……..lol

mx

You get the hardware; you can defer the software purchase till later, but then you’ll pay an additional fee.

I may have to do the same thing, when it’s time to pull the trigger.

scott franco

Sure. Say goodbye to your fed tax credit.

Don’t worry, I am getting yours…

TeslaPlease

I think the Model 3 is a beautiful car design that could have been perceived as an unqualified breakthrough had Tesla not been so rushed and brought it to market too soon.

– No mandatory Autopilot HW / SW
– More time spent refining the UI / UX
– Heads up display option with enhanced voice commands

I have not configured mine because the Model 3 roll-out has been problematic, which could be remedied if Elon would step back and let a competent auto industry leader take control of day-to-day operations without resistance.

MTN Ranger

I agree for the most part.

I did order my 3 without EAP. It’s just not ready for prime time and not worth $5k, in my opinion. If it were $1k or $2k, maybe I would get it for my driving needs.

I need a new car now and have already waited two years. Most of the software bugs will be ironed out with updates. The misaligned body panels need to be addressed. I hope with all the possible quality control issues, I get some lottery luck when mine is delivered in the next couple weeks.

NoDistractions

Agreed. I would love to pay thousands less without all that crap in my car.

Doggydogworld

The recent CBS video clips include one where Elon is driving Gayle King around on Autopilot. When he shows how AP warns him to keep his hands on the wheel, she asks, “If you have to keep your hands on the wheel, then what’s the point?”

Bingo!

philip d
A similar analogy could be made to cruise control. Why have cruise control if you can’t put your feet up on the dash? Because cruise control takes the burden off during long drives of having to make multiple small adjustments every second with your foot on the accelerator to maintain the desired speed for hours on end.

While using cruise control in moderate highway traffic I still keep my foot hovering over the accelerator pedal just in case I need to respond in an instant. But I’m not making these little adjustments over and over and over again endlessly for hours on end. And when traffic thins out for brief periods I do put my foot down on the floor and rest it for a while, as long as it feels safe and there are no cars directly around me.

This is exactly what Autopilot is designed to do, but for not just maintaining speed but steering and keeping distance from the car in front. It takes the repetitive burden off your accelerator foot just like cruise control, but goes further to take the burden off your hands having to make many tiny adjustments every second to keep the car going…
scott franco

I’ll tell you a story about that. I was flying back from Mexico over the open desert coming back to the USA. Mexico is not like the USA. They have suppressed private aviation quite well, and the chances of running into another airplane in daylight are virtually nil.

I was on autopilot with the seat back, legs crossed and watching the scenery go by. I realized I had left the food back in baggage, and was getting hungry. So the idea occurred that I could go back to the second row seats, and get the food, and be back quickly.

However, I started to visualize what might go wrong. Just unbuckling the seatbelt is a potential issue. If, for example, the plane stalls going straight up, you are falling down into the tail. You are not getting back to the controls before you are dead.

I did manage to do it, but it involved planning it out and executing it VERY QUICKLY.

I don’t trust autopilots. Never will.

Nix

You failed to follow FAA mandates for Autopilot and should have your license pulled.

https://www.faa.gov/regulations_policies/handbooks_manuals/aviation/advanced_avionics_handbook/media/aah_ch04.pdf

Safety 3rd

I think it is too early in Autopilot development to depend on an owner’s response time in an emergency. For example, in the last accident it would appear Autopilot did not sense the danger of a freeway obstruction in time to stop the car or to warn the driver to take over. As an old race driver, I fight against all the distractions of driving a car, i.e., cellphones, an offset dash, touch-screen controls, etc. At no time should you not be watching the road and holding the steering wheel at 9 and 3.

Pushmi-Pullyu

Autopilot is not designed to give a warning if it detects a stationary obstacle. If it were, then it would literally be giving that warning constantly, and the warning would be useless.

Anyone who doubts that should watch this official Tesla AutoSteer video, and watch the “side view” windows to see the hundreds of trees and other stationary objects (outlined in green) that Autopilot mis-identifies as “in path” obstacles, despite their being clearly well to the side of the road or even behind the car!

https://www.youtube.com/watch?v=hLaEV72elj0

CDAVIS

Consumers Union (Consumer Reports) said: “…Tesla should commit to put safety first—and to stop using consumers as beta testers for unproven technology… Tesla should improve the safety of Autopilot without delay…”
——————

The best way to improve Autopilot “without delay” is to optionally make available a reasonably well-tested version of Autopilot to consumers for further real-world beta use/testing, to determine how the system can be further improved. That’s what Tesla is doing.

The way to improve Autopilot with a very long delay (and likely no consumer availability for 10+ years, if ever) is to release only a proven, never-will-fail version of Autopilot. That’s what Tesla is not doing.

It’s one or the other… the 1st option resulting in fewer net road accident deaths.

Seatbelt and airbag technology has over the years greatly improved and saved many lives, although to this day there are fringe cases where seatbelts and airbags do not operate as ideally intended. We likely would not have seat belts and airbags in cars today if they had not been allowed to be added to cars until it could be proven that they would never fail.

TeslaPlease

GM with Cadillac SuperCruise has shown there is a third option that is less risky, adds LIDAR, and monitors the driver by camera for use on rated highways.

I much prefer this approach for me and my family.

philip d

Great. So why can’t we each choose the product we want, as long as it’s clear to the consumer what each product’s limitations are? Just like how Tesla now makes it very clear, with its on-screen disclaimer, how and where it’s appropriate to use Autopilot.

TeslaPlease

‘Great. So why can’t we each choose the product we want?’

You don’t drive on the road by yourself – that’s why. I strongly feel these technologies should be risk rated against a common standard so the public, insurance companies, and attorneys have the necessary information to affix blame/liability more accurately.

I’m not a fan of Tesla’s AutoPilot since the separation from MobilEye, and I won’t risk my family’s safety using it.

Consumer Reports is on the right track.

Nix

It may very well be that GM’s system is failing to save as many lives as Tesla’s system by not being engaged as often.

This is the classic Air Bag conundrum. We know for certain that SOME deaths can be attributed to an airbag going off where the accident would likely not have ended in a death. To the uneducated, they may believe that the answer to avoiding those deaths would be to turn their air bags off. But that fails to account for the vast majority of deaths that are avoided by having the airbags on.

The same goes for every single safety system, including seat belts, crumple zones, air bags, ABS, and autopilot.

Rick Rowling

I’m pretty sure that Super Cruise does not come equipped with LIDAR. However, it does use high-resolution maps created with LIDAR.

Yes. Guaranteed a Supercruise-enabled GM car won’t steer your car directly into a concrete barrier on the highway.

Terawatt
Releasing 100% of the data would accelerate development in itself. In any case, it’s not just a question of what gets us there fastest.

It is very clear that many are relying on the automation to an irresponsible degree. While you can argue all day long that this is the moral responsibility of the driver, it’s also clear Tesla is not doing as much as it could to discourage such behaviour. The system could be much more intolerant of taking one’s hands off the wheel, for example. IMO it is more dangerous to take your hands off the wheel with AP than without, because the software may actively make the wrong choices on occasion, requiring a faster and larger correction from the driver than directional changes induced by the road surface or a puncture.

AP may indeed already improve safety. But in my view it is also plausible that it does the opposite. For example, it may help prevent many minor accidents that otherwise would have happened (say, when changing lanes on the highway, all cars traveling in the same direction with relatively small speed differentials), but at the same time be a factor in extreme crashes that would never have…
Pushmi-Pullyu

“AP may indeed already improve safety. But in my view it is also plausible that it does the opposite.”

If it weren’t for the fact that the NHTSA has publicly stated, as fact, that Tesla cars with Autopilot+AutoSteer installed in them have a significantly lower accident rate than those without, then I might agree with you.

As it is, though, we have very clear data from an independent and authoritative source that AutoSteer does significantly reduce the accident rate. In fact, if you consider that many of those cars with AutoSteer don’t have the system turned on, then using AutoSteer must lower the accident rate by even more than 40%!

https://electrek.co/2017/01/19/tesla-crash-rate-autopilot-nhtsa/
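Pushmi-Pullyu’s point here can be made concrete with a little algebra. If only a fraction f of post-Autosteer miles were actually driven with the system engaged, and manual miles crash at the baseline rate, then a 40% fleet-wide drop implies a 0.40/f drop during engaged miles. A back-of-envelope sketch (NHTSA did not publish the engagement fraction, so the values of f below are hypothetical):

```python
# Back-of-envelope dilution math. If f is the fraction of miles driven
# with Autosteer engaged, r the baseline crash rate, and manual miles
# stay at rate r, then the observed fleet rate after installation is
#   0.6 * r = f * r_engaged + (1 - f) * r
# which solves to a reduction of 0.40 / f during engaged miles.
for f in (1.0, 0.8, 0.5):  # hypothetical engagement fractions
    print(f"engaged {f:.0%} of miles -> {0.40 / f:.0%} reduction while engaged")
```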

Rick Rowling

AEB was released several months before Autopilot, so it’s probably responsible for much of that reduction. The IIHS found a 40% reduction in rear-end collisions due to AEB alone:
http://www.iihs.org/iihs/news/desktopnews/crashes-avoided-front-crash-prevention-slashes-police-reported-rear-end-crashes

Nix

1 out of 3 of the Teslas in the NHTSA study had data from when they were AEB-only, before they had Autosteer enabled via OTA update.

They showed the same 40% improvement in accident rates.

YOU WERE EFFING FED THE DAMN SOURCE THAT STATES THIS CLEARLY BY THE PREVIOUS POSTER YOU RESPONDED TO!!!!!!!!

You folks can’t handle the truth when it is handed to you on a silver platter.

Not when the “truth” is “served” by an obsessed TSLA fanboi cultist with ulterior motives.

Klaus

Didn’t Tesla’s emergency braking feature just get an update so it works up to 90 MPH now? Maybe Autopilot should only engage when the emergency braking feature is active.

Pushmi-Pullyu

Your wish is granted. AutoSteer only works up to (if I recall correctly) 88 MPH.

Klaus

So why didn’t emergency braking kick in when the Model X hit the barrier in the last fatality? I would expect the car to detect the collision and brake. My car has a similar feature and it will auto-brake in those situations.

Because Tesla released an unfinished version of AP and was using its customers as guinea pig crash test dummies. I believe the death count for the Tesla guinea pig beta testing program is up to 4 now? How many more deaths will it take to perfect AP?

Koenigsegg

The driver failed to comply with multiple warnings on the HUD to take the wheel.

It is the driver’s fault.

Tesla remains the safest.

Terawatt

What you are arguing amounts to this: it doesn’t matter how human beings actually behave, only how they SHOULD behave is relevant.

Unfortunately, this is the kind of thinking that only justifies indignation but doesn’t produce the outcomes we want.

I for one care how the systems actually make people behave. A car going blindly down the highway because the driver is irrationally trustful of a system is just as dangerous to me regardless of whether it’s the driver’s fault or Tesla’s fault.

How many fatalities have there been in a Tesla? And in how many of the cases is AP a prime suspect in explaining how the driver could have failed to act? I don’t have any numbers, just an impression. But it seems to me quite premature to conclude anything at all about whether AP actually is safer than no AP at this point. What seems clear is that some horrendous accidents have happened that very likely wouldn’t have if the driver had had no software driving his car to irresponsibly rely on.

JJG

“What seems clear is that some horrendous accidents have happened that very likely wouldn’t have if the driver had had no software driving his car to irresponsibly rely on.”

Passing the buck of responsibility. Pretty much how society works now. It starts with an irresponsible driver and ends with one. If they are bad drivers with software assistance, how bad were they without it?

And how do you know how many accidents have been avoided because the software is installed? Personally I’ve seen more videos showing accidents avoided than reported fatalities.

Pushmi-Pullyu

Your armchair philosopher reasoning, and disdaining to look at the actual evidence, reminds me a lot of “The Fable of Plato’s Horse”, in which the ancient Greek philosophers spent many long hours in a heated argument over the number of teeth in a horse’s mouth. Actually going out into the street and examining an actual horse to count the teeth was “beneath” them.

https://www.mwls.com/anecdotes/anecdote.php?aid=plato

And you’re just as wrong as they were! The evidence is extremely clear that in a Tesla car, you are a lot safer with AutoSteer in control than with a human in control. Obsessing over two fatal accidents with AutoSteer in control ignores all the lives saved by AutoSteer!

https://www.geekwire.com/2015/watch-teslas-new-autopilot-technology-prevents-accident-on-slick-seattle-road/

gary

Tesla’s only claim is that vehicles equipped with Autopilot hardware have fewer fatal accidents. They have stated nothing about the accident rate with Autopilot engaged.
“If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.”
What does that mean? Who knows? But it would be interesting to see some statistics on accident rates (fatal or not) with Autopilot engaged, and not just with the hardware for it in the car (and not in use).
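For what it’s worth, the 3.7 figure appears to be nothing more than the ratio of the two miles-per-fatality numbers Tesla cited: one fatality per 320 million miles in cars with Autopilot hardware versus one per 86 million miles across all US vehicles (the full quote appears later in this thread):

```python
# The "3.7 times less likely" claim is just the ratio of the two
# miles-per-fatality figures from Tesla's statement.
miles_per_fatality_us = 86e6       # all vehicles, all drivers
miles_per_fatality_ap_hw = 320e6   # Teslas with Autopilot *hardware*
print(miles_per_fatality_ap_hw / miles_per_fatality_us)  # ~3.72
# Note: it compares hardware-equipped Teslas to the entire US fleet,
# not Autopilot engaged vs. not engaged -- exactly gary's objection.
```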

gary

wanted to add:
Teslas are some of the safest (arguably THE safest) cars on the road, so one would expect the fatality rate to be significantly lower based on that fact alone.
Determining how much safety Autopilot brings to the party can only be done by comparing apples to apples. Release the statistics for accidents with and without Autopilot engaged if you want to claim it contributes to the safety of your vehicles.

Nix

Actually, the NHTSA report that has been posted and reposted ad nauseam by multiple posters does a very good job of explaining all of that. Links are also available right here in the InsideEVs story archives.

The NHTSA report that doesn’t contain any of the raw data its conclusions were based on, that many of us have been clamoring for. Yes, that report.

I just read something that said the fatality rate of Tesla owners using AP2 is 90% higher compared to other plug-in car owners using their brand’s semi-autonomous driving tech.
Hey, a sourceless, data-less claim! So easy to do!

scott franco

“If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.”

Do you know that 38% of statistics are just made up on the spot?

josealb

So it was that easy all along? Tesla just needs to make the system better?

Thank you lawyers for showing engineers the way

D

“However, the automaker believes that the drivers should have been paying attention.”

For the hundredth time, this protects Tesla legally, but humans will be trained into completely losing attention after the 200th example of Autopilot doing its job, and absolutely nothing will force them to stay as alert as they would have been if they were driving on their own.

Tesla says on average it is safer, but are we aware of such a phenomenon as the fallacy of averages (one can look it up)?
That means it will make a drunk driver safer, but IMO a normal human driver in a peak or average alert state will be less safe.

D
Also, the fallacy of averages means, simply speaking, that if I look at average human fatalities, I should ask whether the driver was in the physical condition I normally drive in. If I never drive drunk, avoid aggressive driving, have a “good driver discount”, etc., then for myself I need to take the rates after filtering out cases that are not in my group (drunk-driving accidents, people who get speeding tickets on a regular basis, etc.). I bet there will be a lot of cases I can exclude based on a behavioral group match.

In Tesla’s cases, I don’t think I can say the same. These cases involve humans under normal, alert conditions. Most likely my behavioral group’s. It doesn’t seem to have been rush hour either (in the Bay Area one literally just crawls in a queue during rush hour, so it would be hard to get to such highway speeds as in the 101 accident if it were under those circumstances of extra driving stress). Yet these unfortunate Tesla drivers were distracted to the degree of not seeing what was many seconds ahead. I think I have reasons to believe my type of driver driving on AP would be endangered at rates exceeding…
Ambulator

The Tesla crash on the 101 was an understandable error because the lane markings were not up to standard. However, when the road got unexpectedly rough before the crash, the car should have slowed. At the very end, emergency braking should have started. The car would have been totaled, but the driver would probably have lived.

That is the behavior I would want to see before I’d trust it.

scott franco

I drive like crap and I still have a “good driving discount”, so there.

Pushmi-Pullyu

“That means it will make a drunk driver safer, but imo a normal human driver in a peak or average alert state will be less safe.”

Drunk…

or high…

or sleepy…

or texting on a cell phone…

There are several types of significantly diminished capacity driving where Tesla Autopilot + AutoSteer will be far safer than the average human driver. “In 2015, 10,265 people died in alcohol-impaired driving crashes, accounting for nearly one-third (29%) of all traffic-related deaths in the United States.” (source below)

Plus, as I think we all have heard, the fatal accident rate has actually been going up in the USA due to increased numbers of people texting while driving!

But the most important statistic we know about, relating to whether or not you’re safer with Autopilot+AutoSteer than without it, is that there is an overall ~40% lower rate of serious accident (measured by airbag deployment) in Tesla cars with Autopilot+AutoSteer than in Tesla cars without AutoSteer installed. I don’t think it’s either logical or reasonable to try to dismiss such a large disparity as a mere “statistical accident”.

source:
https://www.cdc.gov/motorvehiclesafety/impaired_driving/impaired-drv_factsheet.html

scott franco

Actually, the accident rate has been going up at the same time the fatality rate has been going down. Cars are better at surviving crashes, and drivers have become more complacent as a result.

D
“But the most important statistic we know about, relating to whether or not you’re safer with Autopilot+AutoSteer than without it, is that there is an overall ~40% lower rate of serious accident (measured by airbag deployment) in Tesla cars with Autopilot+AutoSteer than in Tesla cars without AutoSteer installed. I don’t think it’s either logical or reasonable to try to dismiss such a large disparity as a mere ‘statistical accident’.”

I think you have missed my point, since this is the same appeal to averages I have been discussing. I want to know what this will look like once we remove impaired driving from consideration (the non-impaired driving group). I explained that my concern stems from the fact that the human is trained into losing attention by AP; therefore we really need to consider performance in the non-impaired group. And the fact that the Tesla fatalities are within the non-impaired group seems to give that line of thought a bit more fodder to consider.

The problem is that at the numbers we have, a rate test will not yet reject within a good measure of confidence (I do stats for a living), and accurate data for the human-driver non-impaired group are not readily available (to me…
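D’s point about statistical power can be illustrated with the standard conditional exact test for comparing two Poisson rates: given the total event count, the split between groups is binomial under the hypothesis of equal rates. A minimal sketch with hypothetical counts, not actual fatality data:

```python
# Conditional exact test for two Poisson rates. Under equal rates,
# c1 ~ Binomial(c1 + c2, t1 / (t1 + t2)). All numbers hypothetical.
from scipy.stats import binomtest

c_ap, miles_ap = 3, 1.0e9          # events and exposure, AP engaged
c_manual, miles_manual = 8, 1.5e9  # events and exposure, manual

p_null = miles_ap / (miles_ap + miles_manual)
result = binomtest(c_ap, c_ap + c_manual, p_null)
print(f"two-sided p-value: {result.pvalue:.2f}")
# With single-digit event counts the p-value comes out far above 0.05,
# so the test "will not reject within a good measure of confidence".
```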
Nelson

Autopilot should safely bring the car to a stop on a shoulder if it finds the driver is not paying attention or is unresponsive. If that occurs, Tesla should be notified, as well as local authorities.

NPNS! SBF!
Volt#671 + BoltEV + Model 3 (soon)

Ambulator

The AI that Tesla uses isn’t smart enough to stop on the shoulder. (Shoulder? Or is it a drop off?)

JJG

The car will start aggressively braking after enough warnings.

Pulling over brings in all sorts of variables.

What if there is no shoulder?
If the AP sensors have lost their vision, how will it safely pull over?
Is it going to pull over across multiple lanes on an interstate, particularly in high-traffic areas?

scott franco

Right. The AP algorithm is to brake for oncoming objects while holding straight. Avoidance by lane switching is just not a good approach. If you get rear-ended, it’s the fault of the driver behind for following too closely.

Pushmi-Pullyu

“After another tragedy involving Autopilot, Tesla should commit to put safety first—and to stop using consumers as beta testers for unproven technology.”

This is stupid. Will they also call for people to turn off their airbags because some 20 people have been killed in accidents related to exploding airbags?

Not only no, but hell no! The question is whether you are safer with a safety-critical system, or without it. NHTSA has officially stated that Tesla cars are safer with Autopilot+AutoSteer; Tesla cars with AutoSteer installed have a 40% lower accident rate than those without.

Shame on Consumers Union! And shame on Consumer Reports for repeatedly using Tesla’s name to gin up controversy in their articles, just to promote their magazine sales. If CR were consistent in their reviews of Tesla’s cars, that would be one thing. But they’re not! They first publish an article denigrating Tesla’s cars, saying they are “not recommended”, then they switch positions, give themselves a flimsy excuse for flip-flopping, and publish another article saying the cars *are* recommended.

Rick Rowling

AEB was released several months before Autopilot, so it’s probably responsible for much of that reduction. The IIHS found a 40% reduction in rear-end collisions due to AEB alone:
http://www.iihs.org/iihs/news/desktopnews/crashes-avoided-front-crash-prevention-slashes-police-reported-rear-end-crashes

AEB can also reduce crash severity, so some collisions may not result in airbag deployment.

Nix

1 out of 3 of the Teslas that NHTSA studied had AEB enabled before they had Autosteer enabled via OTA update.

They observed that the 40% improvement was ON TOP OF the AEB that had previously been enabled.

They had same-car, same-driver statistics that showed the improvement was specifically from Autosteer. You will never believe it, even when handed the stats on a silver platter. Which you were, earlier.

NoDistractions

The NTSB and NHTSA, working with SAE, should establish a standard set of protocols for testing these systems, and a standard for how the systems are engaged and disengaged.

Nix

So no car maker should have ever installed seat belts or air bags or abs or etc before the gov’t had established standards?

Sorry, but Tesla’s system is saving lives RIGHT NOW. Waiting for dithering agencies or organizations to get their act together is not an option. All those folks could have done all that BEFORE Tesla introduced Autopilot. They didn’t. It isn’t like Tesla suddenly sprang Autopilot on the world with zero notice.

Waiting for the slowest organizations to act would result in MORE people dying in the meantime. Waiting for perfect == more deaths while waiting.

HVACman
Good for CR to demand specific proof. Tesla has evolved a very clever PR group that has mastered the art of misdirection-by-statistic. They respond with truthful numbers that don’t actually address the issue, but sound close enough to fool most who read them into thinking they did.

Example: for the latest accident, Tesla responded: “In the US, there is one automotive fatality every 86 million miles across all vehicles. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident and this continues to improve.”

So, yes, ALL Tesla vehicles likely do have a fraction of the death rate that the national average “across all vehicles” does. But so do many other new vehicles; indeed, it is likely that most vehicles made in the past few years (with crumple zones, air bags, etc.) have much-better-than-average statistics. For example, using IIHS driver death rates by make/model for the 2014 model year (latest available), the “average” driver death rate was 30 per million registered vehicles. But when looking at…
HVACman

Partial mea culpa. Pushmi-Pullyu is right that there is a valid 40% reduction statistic. I violated my own rule, failed to read every word of the release, and missed the NHTSA reference. I found the original NHTSA report and confirmed the results, but it was about crashes, not death rates. The data was based on crash events that triggered airbags in Tesla vehicles before and after AP was operating.

1.3 crashes/million miles before. 0.8 crashes/million miles after.

See Figure 11 on linked pdf below:

https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF

If you read the complete report, you also find that driver misuse of AP is a growing concern for the NHTSA.

And that may be involved in this specific crash.
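For completeness, the report’s “almost 40 percent” is simply the relative drop between those two Figure 11 rates:

```python
# Relative drop implied by the Figure 11 crash rates quoted above.
before = 1.3  # airbag-deployment crashes per million miles, pre-Autosteer
after = 0.8   # crashes per million miles, post-Autosteer
print(f"{(before - after) / before:.1%}")  # 38.5% -- "almost 40 percent"
```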

Pushmi-Pullyu

Yes, if we do a deep dive, we see that the NHTSA’s claim of a ~40% lower accident rate is due to lower rate of airbag deployment. I think it’s safe to assume Tesla has accurate statistics on airbag deployment, so that should be a reliable figure. So to be precise, we should specify “A ~40% lower rate of accidents serious enough to deploy airbags.” A nit-picker could point out that there may be (hopefully rare) cases of airbag malfunction where one went off without any accident being involved, but I think it’s safe to assume the occurrence of that would be below the margin of error for that ~40% figure.

I hope I have never claimed a ~40% lower fatal accident rate when using Autopilot+AutoSteer; that would be wrong. Hopefully I’ve always been careful to say only a ~40% lower accident rate. A lower fatal accident rate (death rate) is pretty strongly implied by a significantly lower serious accident rate, although it may well not be close to ~40% lower.

Rick Rowling

AEB was released several months before Autopilot, so it’s probably responsible for much of that 40% reduction.

Ideally, I’d like to see a study that compares accident rates under similar conditions. Drivers will tend to use Autosteer under more favourable conditions (i.e. when visibility and traction are good), so that’s a factor to consider.

Somewhat counter-intuitively, I wonder if Autosteer (and similar systems) would be safer if it were less smooth, and “ping-ponged” a bit left and right. That would probably motivate drivers to supervise the system more closely, and would discourage driver misuse.
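A condition-matched study of the kind described above would stratify crashes and miles by driving conditions before comparing rates. A toy sketch (every number invented) showing how pooled rates can flatter a system whose miles skew toward easy conditions:

```python
# Toy stratified comparison. Within each condition the rates are set
# equal, yet the pooled AP rate looks ~37% lower, purely because the
# AP miles concentrate in easy conditions. All data invented.
strata = {
    # condition: (AP crashes, AP miles, manual crashes, manual miles)
    "clear/dry":  (45, 900e6,  50, 1000e6),
    "rain/night": ( 6,  50e6, 120, 1000e6),
}
totals = [0.0, 0.0, 0.0, 0.0]
for cond, row in strata.items():
    c_ap, m_ap, c_man, m_man = row
    print(f"{cond:>10}: AP {c_ap / m_ap * 1e6:.3f} vs "
          f"manual {c_man / m_man * 1e6:.3f} crashes per M miles")
    totals = [t + v for t, v in zip(totals, row)]
c_ap, m_ap, c_man, m_man = totals
print(f"    pooled: AP {c_ap / m_ap * 1e6:.3f} vs "
      f"manual {c_man / m_man * 1e6:.3f} crashes per M miles")
```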

Nix

There is a strong relationship between airbag deployment and fatal accident rates. The NHTSA study only counted accidents severe enough to trigger an airbag deployment, and those are the same types of accidents where the likelihood of death is highest. So while it isn’t a direct relationship, there is a correlation.

NoDistractions

Well done. Head of the class. Gold Star and pat on the back.

Critical thinking and the ability to understand studies and flawed conclusions are missing in today’s education system, but you nailed it.

Anti-Lord Kelvin

If the sample size was 100,000: there were not 100,000 Tesla cars in the US in 2014, and none of them had Autopilot activated (only the hardware, for a very few of them).
I’m surprised, really: more than 100,000 Audi A6s in the US in 2014? Or more than 100,000 Mercedes E-Classes, or specifically BMW 535is?

Nix

IIHS data does NOT isolate vehicle safety. It also includes data on how safe the DRIVER is. There is no way to separate the driver out of IIHS data. Drivers who self-select to own sub-3-second cars are likely to have more aggressive driving habits. Insurance companies spend millions of dollars to figure out which drivers are a lower risk than others who drive the same cars, because they understand the risk variable that the driver represents in these statistics.

But you know this, because it has been pointed out to you REPEATEDLY, and yet you keep repeating the same meme.

Bloggin

This is just a media grab by Consumer Reports once again, because anyone who actually ‘owns’ a Tesla with Autopilot knows exactly how it works and what it requires.

It’s just the general public that has a distorted perception of the technology, mostly based on distorted reporting by ‘media’ that does not know what they are talking about either.

Tom Dually

Well, yeah, some of the surviving Tesla owners do anyway…

Pushmi-Pullyu

Well, yeah, some of the drivers who are alive today, and some who have avoided serious accident and so are still able to drive today, are aware that they owe their continued state of health to Autopilot+AutoSteer saving them from a serious accident.

Others, who were too drunk or sleepy or too busy texting on their phone to notice, may be blissfully unaware that Tesla Autopilot+AutoSteer either saved their life, or kept them from having to spend weeks in a hospital.

Not all of those drivers are Tesla drivers, either. Some of them were in non-Tesla cars which would have been involved in an accident with a Tesla car, if Autopilot+AutoSteer had not taken action to avoid the accident.

https://www.geekwire.com/2015/watch-teslas-new-autopilot-technology-prevents-accident-on-slick-seattle-road/

NoDistractions

Ouch. LOL.

Nix

The number of surviving Tesla owners is higher than if there were no autopilot. Thanks for pointing that out.

TeslaPlease

You’re 100% Right – How Dare People Question the Sound Judgment of Tesla Owners.

Hacking Tesla Autopilot | ORANGE TRICK
Tesla Autopilot Road Trip – Dancing
Tesla Autopilot Road Trip – Playing Cards
Tesla Autopilot Road Trip – Arm Wrestling
Tesla Autopilot Road Trip – Reading
Tesla Autopilot Road Trip – Having Picnic
Tesla Autopilot Road Trip – Workout
Tesla Autopilot Road Trip – Sleeping
Idiot EATS BURGER with TESLA AUTO PILOT in CANYONS

Nix

Stupid people do stupid things with their cars all the time. Just go to YouTube. Are you suggesting that Tesla and Tesla alone must make their cars stupid-proof?

Heck, if this board could avoid stupid posters, you would already be gone. But even this place isn’t stupid-proof. Sorry.

scott franco

A start would be to stop calling it “autopilot”. It is neither “auto” nor “pilot”. “Adaptive cruise control” was perhaps too limited, but “driver assist” says it all.

This is going to end up as the “cell phone” debacle all over again. Outlawed for everyone because there are stupid people in the world. Why not just outlaw stupid people?

Anti-Lord Kelvin

Edmunds’s next article should be “Why do driver-assist systems still fail? Because humans, after one hundred years of driving, are still in beta!”

JJG

I strongly lean toward the last fatality being a suicide.

The driver was experienced with AP. He knew the route well and knew it had issues where the accident happened. He even went out of his way to tell his wife about it beforehand. Yet for some reason he was grossly negligent in controlling his vehicle on this stretch of road?

His family now has a case with a chance at a large settlement. It’s cold and just a theory, but it fits the facts the best.

Pushmi-Pullyu

Or maybe he fell asleep, or was drunk or high, or was otherwise distracted for several critical seconds.

Not only have you jumped to a conclusion far beyond the available evidence, it absolutely is not the most likely scenario. Occam’s Razor does not shave in the direction of suicide.

Did the police check to see if he was using his cell phone at the time of the accident? Do they normally do that as part of a fatal traffic accident investigation? Quite possibly not. It will be interesting to see if the NTSB does that as part of their investigation.

Tom Dually

Cadillac’s SuperCruise is a far more effective system for the real world.

The recent comparison video of the two systems was illuminating. AP often “drives like a drunk” and does occasionally “lurch” around while trying to find a line to follow.

Pushmi-Pullyu

If Cadillac’s semi-self-driving system uses active lidar scanning, as another comment here suggests, then it may well be safer than Tesla’s Autopilot+AutoSteer system. At least, that gives the potential for safer semi-autonomous driving. But then, in the recent accident where an Uber car hit and killed a pedestrian walking her bicycle across the road, the Uber car had active lidar scanning, which didn’t prevent the accident. Active scanning with lidar or high-res radar gives a better tool for self-driving cars. But it’s up to the developer of the self-driving system to use that tool properly… or not!

Tesla’s AutoSteer allows the car to wander somewhat within a lane, and tends to change direction during lane-centering more abruptly than humans do. Similar systems from other automakers, quite possibly including Cadillac, perform smoother, less abrupt lane-centering. But the fact that humans sometimes find the more abrupt lane-centering performed by AutoSteer “lurching”, and thus alarming, isn’t actually an indication that it’s less safe. If Cadillac’s cars have smoother, less abrupt lane-centering, that may make someone feel safer, but that alone doesn’t mean they are safer. If GM really has developed a semi-self-driving system that is safer than Tesla’s Autopilot+AutoSteer, or…
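As a rough illustration of that smooth-versus-abrupt distinction (this is emphatically not Tesla’s or Cadillac’s actual control law, just a toy proportional controller), the “gain” of a lane-centering loop sets how sharply it corrects, and a harsher-feeling correction is a comfort property rather than a safety measurement in itself:

```python
# Toy lane-centering sketch: steering correction proportional to lane offset.
# A higher gain converges faster but feels more abrupt ("lurching").

def simulate(gain: float, offset_m: float = 0.5, dt: float = 0.1, steps: int = 30):
    """Return the lane offset over time under correction ~ -gain * offset."""
    trace = []
    for _ in range(steps):
        offset_m += -gain * offset_m * dt  # proportional correction each step
        trace.append(round(offset_m, 3))
    return trace

smooth = simulate(gain=0.5)  # gentle, slow convergence back to center
abrupt = simulate(gain=3.0)  # fast, sharp convergence back to center
print(smooth[:5])  # roughly [0.475, 0.451, 0.429, 0.407, 0.387]
print(abrupt[:5])  # roughly [0.35, 0.245, 0.171, 0.12, 0.084]
```

Both controllers end up centered; which one is safer depends on how each handles the hard cases, not on which one feels gentler.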
David

Pushmi-Pullyu, exactly how many people have you branded as a troll? It’s the most I’ve ever seen in any forum. Just because someone says something that isn’t glowing about Tesla does not make them a troll. Constant name-calling diminishes the credibility of whatever argument is being attempted.

Another Euro point of view

PuPu is surrounded by armies of trolls and fudsters. They are everywhere, you see… all of them shorting Tesla. I am sure he has a look under his bed before going to sleep to check for Tesla haters 🙂

Will

😂😂😂😂😂😂😂😂😂

David

Btw, Super Cruise does not use lidar. The route was mapped with lidar, but there is no lidar on board.

wavelet

NO ONE at this point uses LIDAR for any driver-assist system on a production car. The units are simply too expensive (currently $50K-$75K for EACH car), and very few are made. Since current models are physical arrays that revolve, they also require non-trivial maintenance/calibration.
Lots of startups are working on reducing the cost (they hope to get it down to a few hundred dollars eventually), but that’ll take quite a few years.
This is why Musk speaks against LIDAR, and why Tesla claims it isn’t necessary. He doesn’t have anything against it in principle, but understands it’s simply not affordable in the near term, and believes/hopes cameras are enough (sonar is only good for very short range, like parking, and radar by itself doesn’t handle RF-permeable materials well). That hope has yet to be proven, and most experts don’t actually think Musk is right on that.
IMO, Tesla’s selling cars that it claims contain HW sufficient for “full self-driving” (SW to be added later) is unethical and borderline fraud.

Bottom line, LIDAR is currently _only_ used either for mapping vehicles (the ones that create the detailed maps) or for R&D vehicles by the various SDC-developing companies. There are very few such vehicles.

TeslaPlease

WRONG……

Audi A8 Uses Lidar
There’s so much tech on the A8, it’s hard to know where to start.

Around the vehicle are a dazzling number of sensors, including long-range radar, four mid-range radars, 360-degree cameras, up to 12 ultrasonic sensors and, for the first time in a production car, LIDAR. That’s a lot of input data, all of it crunched by a central “zFAS” driver assistance controller. There’s also an NVIDIA K1 processing unit to handle the infotainment chores.

Bill Howland

The big mistake the deceased Tesla owners made was thinking that the $5,000 “Autopilot” they paid for was actually anything remotely resembling an autopilot.

More commenters here are coming over to my way of thinking.

Nix

The problem with idiots like you is that you don’t actually understand what autopilot does in an airplane.

If you bothered educating yourself about autopilot, after REPEATEDLY being provided the same education over and over, you would realize that airplane autopilot has all the same restrictions as Tesla autopilot — if not more!!!

https://www.faa.gov/regulations_policies/handbooks_manuals/aviation/advanced_avionics_handbook/media/aah_ch04.pdf

Sadly, no matter how many times you are spoon-fed reality straight from the FAA, you refuse to learn.

Mark W

They should be required to change the name from Autopilot. It’s great that they are developing the technology, but being called Autopilot gives the impression that the car drives itself.

NoDistractions

It doesn’t help that fanboi YouTubers post videos of the system doing miraculous feats, or that fan sites like Electrek promote it as a self-driving system. The media deserves some level of blame for these deaths, just like they deserve blame for the current president of the US.

Nix

Maybe you should first learn the restrictions of autopilot in planes, and then you wouldn’t be confused

https://www.faa.gov/regulations_policies/handbooks_manuals/aviation/advanced_avionics_handbook/media/aah_ch04.pdf

wavelet

The German Transportation Ministry warned Tesla about the name 1.5 years ago and asked them to change it. They even went so far as to write to all Tesla drivers, reminding them that it’s not an autonomous system.

Tesla’s response: laughing it off, in a statement that shows its author doesn’t even speak German.

https://arstechnica.com/cars/2016/10/tesla-must-not-use-the-term-autopilot-germany-says/

przemo_li

BS pure and simple.

CU needs to up its game on statistics.

“Dangerous” Autopilot is still the better choice if it performs better than an unassisted human driver.

There is no “too many accidents” or “too many deaths” threshold that should force turning Autopilot off, as long as those numbers are lower than for human drivers.

Yes, you read that right. Autopilot could cause thousands of deaths and still be the morally correct tool to use, if that number is less than what unassisted drivers would cause.

That’s “safety first”.

So “limit Autopilot now!!!!” is BS.
(The call for more data on the reliability of Autopilot and other assist tools is, however, a good one. There is ZERO value to the public in keeping those numbers proprietary.)
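A back-of-the-envelope sketch of the comparison being made, in Python with hypothetical numbers (the human baseline below is only roughly the order of the oft-cited US average of about one fatality per 100 million vehicle miles):

```python
# Compare expected deaths over the same mileage under two fatality rates.
MILES_DRIVEN = 100e9         # hypothetical fleet mileage: 100 billion miles

human_rate = 1.2 / 100e6     # ~1.2 deaths per 100M miles (illustrative baseline)
assisted_rate = 0.8 / 100e6  # hypothetical rate with driver assistance engaged

human_deaths = human_rate * MILES_DRIVEN        # roughly 1200
assisted_deaths = assisted_rate * MILES_DRIVEN  # roughly 800

# Hundreds of deaths still occur WITH assistance, yet ~400 fewer than baseline:
print(assisted_deaths, human_deaths - assisted_deaths)  # ~800, ~400
```

On these made-up numbers the assisted fleet still kills about 800 people, and is still the safer option; the argument turns entirely on whether the real per-mile rates compare this way, which is exactly why publishing the data matters.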