Lawyers Chime In On Tesla’s Autopilot Crash Cases

JUL 17 2016 BY STEVEN LOVEDAY

Tesla Model S Has Received The Highest Safety Ratings Of Any Car Ever Tested

It seems that lawyers agree that simply warning drivers to take over when Autopilot fails would not hold up in a court of law. The very name “Autopilot” suggests that the car is supposed to drive itself and that hands can come off the wheel at times. Automotive liability lawyer Lynn Shumway explained to Automotive News:

“The moment I saw Tesla calling it Autopilot, I thought it was a bad move. Just by the name, aren’t you telling people not to pay attention?’’

Elon Musk Tweets About PA Model X Crash

Three recent incidents have occurred with Autopilot “reportedly” engaged. Unfortunately, one was fatal.

Tesla has investigated the other two and found that in one incident Autopilot was, in fact, off. Musk went so far as to say that if the system had been on, no accident would have occurred. In the other incident, the technology was being used on a two-lane, non-divided highway, and the driver’s hands remained off the wheel. All of these circumstances violate Tesla’s guidelines for safe operation of the system.

The latter is the root of the problem. People will not always follow the rules and will continue to use such technology beyond its means. It is difficult to put something mind-blowing and potentially life-altering in someone’s grasp and then tell them to limit its use, to use it with extreme care, and not to test its limits. Auto lawyer Tab Turner said:

“There’s a concept in the legal profession called an attractive nuisance. These devices are much that way right now. They’re all trying to sell them as a wave of the future, but putting in fine print, ‘Don’t do anything but monitor it.’ It’s a dangerous concept. Warnings alone are never the answer to a design problem.”

People are lazy, and most of us “know everything”. Some people purchase these vehicles with the Autopilot technology as a deciding factor. They aren’t going to buy the car and then not use the feature, or use it rarely and with extreme caution; they might as well have opted for a different car. Regardless of any warnings, news, updates, or restrictions, people will continue to be people. Think of all of the product warnings that people ignore on a daily basis, rarely with any consequence.

If a case such as the Tesla Model S fatality were to go to court, Tesla could insist that drivers were warned and that, in the end, the driver is responsible. However, lawyers need only find an issue with the technology. If it can be proven that the system is defective, could have worked better, or may have caused the accident, Tesla or any other company will have no leg to stand on.

Regarding the fatal accident in Florida, Tesla reported that the sensors failed to see the white trailer against the bright sky. Lawyers could argue that it surely should have. Steve Van Gaasbeck, an auto products lawyer in San Antonio, Texas, commented:

“It’s great technology, I hope they get this right and put people like us out of business. There’s really no excuse for missing an 18-wheeler.’’

Lawyers for accident victims and families could simply aim to prove that the system’s name is misleading and that it doesn’t do as much as possible to explain and remind drivers of its limits, or that it doesn’t check up on the driver’s level of engagement often enough or correctly.

The National Highway Traffic Safety Administration set out a five-level scale for vehicle automation in 2013. It ranges from Level 0 (no automation) to Level 4 (fully self-driving, with no human interaction).

Tesla’s vehicles are in the Level 2 to Level 3 range. It makes a big difference to lawyers and lawmakers what level a vehicle is or claims to be and what the expectations are of the vehicle as well as what is expected of the human driver.

Currently there are no set industry standards or U.S. government guidelines for autonomous vehicles. The process of establishing such rules has begun, and guidelines are expected to be released soon.

Source: Autonews

Categories: Crashed EVs, Tesla



100 Comments on "Lawyers Chime In On Tesla’s Autopilot Crash Cases"


I knew the slimeball “personal injury” lawyers (better known as ambulance chasers) would be all over Autopilot/AutoSteer, and the analogy of “attractive nuisance” did occur to me. Remember, these are the same sleazy lawyers who established a legal precedent that a backyard swimming pool is an “attractive nuisance”, so that if somebody trespasses onto your property and uses your pool without permission, if they drown or otherwise injure themselves, then it’s your fault!

Personal injury lawyers: Establishing that people aren’t responsible for their own actions, and therefore should be treated like children, since at least 1988 (Gregory Roach & Gordon Faulkner v. Para-Chem).

* * * * *

I’ve been commenting for some time now that Tesla shouldn’t have named its driver assist features “Autopilot”, for the very reason given in this article. Maybe it’s just me, but when I see personal injury lawyers making the same argument, I suddenly want to argue the other side!

* * * * *

What’s the difference between a catfish and a personal injury lawyer? One is a slimy, bottom-feeding scum sucker… and the other is a fish.

The first thing we do, let’s kill all the lawyers.

… slowly.

Years ago I read on the internet that America has more lawyers than the rest of the world combined…
Not sure if it is true but it would be far from shocking…

Sorry Matlock, but the attractive nuisance doctrine applies to children, and was developed under common law during the Middle Ages in England. Knowing nothing about a particular subject never stops you from posting your “expert opinion” on the matter. 🙁 The attractive nuisance doctrine “states that a landowner may be held liable for injuries to children trespassing on the land if the injury is caused by an object on the land that is likely to attract children. The doctrine is designed to protect children who are unable to appreciate the risk posed by the object, by imposing a liability on the landowner.” https://en.wikipedia.org/wiki/Attractive_nuisance_doctrine It’s really not asking too much to require that landowners put a gate around a pool to keep a small child from wandering over to the pool and drowning. Likewise, it’s not asking too much for landowners to remove the doors off junk/discarded refrigerators on their property or put out for trash collection, in order to prevent neighborhood children playing hide and seek from suffocating to death. You also shouldn’t paint personal injury lawyers with such a broad stroke. Many people suffer horrific injuries that might include a permanent disability due to other people’s negligence. You’d have… Read more »

it does seem odd for an attorney to invoke an “attractive nuisance” cause of action when the subject is an adult (well, a minor could drive a car, but none of the accidents involved people who were chronological minors). maybe he is suggesting that tesla owners are like children…

Technically, the lawyer didn’t say the attractive nuisance doctrine applied in these situations, just that the underlying reasoning for the doctrine is analogous to protecting adults from the lure of misusing this new auto-steering tech. The adult Tesla drivers are unable to appreciate the risk posed by using Autopilot in situations for which it was not designed (hands-free, or on undivided highways).

Lawyers are always trying to expand legal doctrines to cover new or different facts, situations, and technologies. For instance, not long ago lawyers expanded the doctrine of trespass to cover hacking (unauthorized access) into a computer or computer network via landline, broadband, or WiFi.

“trespass” is not limited to trespass to land. there is also an action called “trespass to chattels”, so the cases that you cited, would fall under that category (unless you are suggesting that the actions were brought under a “trespass to land” legal theory).

when it comes to “attractive nuisance”, what cause of action could one bring that is analogous to attractive nuisance as applied to children. the reason why i thought the comment odd is because i can’t imagine any cause of action that you could bring that involved an adult that was even analogous to “attractive nuisance”.

IIRC, unauthorized use of someone else’s WiFi to access the internet fell under trespass to chattels, while hacking into a computer via an internet connection to access files (but not copy or take files) on the hard drive fell under trespass to land. In the past, trespass to land always required that a person or object enter the land. A hacker didn’t physically enter the land to access the computer files, but the courts deemed accessing computer files via an internet connection to be entering the land, a legal fiction, in order to meet the elements required to constitute a trespass to land. If the hacker took a copy of the files, then a theft occurred in addition to the trespass.

Statutes have since been enacted to codify the courts’ expansion of the trespass doctrine to encompass computer hacking.

trespass to land requires invasion of real property. since the files would reside on a physical drive, i suppose the “real” property would constitute the physical drive and the access to the physical drive to retrieve files and/or file content would constitute an invasion of the “real” property.

but by that reasoning, access to someone else’s wifi device would also constitute trespass to land because they are invading your physical wifi device.

trespass to chattels only requires interference with a person’s right of possession. so if someone invades your computer to access files, it is conceivably an interference with your right of possession since it interferes with your possessory right to exclude others from accessing the content.

as to “copying”, when you access the files, you are at least making a local copy of the content being viewed even if you aren’t copying the entire file. by that reasoning, access and “theft” should go together.

You don’t need to be a lawyer. You just need to be a Human Factors engineer to realize that the Tesla approach to autonomy is pretty s***ty. Here is a good video that illustrates why good, intuitive design is not just for children.

Along those lines, it is too bad that the yahoos in Silicon Valley didn’t simply drive up the coast to Washington state and talk to the guys at Boeing. There was a serious incident a couple of decades ago when an airplane engine failed and the autopilot had reconfigured the rudder, etc. so that it was flying on one engine. It did so without informing the pilots. So, when they took autopilot off for landing, the plane took a death spiral until they finally figured out what was happening and corrected for it.

Human Factors engineering is serious stuff. It’s not an afterthought, especially when your life is literally on the line.

Dan, Nix below states facts that any correctly trained and experienced pilot will know. But since they stopped making the cockpit accessible, even on longer flights, fewer people realize there is always a plan to have at least one pilot alert and closely monitoring the flight and how the autopilot is doing, so as to take over should the AP fail or the other pilot hit the restroom or become sick. There is only one driver’s seat in a car, not two as there generally are in an aircraft, and still there have been in-flight and ground-contact incidents, even fatalities! (Has anyone told Boeing, Airbus, or anyone else publicly that they need to remove the AP from the flight package on board their aircraft?)

AP is too confusing, limited and dangerous right now to be beta tested on our roadways. Consumer Watchdog is asking for support to ensure that self driving cars are fully tested and manufacturers take full responsibility before these cars drive on public roads. Supportive people can write to the president and voice their concern in the link below, asking for a methodical deployment of such cars. Read more here:
http://www.consumerwatchdog.org/robotcar

Dr. ValueSeeker said:

“AP is too confusing, limited and dangerous right now to be beta tested on our roadways.”

Personally, I’d want to know how many lives Autopilot/AutoSteer has saved before I called for a ban on its use.

“Consumer Watchdog is asking for support to ensure that self driving cars are fully tested…”

I certainly agree that there should be some independent, third-party testing of semi-autonomous driving systems — not just Tesla’s — but calling for the systems to be “fully” tested is pretty naive. Once you pass a certain level of complexity, of multiple variables, then it becomes impossible to test all the “edge cases” or “corner cases”.

Let us say, rather, that there should be a reasonable amount of independent third-party testing, with the understanding that it will always be impossible to test every possible case, and also with the understanding that Tesla’s system is evolving, so will need to be re-tested periodically.

A familiar refrain…

“Cruise control is too confusing, limited and dangerous right now to be beta tested on our roadways.”

“Anti-lock brakes are too confusing, limited and dangerous right now to be beta tested on our roadways.”

“Air bags are too confusing, limited and dangerous right now to be beta tested on our roadways.”

“Seatbelts are too confusing, limited and dangerous right now to be beta tested on our roadways.”

“Automatic transmissions are too confusing, limited and dangerous right now to be beta tested on our roadways.”

“Horseless carriages are too confusing, limited and dangerous right now to be beta tested on our roadways.”

“Sorry Matlock, but the attractive nuisance doctrine applies to children, and was developed under common law during the Middle Ages in England.” Since you obviously didn’t bother reading the article you’re supposedly commenting on, let me help you out here with a quote from the above: “Auto Lawyer, Tab Turner, said: ‘There’s a concept in the legal profession called an attractive nuisance. These devices are much that way right now’.” I’ll just step out of the way and let you argue with an actual “Auto Lawyer”, since you apparently think you know his profession better than he does. “Knowing nothing about a particular subject never stops you from posting your ‘expert opinion’ on the matter.” Good luck, dude, in finding any place where I have ever claimed to be an “expert” on any subject. Here’s a hint: One doesn’t need to be an expert, or anywhere close to that, to know more than you do about many subjects. Do I occasionally not know what I’m talking about when I post? Yeah, occasionally. But I never, ever, post anything which I know not to be true… which is something that has become a habit for you. And when someone points out… Read more »

the name “autopilot” is not the biggest problem here. you had better believe that every tesla marketing blurb, and every elon musk tweet, that promoted the autopilot feature is going to find its way into a litigation as evidence that tesla was negligent, or even worse, reckless, in its product deployment.

So, just to be clear, YOU have “long” argued EXACTLY THE SAME as the “slimeball” lawyers who are “all over” Tesla..?

A more biased commenter would be difficult to find.

A less mature commenter would be difficult to find, among those posting to InsideEVs. Even more difficult would be to find someone with a more shallow understanding of what he reads.

tort law, while maybe well intentioned, too often strays into loony tunes. it is difficult to get the law right on this because if you make it too difficult for people to sue for damages, corporations will take advantage of it to maximize profits at the expense of the public. to that extent, it is better to accept a few loony tunes judgments than to scrap the system of tort law altogether.

but in the case of tesla, i have noted that tesla has potentially substantial liability with this autopilot feature. you don’t have to resort to loony tunes law to find tesla liable.

What determines the liability? Do you not understand the feature set? From my understanding, the “Autopilot” terminology as employed by Tesla incorporates a SET of features, one of which is called “autosteer”. Although MANY are mistakenly calling the ‘autosteer’ function of the suite of features “autopilot”, Tesla makes no such errors in interpretation and goes to great lengths in its explanations of how the feature set operates, to ensure that others understand its operation and limitations. There’s not much else they can do.

Tesla CANNOT be expected to account for people’s laziness in not wanting to read or ask questions! If the public media mischaracterizes the feature set as a single entity, then I can understand the spread of that mischaracterization on these forums, because many won’t bother to find out about how it actually works. But by not doing so, it turns any cogent attempt at debate into ‘windmill tilting’.

“…it is better to accept a few loony tunes judgments than to scrap the system of tort law altogether.”

I agree that we shouldn’t throw out the entire barrel because there were just a few bad apples in it. But in this case, there is an obviously better solution: To adopt the sort of tort system they use in British Commonwealth countries, where they don’t have this problem with frivolous lawsuits and obscenely high punitive damages in cases where the person suing was very clearly at fault.

We certainly would long ago have had major tort reform in the USA if it wasn’t for the fact that lawyers write the laws.

Agreed – and, a number of anti-tort people’s favorite examples are actually legitimate, like the woman who sued McD’s over getting 3rd-degree burns due to spilling the coffee on herself. McD’s had no business selling coffee so hot that it could cause bodily harm.

And, the fact that the lid said, “caution, contents hot” wasn’t good enough, because nobody expects that it would be hot enough to cause bodily harm. There was no warning against the potential for personal injury.

I can see this sort of logic play out with AP: people know cars are dangerous, and that driving is no laughing matter. However, simply warning people to stay alert and keep your hands on the wheel WHILE THE CAR IS SUPPOSED TO BE STEERING ITSELF very possibly will not be considered a reasonably sufficient warning, based on whatever the legal threshold is.

As I have commented elsewhere, there is no point to autosteer if you’re supposed to keep your hands on the wheel. Utterly idiotic.

“Lawyers chime in…”, I stopped reading right there. Who f-ing cares.

Maybe because they are the experts?

The name of the feature the users enable with a button is clearly AutoSteer. More importantly, *each* time you turn it on (a double pull of the lever) a message tells the driver to keep their hands on the wheel. The driver can choose to ignore that. That is the driver’s choice, just like looking at their phone instead of focusing on driving.

[imgur screenshot]

What you agree to when you enable AutoSteer:
“… you need to maintain control and responsibility …”

[imgur screenshot]

Hey, thanks for linking to a screen shot (or perhaps that’s a simulation of one) of AutoSteer opt-in warning screen, scottf200! I’ve looked for such a screen shot but hadn’t found one.

The warning screen says, in part:

“Similar to the autopilot function in airplanes, you need to maintain control and responsibility for your vehicle while enjoying the convenience of Autosteer.”

That rather undercuts the argument that the very term “Autopilot” suggests that you can take your attention off the road, doesn’t it?

You are welcome. It’s crazy to me how people want to blame the car if the driver takes their hands off the wheel when they are clearly told to keep them there every time autosteer is engaged.

Here is the most current text from my Model X a few minutes ago.

[imgur screenshot]

If Tesla “clearly tells” drivers to “always keep their hands on the wheel” every time autosteer is engaged, then why was there such an uproar when Consumer Reports said the following:

“Tesla should disable automatic steering in its cars until it updates the program to verify that the driver’s hands are on the wheel”?

Wouldn’t software updated as Consumer Report suggests just enforce what Tesla “clearly tells” drivers to do when Autosteer is engaged: always keep their hands on the wheel?

Scottf200, as a Tesla owner do you think Tesla should update the Autosteer software to verify that the driver’s hands are on the wheel at all times, as opposed to allowing drivers to keep their hands off the wheel for over two minutes as presently allowed by the current software?

I don’t think it is a simple undertaking to make sure the driver’s hands are always on the steering wheel. There are different systems and methods for this. Some require grip and/or tension (“pulling”) on the wheel. Some require grip in multiple places on the wheel. I’m not sure if Tesla is timing it or it is dynamic (i.e., based on curves in the road, how faded the lines are, or whether the user frequently isn’t giving enough pressure), or a combination of factors. In my case, I drive with one hand sometimes (road trips) and most often with both hands on the steering wheel … and I *still* get the initial/softer reminder message that pops up. I think it is because I lightly hold the wheel, enough to get immediate feel/feedback, but don’t always provide enough tension. Still, I have been looking at the radio or scenery and *felt* the steering wheel doing something that my peripheral vision and experience told me did not feel right, so I’ve responded very very quickly. If you have your hands on your lap you only have a visual reference … assuming you are not distracted by the radio or your phone. That is why… Read more »
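The escalating reminder behavior this commenter describes (a softer visual nag first, then stronger alerts the longer steering torque goes undetected) could be sketched roughly as follows. All thresholds, stage names, and the timing model here are illustrative assumptions, not Tesla’s actual parameters or implementation:

```python
# Hypothetical sketch of escalating hands-on-wheel reminders.
# All thresholds and stage names are illustrative assumptions,
# not Tesla's actual parameters or behavior.

SOFT_REMINDER_S = 15.0   # visual "hold the steering wheel" message
CHIME_S = 30.0           # audible chimes added to the visual nag
DISENGAGE_S = 60.0       # system escalates further / hands back control

def reminder_stage(seconds_since_torque: float) -> str:
    """Map time since steering torque was last detected to a warning stage."""
    if seconds_since_torque >= DISENGAGE_S:
        return "disengage"
    if seconds_since_torque >= CHIME_S:
        return "chime"
    if seconds_since_torque >= SOFT_REMINDER_S:
        return "soft_reminder"
    return "none"

# Example: hands off for 20 seconds triggers only the soft visual reminder.
print(reminder_stage(20.0))  # soft_reminder
```

A real system would presumably make the thresholds dynamic (tightening them on curves or where lane markings are faded, as the commenter speculates), but the fixed constants above are enough to show the staged-escalation idea.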

Thanks for the response.

I recall what you, and others, had said when I got my X a few weeks ago. I was actually bum doped by the delivery specialist who told me there were sensors in the steering wheel. It rapidly became clear that the steering wheel wants a deliberate counter action to how it is acting. Just leaving your hand on the steering wheel does nothing and, in fact, becomes MORE disturbing after it dings at you that you didn’t have your hand on the wheel. When I learned to drive, there was an acronym where one of the letters was ‘K’ for ‘Keep your eyes moving’. Now, one of the places my eyes move to is the dash screen telling me when it is time to use my hand to slightly counter the AP. Thusly, my hands tend to be next to the steering wheel (on my knees), versus on it…and when my eyes see some questionable road paint, they hover over the wheel waiting for it to mess up (but it is usually fine). I keep making references to my marine autopilot…and for this reason, I think the ambulance chasers are washed up. No one is suing marine autopilot manufacturers… Read more »

flmark, you and scottf200 deserve a round of applause from everyone reading this discussion, for contributing actual facts highly pertinent to the conversation. Thanks to both of you for taking the time to post, and for keeping the debate honest!

+1, clapping.

sven asked:

“…why was there such an uproar when Consumer Reports said the following:

“[quote]Tesla should disable automatic steering in its cars until it updates the program to verify that the driver’s hands are on the wheel[unquote]?”

Well anyone can read the comments for themselves and come to their own conclusion. But it seems to me that the overwhelming majority of posts took Consumer Reports to task for presuming to tell Tesla Motors that it knew Tesla’s business better than Tesla does. CR made the assumption — clearly in absence of any real evidence — that, with the current state of the software, drivers are safer with Autopilot/AutoSteer off than with it on.

If Tesla’s claims are right, if even in this relatively primitive stage of Autopilot/AutoSteer development, drivers are safer using it than not using it, then CR’s call for a moratorium may actually cost lives… not save them.

There is no evidence for this at all:

“with the current state of the software, drivers are safer with Autopilot/AutoSteer off than with it on.”

Tesla claims it to be true, but they compare Autopilot miles driven to the total miles driven by other cars. Since Autopilot is only to be used in the safest and least challenging conditions, it is a silly comparison.

Similar to the autopilot feature in Airplane?

So if it deflates you have to take your eyes off the road to blow on it’s tube at the beltline?

Other automakers don’t provide such a choice. They require you to keep your hands on the wheel or lane keep assist will not work.
Tesla knowingly marketed its system as almost autonomous; as far as I heard, Musk himself reposted “hands free” driving videos from the person who died afterwards. I can’t really blame the person (RIP) who died in FL; he was brainwashed by the Tesla propaganda machine into believing his car could drive on its own and that all these warnings were just a legal formality. I don’t think he would have left the car to drive on its own without looking at the road if he had been aware that it can’t detect cross traffic, as was revealed later by MobilEye.

Message that pops up if it does not detect your hands on the wheel

[imgur screenshot]

Message and chimes that happen if you ignore

[imgur screenshot]

It doesn’t take a lawyer to spot Autopilot as a bad idea, but this quote pretty much sums it up: “Warnings alone are never the answer to a design problem.” Autonomous driving is a dangerous technology when it only intermittently demands the driver’s attention. Even a careful driver who keeps his hands on the wheel and tries to watch the road is at risk, because the human mind doesn’t easily shift focus in and out of tasks. Attention wanders while the car drives itself; if immediate attention is required, the driver’s response time is delayed because he must first refocus on the situation before reflexive action is taken. Think about driving in busy traffic. The mind not only sees what is happening at the moment but keeps track of cars in adjacent lanes, cars that have moved into one’s blind spots, etc. It’s a continuous process, and one cannot pick it up midstream and be nearly as effective as if one had maintained continuous focus. Assisted driving is different since it still requires constant attention. I suspect we’ll see more and more assisted driving tech introduced over the next 20 years while autonomous driving remains a research goal. Then maybe… Read more »

That’s because people are mis-using autopilot. It isn’t supposed to do all the legwork for you so you can sit back and become unfocused. It’s supposed to mitigate various factors of driving, freeing up mental capacity to pay more attention to OTHER factors.

Ever noticed how when you’re a passenger in the front seat you often spot developing hazards and notice road signs and so on sooner than the actual driver does? This is because a portion of the driver’s brain power is being used to drive the vehicle, stay in lane, etc. As a passenger your brain power is 100% available for simply observing.

With autopilot it’s a similar principle. You’re still behind the wheel ready to take control, but you’re offloading some of the more trivial yet nonetheless mentally fatiguing tasks and freeing up more brainpower for observation.

Of course, people are misusing it entirely and thinking it’s okay to just sit back and do nothing, stop reading the road, stop paying attention. This is clearly foolhardy.

“Ever noticed how when you’re a passenger in the front seat you often spot developing hazards and notice road signs and so on sooner than the actual driver does?”

No. My passengers usually spot scenery before I do.

You’re mixing your metaphors, partner. Tesla’s ‘autopilot’ IS a driver assistance suite of features, NOT autonomous driving. Tesla goes to great lengths to make that clear, IF one bothers to check.

If it’s only driver assistance, then why can the driver take his hands off the wheel and take a nap while the car drives itself?

Think about it, partner.

Jacked Beanstalk said:

“Even a careful driver who keeps his hands on the wheel and tries to watch the road is at risk because the human mind doesn’t easily focus in and out of tasks… the driver’s response time is delayed because he must first refocus on the situation before reflexive action is taken.”

I think you have summed up pretty well the additional risk of accident with using AutoSteer. But you are completely ignoring the fact that in other ways, the risk is reduced. Perhaps it’s reduced a bit; perhaps it’s reduced a lot. I don’t know… and neither do you.

Again, the pertinent question is this: Is a Tesla car driver safer with it on, or with it off? We can argue about it until doomsday, but since we don’t have statistically meaningful data relevant to the question, all our opinions are just guesswork.

What we do know is that so far, the accidents caused by Autopilot would not have happened with an attentive driver.

Autopilot is Level 2. Simple as that. The driver can never cede control of safety critical functions. The driver is always in control.

The Google car is Level 3.

Google abandoned Level 3 and is developing Level 4 (full autonomy) as well as “Level 5” (SAE definition, no steering wheel or brakes).

Level 3 definition is a bit fuzzy about the hand-off from autonomous to driver control. This hand-off is where a lot of problems occur (e.g. Tesla wreck in PA, stop-and-go traffic incident in CA).

Maybe there’s a technical solution, … similar to this:

Café Amazon : “Drive Awake” application

… and I’d add that the awful squawking sounds like a lawyer… or some kind of vulture.

News flash: Lawyers suggest suing.

“There’s really no excuse for missing an 18-wheeler.” – said the non-engineer who doesn’t understand the technology.

Last I checked, humans run into the sides of buildings, so I guess we should ban all humans from driving, right?

Not all human drivers perform identically. Unless it was malfunctioning or there was a hardware failure – which isn’t indicated – there’s every reason to expect every Tesla to perform just as badly, missing an 18-wheeler.

In any case, that isn’t really the core question. It’s whether human beings are actually aided and safer with AP than without. And that’s an open question for anyone who needs EVIDENCE to decide, as opposed to Mr. Musk’s assurances.

Seems like this data would be available. How many accidents are typical in 1 million no AP miles vs. the AP miles? I’m guessing the AP miles will win.

here’s how it works:

when a human driver runs into an 18-wheeler, you sue the human driver;

when a company sells a feature that purports to prevent your car from running into the 18-wheeler, and it doesn’t do that, then you sue the company that sold the feature.

of course, when it comes to determining degrees of responsibility, you look at the knowledge and skill of the driver. the greater the driver’s knowledge and skill, the company’s liability is potentially lessened. so the company might face less liability if the driver was a professional test driver with great knowledge of the feature because he would be expected to have known better. on the other hand, if the driver is just some guy who plunked down a few thousand dollars to purchase the feature and he knows little about how it works, then the company’s liability is probably going to be greater.

I don’t recall Tesla promising that.

no comment said:

“when a company sells a feature that purports to prevent your car from running into the 18-wheeler, and it doesn’t do that, then you sue the company that sold the feature.”

We’ll keep that in mind when some auto maker promises that its driver assist (or semi-autonomous controls) will keep you from having an accident.

I think even a lawyer would have a hard time coming up with a perverse enough interpretation of what Tesla describes as the purpose of AutoSteer, to justify that claim.

Once again, how does the last part of Tesla’s warning screen to enable AutoSteer (Beta) read?

“…you need to maintain control and responsibility for your vehicle while enjoying the convenience of Autosteer.”

“Do you want to enable Autosteer while it is in Beta?”

Nope, nothing there at all suggesting, let alone guaranteeing, that it will prevent all accidents.

When do the torches and pitchforks come out?

People are stupid, and when they do something wrong they look to blame others. Like the idiot in the PA accident who lied. Other than hitting these fools with a hammer to get their attention, does anyone have any other ideas? I feel that before Tesla sells Autopilot, the user must watch an instructional video and pass a test. If they fail, no Autopilot for them.

I don’t think he “lied” as autopilot had turned off while he was half asleep. Not saying he wasn’t acting stupid/dangerous, but I don’t think he intentionally lied. In his mind AP was still enabled.

I don’t think he was sleeping, since he had his son-in-law as a passenger and they were probably shooting the breeze. But I guess it’s possible they were both sleeping. What might have happened is that the driver mistakenly thought placing his hands on the steering wheel (11 seconds before impact) shut off the nag alerts and that Autopilot was still engaged. This scenario is described in the following Reddit post by ebob5030 (Model S 85):

“There is an interesting nuance in the way AP works that could help explain what happened here. When you respond to the nags, you need to hold the steering wheel (by applying a slight turning pressure) to tell AP that you are awake. It then stops nagging and resumes normal AP operation. But if you turn the wheel with just a bit more force, it disengages the auto-steer. When you hear the beeping/nagging, it is easy to overreact and turn the wheel enough to disengage without realizing what you just did. It has happened to me and it took a moment to realize that auto-steer is now off. Fortunately I realized it before the car drifted out of the lane.”
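The nuance that Reddit post describes – light wheel torque acknowledges the nag, slightly more torque disengages auto-steer – amounts to a two-threshold check. A rough sketch under invented assumptions (the threshold values, units, and function names are hypothetical; Tesla’s actual firmware logic is not public):

```python
# Sketch of the two-threshold wheel-torque behavior described above.
# Both threshold values are invented for illustration only.

ACK_TORQUE = 0.5        # hypothetical: enough torque to acknowledge the nag
DISENGAGE_TORQUE = 1.5  # hypothetical: enough torque to take manual control

def handle_wheel_torque(torque: float, nagging: bool) -> str:
    """Return the system's response to the driver's wheel torque."""
    if torque >= DISENGAGE_TORQUE:
        # The driver turned the wheel hard enough to take over.
        return "autosteer disengaged"
    if nagging and torque >= ACK_TORQUE:
        # Light pressure during a nag: driver is awake, keep steering.
        return "nag acknowledged, autosteer active"
    return "autosteer active"

# The failure mode described above: overreacting to the nag beeps
# crosses the higher threshold and silently disengages auto-steer.
print(handle_wheel_torque(2.0, nagging=True))  # → autosteer disengaged
print(handle_wheel_torque(0.7, nagging=True))  # → nag acknowledged, autosteer active
```

The design problem the commenters are circling is that the gap between the two thresholds is exactly where a startled driver’s reaction lands.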

… “realized it before the car drifted out of lane”

Really. What difference does it make knowing auto-steer was off? Aren’t these people claiming to be monitoring everything all the time ANYWAY?

I’ve yet to hear anyone who defends this silly “car that tries to drive itself, though incompetently” technology give a convincing explanation of what the point is with such a thing. It seems blindingly obvious to me that it must be MORE work to monitor surroundings and traffic AND the autopilot itself than to drive regularly. I read some claims that AP “frees you to monitor OTHER things”, but that directly contradicts the instruction to continue to take full responsibility of every aspect of driving. For example, you simply CANNOT ignore position in the lane or distance to the car in front. I would like to hear specifically what it is you can supposedly stop focusing on with AP engaged that you do need to focus on with it disengaged.

You have to start with easy tasks, then complex ones, to appreciate AP. I may not generally move my hands much more than 4 inches from the steering wheel when the car is at highway speed, but in stop-and-go highway traffic under 15 mph, I’ll usually turn away or use my hands to grab something in the car. At slow speeds, the freedom is very good/useful and predictable.

One thing I sympathize with, in the PA wreck, is re-adapting to driving after AP has been doing the work. The guy was only 11 seconds removed from AP being engaged. I don’t blame AP, then, but I have noticed my skill and wits are not the same in those first few seconds of resuming manual driving as when I’ve been driving manually for a long time. AP can do narrow lanes, construction, etc. It’s very good, but when YOU suddenly take over near 55 mph through that type of thing, the anxiety level is higher. So, I wonder if the “human” element in this sweet spot of time, after AP, won’t be the source of more trouble.

“There is an interesting nuance in the way AP works that could help explain what happened here.”

The fact that the car had to give the (non-)driver a series of successively louder and more insistent warnings that he needed to take the wheel, all of which were ignored, so that the car went into its “pull over and stop safely” routine before the driver finally took the wheel, does not suggest to me he was merely talking to his passenger. It suggests that he was asleep, drunk, or high.

We can of course spin a long series of increasingly less likely scenarios, but I suggest we follow Occam’s razor and go for the most likely explanation, rather than the least likely.

maybe they saw the youtube video in which the driver was driving a tesla while “asleep” and they thought that it seemed like such a good idea that they would do the same.

as i’ve stated before, many tesla drivers seem to have more dollars than sense…

I have absolutely have NO problem with the name of the feature suite. If I see a manufacturer offer some feature in an automobile, you better believe I’m going to find out what the hell it does! “People are lazy and most of us know everything”, is about the sorriest cop-out one could come up with.
If you don’t know or understand what a company’s proprietary name for a feature means, then ASK!

Everyone that uses Autopilot needs to know that it is an enhanced cruise control, not a level 3 autonomous car.

Just like regular cruise control, autopilot cannot detect stationary objects in the roadway or cross traffic. The driver must be alert and monitor the autopilot, just like airline pilots must do.

Monitoring is not as fatiguing as manually steering, just like cruise control is less fatiguing than manually driving. This adds value and safety.

Autopilot can assist drivers to drive safer than they can by themselves. Just don’t think it is supposed to be autonomous driving, instead of the driving aid that it is.

GSP

all of this came about because of tesla’s undisciplined launch of the newest, coolest “toy”. then, elon musk allowed his gadgeteering orientation to over-hype the thing.

as i’ve stated before, i can’t believe that the tesla legal department didn’t see this coming, because if they didn’t, they should get summarily fired. i’ve got to believe that elon musk overruled their advice in the interest of maintaining a “cutting edge” image for tesla. the result has been a reckless deployment of a feature that should have stayed in controlled testing.

as i’ve also stated, gm would have *never* done something like this…

Predictably, the Tesla fanbois are upset by the fact that multiple lawyers consider Tesla actually liable for Autopilot if it can be shown to have caused an accident. I wonder if these people are also upset that people are liable when they cause accidents.

I have long been a big Tesla fan, but almost every day now I doubt more and more whether I really want to be associated with the cult. Because many of the fanbois, and at times our high priest Elon himself, make it look like just that – a mindless sect.

Sorry, but when nearly EVERY article on InsideEVs is accompanied by twice as much text authored by the same intensely biased Tesla-fan it is a little bit exhausting.

I hereby authorize you to “sleep through” the next 10 Tesla articles.

You are exhausted and should give it a rest.

The 11th Tesla article will probably be published early tomorrow morning. That’s not much of a rest that you’re giving him! 😉

Ha, tru-dat.

electric-car-insider.com

I find it rather odd that Tesla’s disclaimer screen tells drivers that Tesla AP operators have a responsibility “similar to autopilot in airplanes…”

How many automobile drivers know anything about how autopilot in an airplane works?

I don’t know how many do, but anyone who researches it will soon realize that Tesla’s autopilot does for an automobile driver almost the same thing as what an aircraft autopilot does for a pilot.

They are very similar in a lot of ways, and require the same attentive oversight from the operator as well.

How much training each “operator” receives? …. Well that is a different matter.

ECI said:
“How many automobile drivers know anything about how autopilot in an airplane works?”

Absolutely everything I know about how autopilot in airplane works comes from the movies. 😀

I’ve really got to stop reading the comments, as it is clear that no one is changing their minds from preconceived notions.

I continue to add that anyone who wants to make a conclusion should drive a LONG trip with AND without AP and decide for yourself which makes you a safer driver. I have been doing the same 1400 mile trip for several years between my two homes. The fatigue that accumulates (I do this trip in 2 days if I am not towing) is RADICALLY reduced when you don’t have to manage lane keeping CONSTANTLY. I will be in charge in construction zones and the like, but JUST like cruise control, this is a handy tool to pass off drudgery to.

100% agree. I generally don’t like long days of driving, but on my 3400-mile round trip from IL to MT, where the Model X drove 95% of the time, I could much more easily handle the drive and enjoy the scenery (as well as look out for antelope!). I came home faster than going there, and my last two days were 540 miles each. On the way out it was more like 350-450 miles a day, which were a breeze with autosteer/pilot.

Perhaps a long trip on AP would change my mind, but I don’t get this at all.

Cruise control lets me rest my foot in a comfortable place instead of having to constantly apply pressure on the accelerator. That’s a physical benefit. Cruise also lets me keep my eyes on the road instead of repeatedly refocusing on the speedometer and adjusting my speed to avoid flashing light syndrome.

Does autosteer let me rest my arms? Does it let me enjoy the scenery? Well, yes, if I violate Tesla’s instructions. And those are very real benefits. But they’re “illegal”.

maybe the reason why your points are not being responded to is that you aren’t really stating much of significance.

the state of the autopilot feature is that you really have to use it under constant monitoring, such that you can be ready to take manual control at an instant’s notice. as “terawatt” stated in another article, that isn’t exactly a “relaxing” driving experience. so if you are intending to use the beta-test autopilot feature to relieve fatigue, you need to pull off the road instead.

Have you driven a Tesla on autopilot?

I think it is extremely clear based on his comments that ‘no comment’ has not driven with Tesla autosteer/pilot, and if they have, it was not for any length of time. It is very easy to get used to and understand its intended operation. Adaptive (traffic-aware) cruise control is nice, but in combination with AutoSteer it is fantastic for highway driving.

Yes, and it was you, along with a couple others who are the immovable objects. Your first sentence says all that needs to be known about WHY you are an immovable object. Everything else is irrelevant about your opinion and your values. You write like a complete narcissist, one whose opinion is much more important than anyone else’s…and oh, no one else has anything important to say anyway. Don’t pay attention to any of their points because you’ll act like no points were made.

Don’t worry, I will not be engaging you directly again. One of my favorite admonitions- ‘Don’t try to teach a pig to sing; it wastes your time and annoys the pig.’

God, the condescension and arrogance found in that one sentence is just amazing. Barf.

that attempt to analogize cars to planes and boats is off the mark because planes and boats are so different from cars. for example, there are relatively few boats out there, and the typical person does not have to worry about being run over by someone operating their boat recklessly or while drunk.

tesla really should have limited autopilot to professional drivers or a limited set of specifically trained drivers. you can call that being an “immovable object”, but there is a world out there, most of whom are not ev-enthusiasts. litigation attorneys, congress and consumer reports are not viewing the autopilot feature through the prism of the ev-enthusiast.

“no comment” commented:

“maybe the reason why you points are not being responded is that you aren’t really stating much of significance.”

Wow! Dude, you just broke my unintended-irony meter. Try looking in a mirror, and say that again.

It sounds like some people have a very, very poor understanding of how autopilot actually works for airplanes if they think autopilot eliminates the need for the operator to be in charge and stay attentive. Here are some excerpts from the FAA training guide for using autopilot in an airplane:

“While the autopilot relieves you from manually manipulating the flight controls, you must maintain vigilance over the system to ensure that it performs the intended functions.”

“Be ready to fly the aircraft manually to ensure proper course/clearance tracking.”

“As with all automated systems, you must remain aware of the overall situation. Never assume that flight director cues are following a route or course that is free from error.”

“The first priority for a pilot always is to fly the aircraft.”

“Verification of the autopilot mode and engagement status of the autopilot is a necessary technique for maintaining awareness of who is flying the aircraft.”

“You must always ensure the correct altitude or vertical speed is maintained.”

“While it is easy to be complacent and let down your guard, you must continuously monitor and stay aware of automated systems status and function.”

“You must remember that the altitude alerting system is…”

1) Gross Negligence is NOT just writing something that somebody finds confusing. Writing something that somebody finds confusing comes nowhere near proving Gross Negligence.

2) It isn’t going to be sufficient to just argue that the word “Autopilot” makes people use the system incorrectly. They would need to put witnesses on the stand and have them testify what they believed about the term “Autopilot”. That opens the door wide open to question the witness about whether their understanding of what “Autopilot” means is accurate. When they pull out FAA guidelines, and simply show that it was the witness’s ignorance of Autopilot that is the source of the confusion, the case goes down the tubes.

3) A car company can’t be held liable for other people having a failed understanding of how Autopilot is operated in an airplane, and then making false assumptions based upon their own incorrect beliefs.

If Facebook, Twitter and Instagram are not held liable for the user-created content they deliver, why should Tesla be held accountable for the actions of their drivers? How many non-adaptive cruise control cars have run into the back of the car ahead of them? Should Ford be held accountable for the numerous accidents and damage to property from the copious Mustang wrecks leaving Cars and Coffee events? Buck up, people; the time of personal responsibility is, hopefully, coming.

By the same logic in some of these posts it could be argued that every gun sold in the USA should come with a warning “don’t point this at anyone and pull the trigger”. The relatives of the thousands of people shot every year could then sue the gun manufacturers.

They can’t sue gun manufacturers because the U.S. Congress has passed laws protecting the gun manufacturers from liability lawsuits in nearly all cases. Since the law was passed in 2005, according to Wikipedia, there has been only one successful liability lawsuit against a gun manufacturer.

Corruption in the U.S. Congress finds almost no bounds at all.

https://en.wikipedia.org/wiki/Protection_of_Lawful_Commerce_in_Arms_Act

Sad, but true.

The name “autopilot” isn’t inappropriate. Autopilot on a plane still requires the pilots to monitor the aircraft at all times. That, combined with the fact that the feature is disabled by default and gives warnings prior to enabling it means lawyers will have a difficult time arguing this one in court, especially if they think they can use attractive nuisance doctrine. I’m curious to know how many children are driving Teslas.

The only time you can apply attractive nuisance to an adult is if the adult is injured while trying to rescue a child from an attractive nuisance (like an unsecured pool). If we start applying the doctrine to adults anytime we do something fun that can be dangerous, think of the results.

Where’s the Lemon Law King!?

A lawyer is looking into suing the Harry Potter franchise because the DVDs do not carry a warning about watching and driving.

So, again, I will ask: WHAT’S THE POINT OF AUTOSTEER IF YOU CAN’T REST YOUR ARMS BY LETTING GO OF THE STEERING WHEEL?!?

I think this whole debacle has just saved me a few grand when it’s time to configure my Model 3. I won’t opt for AP, if it’s still an option at that time.

I’ll still have the adaptive cruise control and all of the proximity warnings and rear-view camera as standard features based on the standard AP hardware.

some day the feature might get to the point where you will actually be able to not keep your hands on the wheel. but for the present time, this is a beta-test feature. maybe you should pass on buying a beta-test version of the feature and wait until it is in a production-ready state.

“So, again, I will ask: WHAT’S THE POINT OF AUTOSTEER IF YOU CAN’T REST YOUR ARMS BY LETTING GO OF THE STEERING WHEEL?!?”

YOU’VE BEEN ANSWERED , …. AGAIN!
READ THE REMARKS FROM PEOPLE WHO OWN THE CAR!!!

Well where there’s smoke there is usually fire… For me, as I’ve said before, I wouldn’t use this system at any speed above 2 mph.

Too many serious people are getting involved in these cases, namely Lawyers, CR, and congressmen.
(CR I consider the least serious – ever since they rated the “S” ‘over 100%’ – a silly construction – and then ‘not recommended’ it.

…Naysayers will say they were blind-sided by the reliability problem. I counter that there were other factors at work, namely personal gain for certain CR employees. Of course, they can say they were just ‘dumb’ about things… Nice plausible deniability.)

The way these Tesla fanatics defend their beloved messiah Elon Musk and his company are truly disturbing. I thought Apple fan persons were emotionally invested, but these Tesla people are beyond ridiculous. What are these silicon valley companies doing that turns people into mindless cult members? They’re so obsessed with this guy and his company that it’s pathetic if not downright weird.