Tesla Autopilot Saves Man During Medical Emergency

AUG 11 2016 BY STEVEN LOVEDAY

Tesla Model X

Springfield, Missouri attorney and Tesla Model X owner Joshua Neally told Slate that his car saved his life. Neally, a Branson native, found himself in the midst of a possibly fatal medical emergency and was able to have his Tesla's Autopilot transport him to the hospital.

Autopilot In Control

Neally explained that he had been having some strange chest pains for a number of days but shrugged it off as a pulled muscle. He was headed on his usual commute home from work with plans to enjoy his daughter’s fourth birthday celebration. However, he didn’t make it home as scheduled. Fortunately, he made it to safety and is alive and well to share the story.

As Josh left Springfield and proceeded onto the local freeway, his pain became unbearable. He explained:

“It was excruciating pain. I’ve never had such pain in my life.”

It turns out, Josh was experiencing the pain of a pulmonary embolism. A blood vessel in his lung was blocked, making it difficult for him to breathe and talk. At times his vision even blurred. Josh was able to call his wife and set out for the hospital.

Luckily, the hospital was close to a freeway exit. Josh said he engaged Autopilot and only had to touch the steering wheel periodically. The car changed lanes and passed other vehicles on its own. He only had to steer the car himself in the final moments as he exited the freeway. Josh promptly checked himself into the ER at the CoxHealth facility in Branson, Missouri.

He told interviewers that he realizes pulling over and calling emergency services may have been the better choice, but nevertheless, his Model X took care of him. He even admitted to occasionally checking email and texts in the past while Autopilot was at work. Neally said of Tesla's Autopilot mode:

“It’s more like the ultimate cruise control . . . I definitely believe it helped me.”

Thankfully, his condition was relieved with a dose of blood-thinning medication, and he was home within hours. His daughter was safe with friends until he arrived. He was even able to head out and celebrate her birthday.

Slate interviewed Neally and reported the story ahead of other press. Reporter Will Oremus added his thoughts:

“Neally’s experience is unusual. It doesn’t prove autopilot’s worth as a safety feature any more than Brown’s death disproves it. Yet Neally’s story is the latest of several that have emerged since the Florida crash to paint a fuller picture of autopilot’s merits, in addition to its by now highly publicized dangers. These stories provide at least a measure of anecdotal support for Tesla’s claims that its own data show autopilot — imperfect as it is — is already significantly safer than the average human driver.”

Source: Slate

Categories: Tesla

39 Comments on "Tesla Autopilot Saves Man During Medical Emergency"

I still think it would have been better to call an ambulance. What if he blacked out and was on the highway? This easily could have ended tragically.

Precisely.

“He told interviewers that he realizes that pulling over and requesting emergency services may have been the better choice”

Yeah but then he wouldn’t get to make a news story out of it.

I mean who wants to hear about that…

True. However, I'd bet if I were experiencing an acute PE, my instinct would be to get myself to a hospital NOW, especially if I knew it wasn't far away. Hard to be coherent and rational when you're in a life-threatening emergency.

Exactly… Your first instinct is not to pull over, it’s to survive. Instincts are intended to help you survive, not be safe or rational. It doesn’t make it right, but I’ll bet most people would do the same thing.

I know he’s trying to promote AP as being life-saving. But this is in incorrect application and really shouldn’t be advertised…the next person who tries this may not be so lucky, and may even injure others while making the attempt.

I thought it was a pretty straightforward report. Why must everyone assume that someone will be stupid about things? Sure, nutcases exist, but they will do nutty things regardless. I mean, there are even people nutty enough to vote for Trump; does that mean we should stop reporting on him?

Would you say the same thing if the report stated that a person who was drunk used Auto-Pilot to get home safely?

The drivers in both cases are impaired and shouldn’t be behind the wheel.

My opinion remains that this driver should have pulled over and dialed 911 for his safety, and the safety of others on the road.

JH said: “…there is even people nutty enough to vote for trump, does it mean that we should stop reporting on him?”

This is totally off-topic, but since you asked… If the news media hadn’t focused on The Donald’s rambling, incredibly ignorant, hate-filled, self-contradictory rants, like moths hypnotized by a candle flame; if they hadn’t given him ginormously more attention than he ever deserved; then he never would have become the Republican nominee for even assistant dog catcher, let alone the nominee for President of the United States!

The news media occasionally does light-hearted segments on very minor party candidates; the interviews with the Beer Party candidate a few years back were fun. That’s the sort of coverage The Donald deserves, and nothing more. The news media treated The Donald as a serious candidate, which he absolutely does not deserve, because he most certainly is not. Therefore, the news media deserves much or most of the blame for The Donald getting attention, gaining political influence, and getting so many votes.

“With great power comes great responsibility.” — Spider-Man comics

The news media has great power. Unfortunately, they’ve almost entirely abandoned their responsibility, and the rise of The Donald is…

Let me borrow that soap-box…

Not to dive deeply into the actual politics, but one candidate is actually being intentionally bombastic purely with the intent of pushing the press to talk about him, and nobody else. Here is an insight into his thought pattern after the most recent intentional falsehood flooded the media:

[Radio talkshow host objecting to falsehoods]: And that’s, I’d just use different language to communicate it, but let me close with this, because I know I’m keeping you long, and Hope’s going to kill me.

[candidate]: But they wouldn’t talk about your language, and they do talk about my language, right?

Interestingly, it is working, but not having the results he planned. And while he is getting the attention he wants, it is also flooding out the coverage of even his fellow party members down ballot who are also fighting for press time. They aren’t getting press time either. He’s hurting his party by intentionally playing into the press’s jack-rabbit response to the most absurd statements he can fabricate.

Autopilot is there to reduce the physical effort of driving. That’s what it did.

His other admitted uses are actually worse than his use this time.

I wonder what Consumer Reports' next published petition about Autopilot will say.

> ” He even admitted to occasionally checking email and texts in the past when Autopilot was at work.”

Oh boy! His days are numbered.

I was really disappointed that Elon Musk retweeted a story about this incident. Tesla is supposed to be clarifying proper Autopilot usage, not adding to the confusion.

He had to do it. It is a counterpoint to the confusing negative articles. Look, the truth of Autopilot is clearly between the two news extremes (killer, lifesaver), but he would be doing a severe disservice to Tesla shareholders if he didn’t blare the trumpet on this one, considering how much press the crash generated.

I disagree. You fight misinformation with facts, not with more misinformation. His duty to shareholders is to do everything in his power to ensure that drivers are educated in proper Autopilot usage, thus protecting Tesla from negative press and from liability.

No, his duty to Tesla shareholders is to promote sales of Tesla cars as much as possible.

You’re complaining that he did exactly that.

You could rightly argue that he has a larger responsibility to society at large, a duty to not play up irresponsible use of Autopilot/AutoSteer, but that’s outside his duty to Tesla stockholders.

We had a cyclist die during a charity ride in Tucson because a driver in a Leaf experienced low blood sugar due to his type 2 diabetes. There will be times when autopilot saves lives when nothing else could have helped. Cyclists are hopeful that autopilot will help save lives. Let's celebrate the successes of these systems, no matter who builds them, as the hope is that we will all be safer!

Except that once again it isn’t autopilot, but automatic emergency braking, that could potentially save the cyclists in such cases. The two are “distinct and separate” according to Musk, don’t you remember?

There seems to be confusion over this point, and Elon has an unfortunate habit of shooting his mouth off (or tweeting) before checking his facts. He may have been wrong on that.

At any rate, my understanding is that elsewhere, Tesla Motors has said the emergency braking system is now considered part of the Autopilot suite of driver assist features. Until Elon clarifies the matter, some of us (including me) are going to continue to presume that’s correct.

I don’t know if this was the best example.. but the passing out scenario is one where the autopilot could excell.

I would definitely want a car that would get me out of danger if I passed out while driving, instead of ending up hitting another car or a tree.
Even better if it could monitor me and my vitals even when Autopilot is not engaged, and step in (or rather ask if I'm alright before stepping in).
Combine that with an automatic call to an emergency call center, with GPS coordinates sent (like when you crash).

I’m just waiting for “Blue”:

In the next over-the-air update of Autopilot, I would not be surprised to see the car safely pull over and stop if the driver is unresponsive (no hands on wheel), followed by an automated call to 911.

This is an automated call from Tesla Model S, the driver is unresponsive and I have pulled over and am located at the following GPS coordinates, please send emergency personnel.

NPNS! SBF!
Volt#671

Any automated call to 911 had better be preceded by some very strong attention-getting inside the car, such as blasting noise thru the stereo at a high volume (it goes to 11, remember?).

I don’t think emergency services would appreciate responding to a 911 call only to discover the driver simply fell asleep behind the wheel.

I’d pull over and press the OnStar button. Oh wait, Tesla doesn’t offer OnStar.

Yes, since by letting Autopilot drive because you are not able to drive yourself, you are endangering others.
Maybe they should add a "driver incapacitated" button that would try to pull you safely over out of traffic, slow the vehicle, and alert Tesla so they can send aid.

I’m glad he made it to the hospital, but he may be confused on some of the details… understandable under the conditions.

For example, he says the car changed lanes on its own. If my understanding is correct, Autopilot/AutoSteer won't do that. Ever. The human driver has to use the turn signal to activate the automated lane change feature.

It seems to me as if it would be more accurate to say that Tesla Autopilot/AutoSteer helped him get to the hospital in time to save his life, rather than suggesting Autopilot/AutoSteer did that on its own.

Pretty cool driver assist feature.
I think this just adds evidence that smart features will push cars in the right direction on the accidents and fatalities charts, as other innovations have, with some notable exceptions.

In a few years, statistical probability indicates, there will be more such possibly 'life saving' events, and also more negative stories of some smart feature failing.

When a person actually does die with all driver assist features activated, just before they finish their long goodbye, they can program their car, if still able, to drive them to the city morgue.

If I had and used the autopilot today, there would have been one more motorcyclist dead in the UK. Yes, it would have been entirely his fault (overtaking on a bend with poor visibility and setting himself up for a high-speed head-on collision), but still, he wouldn't be here with us now. I'm genuinely curious how any autonomous driving system would deal with a situation like this. Both I and the driver of the other car had to go halfway off the road at full speed to make just enough space for him to squeeze through.

In the current state of development, Autopilot/AutoSteer won’t change lanes (or drive off the road) to prevent a collision, but this is something we should see quite soon. Simple collision avoidance programming routines aren’t all that difficult; computer games have been using those for decades.

In fact, the self-driving car would react much more quickly and decisively in an emergency than a human driver, which will reduce the risk of accident even more.

There are things which will be challenging for programmers working on autonomous driving software. Simple collision avoidance isn’t one of those things.

Actually, the driver killed in the semi trailer crash in Florida had posted a YouTube video about a month before, showing how the system braked and moved right to avoid being sideswiped on the left side by a merging bucket truck that was trying to get to the upcoming exit ramp. So maybe it can do other things too!

This video?

The Model S moves right within the lane to avoid the truck, presumably automatically under the control of Autopilot/AutoSteer. But it doesn’t stray out of the lane, even slightly.

You are right that autonomous drive systems could act more effectively than a human driver. The bottleneck is in perception and interpreting data. All current systems are still really dumb, based on a set of "mechanical" rules, not unlike language translation software. Just distinguishing a person from a bush can be very hard, not to mention guessing another person's intentions. No one knows how far we can push data and algorithms without inventing human-like AI.

Computer games bypass this problem because "the world" is fully defined in the program itself. That isn't necessarily a bad assumption to make for roads, either; there is nothing stopping us from building dedicated lanes and roads, or repurposing existing ones.

Tesla the Savior.

I still think calling it Autopilot is a bad thing and sets false expectations.

I’d rather call it a suite of drivers assistance features.

No, I’m not a lawyer.

How about drunk drivers who try to get home? With Autopilot, it will be harder for cops to spot a drunk driver.

The good: Autopilot/AutoSteer will help keep the car in the lane, and TACC plus the automatic emergency braking system will help to prevent accidents.

The bad: The fact that the car is equipped with Autopilot/AutoSteer may make a person feel safer about driving drunk, so may possibly encourage such behavior.

But we can’t assess the overall result. Perhaps the benefit of the good outweighs the increased danger from the bad; perhaps the reverse. Since Tesla does not make its collected Autopilot data public, we can only speculate. (And if that data was made public, that would merely create endless arguments over the interpretation of that data.)

ugh.

That’s the logic of parents who don’t get their daughters immunized against HPV, because they think their daughters will become sluts.

The fact is that drunk drivers don’t think about consequences, or they wouldn’t be drunk drivers in the first place. They certainly aren’t going to suddenly think through the entire chain of consequences ahead of time, just because they have autopilot. They are going to drive drunk without any analysis of consequences anyways.

It is irrational to assign rational thought patterns to irrational people.