Tesla Autopilot Versus Bicyclist – Video


JUN 27 2017 BY STEVEN LOVEDAY

Will bicyclists DIE because Tesla Autopilot won’t see them?

Our good friends KmanAuto and past InsideEVs contributor Mike Anthony are at it again, but this time it’s with a bike. Not long ago, Stanford University researcher Heather Knight published a controversial Tesla Autopilot review entitled “Tesla Autopilot Review: Bikers will DIE”. Of course, these two had to put Mike’s life at stake to see if Knight’s title is accurate.

Editor’s Note: Please do not attempt anything you see in these videos yourself; it is not something InsideEVs would suggest doing.


Tesla Autopilot Bike Detection

Mike rides a bike right in front of the Tesla Model S multiple times for a series of tests. The tests are run at speeds from 18 mph up to 30 mph. The Tesla Model S has the first-generation Autopilot system and firmware 8.1 (17.11.10).

The tests cover a human on a bike, a bike alone, a moving bike, a stationary bike, a bike perpendicular to the car, a bike facing the car, and a bike traveling in the same lane as the car.
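To make the combinations easier to picture, here is a minimal sketch of that test matrix; the scenario labels and speed steps are inferred from the article, not taken from the testers’ actual run list.

```python
# Hypothetical sketch of the test matrix described above. Scenario
# labels and speed steps are inferred from the article, not from
# KmanAuto's actual protocol.
SCENARIOS = [
    "human on bike",
    "bike alone, stationary",
    "bike alone, moving",
    "bike perpendicular to car",
    "bike facing car",
    "bike traveling in same lane as car",
]
SPEEDS_MPH = range(18, 31, 4)  # 18 mph up to 30 mph, in rough steps

for scenario in SCENARIOS:
    for speed in SPEEDS_MPH:
        print(f"Run: {scenario} with Autopilot set to {speed} mph")
```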

Further to the video, Teslarati spoke with Chris/KmanAuto about his feelings regarding the improvements to Autopilot 1.0. He shared (via Teslarati):

“The switch to 8.x firmware, where Tesla used new programming to process the radar information, dubbed ‘enhanced radar’, brought Autopilot up to where we’d expect and able to detect humans and most large animals (dogs, deer etc).”

Check out the video and Kman’s notes below to learn about their findings, and to see if Mike survived.

Video Description via KmanAuto on YouTube:

Human on a Bicycle TESLA Collision Tests – Will Mike go SPLAT?
Will Tesla Autopilot Kill Bikers? LET’S FIND OUT! Human Collision Test!

Prompted by an article written by Heather Knight, a roboticist and researcher at Stanford University, titled “Tesla Autopilot Review: Bikers will DIE”, myself and my trusty sidekick Mike took it upon ourselves yet AGAIN to test out Tesla Autopilot against a real, live human on a bicycle. Here are our results…

SUCCESS: It detected and stopped for a bike in the middle of the lane. Though it saw the bike as a “Car”, that is most likely because the bike was lengthwise in the lane, so its width reflected back to the radar more like a “Car” than a bike.

SUCCESS: This time, even though I was quite far back when I activated Autopilot, it detected the bike so quickly that it wouldn’t even let the car reach the set speed of 25 mph.

SUCCESS: The Tesla’s radar saw Mike and the bike, and stopped for him. Because the bike was lengthwise in the lane, the radar and camera together detected it at a distance; however, the system became “unsure” of itself once stopped. Teslas are programmed to err on the side of caution, so the driver was required to check for vehicles and people before resuming.

SUCCESS & Partial Fail: An interesting result, though somewhat expected based on our tests from last year. The Tesla Model S slowed down and followed Mike. Once Mike moved over to the shoulder of the road, the car attempted to resume its course and speed. Unfortunately, at this point the radar could no longer see the bike, and the side sonar did not appear to detect it quickly enough before the driver took over; this could have resulted in a minor hit.

Note, though, that we are testing with the old Autopilot 1.0 hardware. Autopilot 1.0 sonar has a maximum range of 4 ft (1.22 meters), whereas Autopilot 2.0 hardware (what is currently used in Tesla production for the Model S, X, and 3) has a range of 8 ft (2.44 meters), which may have been enough to detect the bike from the side. Autopilot 2.0 hardware also incorporates eight cameras, giving the computer a 360° view, whereas Autopilot 1.0 has only a single forward-facing camera.
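To put those sonar ranges in perspective, here is a rough back-of-the-envelope sketch of how little reaction time a 4 ft versus 8 ft sensing radius buys at various closing speeds. The ranges are the figures quoted above; the closing speeds are illustrative assumptions.

```python
# Back-of-the-envelope: seconds between first sonar detection and
# contact, for AP1 (4 ft) vs AP2 (8 ft) sonar range, at a constant
# closing speed. Closing speeds are illustrative, not from the tests.

def time_to_contact(range_ft: float, closing_speed_mph: float) -> float:
    """Seconds from first detection to contact at constant closing speed."""
    closing_speed_fps = closing_speed_mph * 5280 / 3600  # mph -> ft/s
    return range_ft / closing_speed_fps

for hardware, range_ft in [("Autopilot 1.0 sonar", 4.0),
                           ("Autopilot 2.0 sonar", 8.0)]:
    for mph in (5, 10, 25):
        t = time_to_contact(range_ft, mph)
        print(f"{hardware}: {mph} mph closing -> {t:.2f} s to react")
```

Even at a walking-pace 5 mph closing speed, 4 ft of range gives barely half a second of warning, which is consistent with the sonar missing the bike here; doubling the range doubles that margin but still leaves very little time.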

PASS/FAIL: Pass = the Model S detected the bike and slowed the vehicle down. Fail = the vehicle did not bring itself to a complete stop. Instead of stopping completely, Autopilot panicked and sounded the collision alert. In this case, Mike’s swerving “drunk driving” brought him in and out of the radar’s detection zone. When the car got too close to maintain a lock on him, it sounded the collision alert for the driver to take over.

Complicated: I, as the driver, became nervous and touched the steering wheel, which deactivated Autopilot. After you touch the steering wheel, Autopilot deactivates and falls back to TACC (Traffic-Aware Cruise Control). TACC is similar to Autopilot, only it does NOT steer the car; it can still bring the vehicle to a complete stop and activate collision avoidance. In this case, TACC slowed the vehicle down until the bike was clear of the vehicle’s path, then the car resumed its speed.
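The Autopilot-to-TACC fallback described here behaves like a simple state machine. Below is a minimal sketch of that behavior; the state names and transition rules are a simplification based on these notes, not Tesla’s actual software.

```python
# Minimal state-machine sketch of the fallback described above:
# turning the wheel cancels steering (Autopilot) but leaves speed
# control (TACC) active, while braking cancels both. This is a
# simplification based on the video notes, not Tesla's implementation.

AUTOPILOT = "autopilot"  # steering + traffic-aware speed control
TACC = "tacc"            # speed control only; still brakes for obstacles
MANUAL = "manual"        # driver has full control

TRANSITIONS = {
    (AUTOPILOT, "driver_turns_wheel"): TACC,  # what happened in this test
    (AUTOPILOT, "driver_brakes"): MANUAL,
    (TACC, "driver_brakes"): MANUAL,
}

def next_state(state: str, event: str) -> str:
    return TRANSITIONS.get((state, event), state)

state = next_state(AUTOPILOT, "driver_turns_wheel")
assert state == TACC  # steering is off, but TACC can still stop the car
```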

Mixed Emotions: The car did see the bike at the last second and did attempt to slow the vehicle, but it did not do so quickly enough. The car COULD detect the bike from far enough away; however, it took too long to determine whether the bike was going to cross its path or wait for the car to pass. By the time it made a determination, accident avoidance had become accident mitigation; in other words, the car did its best to reduce the damage from an inevitable collision.
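The shift from avoidance to mitigation comes down to simple kinematics: once the distance remaining when the car commits to braking is shorter than its stopping distance, all it can do is shed speed before impact. A hedged sketch, with the deceleration and distances as assumed figures:

```python
# Kinematics behind "avoidance became mitigation": if the distance
# left when braking starts is less than the stopping distance, the
# car can only reduce impact speed. All numbers are illustrative.
import math

MPH_TO_MPS = 0.44704

def stopping_distance(speed_mph: float, decel_mps2: float = 7.0) -> float:
    """Distance (m) needed to stop from speed_mph at constant deceleration."""
    v = speed_mph * MPH_TO_MPS
    return v * v / (2 * decel_mps2)

def impact_speed(speed_mph: float, dist_m: float, decel_mps2: float = 7.0) -> float:
    """Speed (mph) left after braking over dist_m; 0 if the car stops in time."""
    v = speed_mph * MPH_TO_MPS
    v_sq = v * v - 2 * decel_mps2 * dist_m
    return math.sqrt(v_sq) / MPH_TO_MPS if v_sq > 0 else 0.0

print(stopping_distance(25))   # ~8.9 m needed to stop from 25 mph
print(impact_speed(25, 5.0))   # deciding only 5 m out: hits at ~16.6 mph
```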

Conclusion: While Autopilot is not fully autonomous, it has remarkable collision-avoidance features, especially for being only Autopilot version 1. With version 2 hardware out and software updates bringing all of its hardware online piece by piece, the only thing that can happen is more improvement. While shaky, I’d have to say the only way Autopilot will kill bikers is if the driver of the car is drunk, completely distracted (misusing Autopilot as if it were fully autonomous), or asleep. Used as designed, it will SAVE lives.

 

Categories: Tesla, Videos



40 Comments on "Tesla Autopilot Versus Bicyclist – Video"

Kosh

Uh… Crash Test dummy would have been better?

carcus

v2v would be better.

Kdawg

Nice video. To me, all the scenarios ended up being safer w/AP.

I’d like to see a test done w/Styrofoam, and then if it hits… no big deal.

sven ¯\_(ツ)_/¯

I believe that the radar on the Tesla is unable to detect Styrofoam as opposed to a meat-bag on two wheels.

I recall an InsideEVs article about Bjørn Nyland testing the automatic emergency braking on a Model S with very large sheets of Styrofoam to simulate solid concrete walls. The AEB system on the Model S couldn’t detect the Styrofoam sheets and smashed right through them. Bjørn quickly pulled the video when people commented that the radar was passing through the low-density Styrofoam rather than being reflected by it.

sven ¯\_(ツ)_/¯

At first my Google-fu failed me, but subsequently I was able to find the InsideEVs article that I referenced above.

http://insideevs.com/bjorn-nyland-tests-tesla-model-s-automatic-braking-feature-video/

Kdawg

Might have to paint it with something… like metallic paint?

Pushmi-Pullyu

As I understand it: Styrofoam, being mostly air, does not reflect enough of radar’s long wavelengths to be detected. At least, not in thin sheets like Bjørn used.

So yes, a styrofoam dummy would need to be coated with something that would reflect radar. Maybe duct tape? Of course, that’s not a very good test. Neither styrofoam nor duct tape is a good radar target simulation of human flesh and bone.

What we really need is for “MythBusters” to do one of their shows on the subject, testing Tesla Autopilot’s ability to “see” one of their dummies made of ballistic gel and… is the skeleton inside an actual human skeleton, or a simulation? Either way, that’s what would be most appropriate for a real test.
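The physics behind this Styrofoam thread is the radar range equation: echo power scales with the target’s radar cross-section (RCS) and falls off with the fourth power of range, so a mostly-air material like a bare Styrofoam sheet returns almost nothing. A rough illustration, where the RCS values are ballpark guesses rather than measured figures:

```python
# Radar range equation, simplified: received power ~ sigma / R^4 for a
# fixed transmitter, antenna, and wavelength. The RCS (sigma) values
# below are ballpark guesses for illustration, not measurements.

def relative_echo(sigma_m2: float, range_m: float) -> float:
    """Echo strength relative to a 1 m^2 target at 1 m."""
    return sigma_m2 / range_m ** 4

TARGETS = {
    "car, rear aspect": 100.0,      # large metal surfaces reflect strongly
    "cyclist plus bike": 2.0,       # metal frame plus mostly-water rider
    "bare Styrofoam sheet": 0.001,  # mostly air; radar passes through
}

for name, sigma in TARGETS.items():
    print(f"{name}: relative echo at 30 m = {relative_echo(sigma, 30.0):.1e}")
```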

Kdawg

They like to use pigs as human analogs a lot. Though hitting a pig at 25mph could do some damage to a $100K car.

Scramjett

It’d make for some good pork though.

MarkT

I never did like Mike

Ev4life

Looks like Stanford needs to get better researchers.
Real results seem to be dramatically different from the theoretical.

Scramjett

There’s a good deal of difference between an academic research institute and two dudes with a bike, a car, and a camera.

But I would be curious what Stanford’s methodology was.

Nix

The methodology was that they rented a Model S and went to the beach, then posted a non-scientific blog of her “first impressions” after a single drive:

https://medium.com/startup-grind/tesla-autopilot-review-bikers-will-die-212a8be4d8e7

They went in cold with no knowledge or understanding of autopilot, or of driving a Tesla at all. And between taking selfies of themselves at the beach, they managed to lock themselves out, didn’t know how the car shut off, and failed to understand how “resume” works on pretty much every cruise-control car in the world.

There was no “methodology”, they estimated their statistics without doing any actual data collection, and they wrote their results as if they were reviewing a fully autonomous system, saying it wasn’t ready for full autonomy. Something that Tesla is VERY CLEAR that this system is not.

Nix

It should also be noted that in a number of those test cases, it was actually the bike rider that was breaking the law, such as swerving in and out of a lane of traffic, failing to stop and yield, etc.

While we certainly would love to have our cars save us from everything, at a certain point a bike rider who is breaking the law is responsible for killing themselves.

Eric Cote

If we could assume that other drivers will not break the law, we wouldn’t need sophisticated autopilot hardware, nor would we need to offer defensive driving courses.

Unfortunately, other vehicles “not breaking the law” is not an assumption that Tesla can make if they want their Autopilot to be successful and safe.

Pushmi-Pullyu
Right. The objective should be to avoid accidents, and especially to avoid colliding with any object large enough to contain even the smallest human being: a baby or infant.

It’s absurd to even talk about programming self-driving cars to obey traffic laws. Sure, engineers and programmers developing self-driving cars have succeeded in getting the cars to recognize speed limit signs, stop signs, and stop lights, and to follow programmed instructions when those are detected. But it’s impossible to program a robot or a car with a more general instruction such as “obey traffic laws”, because that would require a level of comprehension of the human world, and a level of self-awareness, light-years beyond the current state of the art of robotics and computer programming.

To repeat: the smartest robots today have about the intelligence of a moderately smart bug. Just think about trying to teach a bug the concept of “traffic laws”, and you’ll realize the impossibility of that.

I think part of the problem is the misuse of the term “A.I.” (Artificial Intelligence) for expert-systems software. What developers of autonomous driving systems are developing is expert-systems software to drive a car. Not actual intelligence in the car; just expert-systems software…
Nix
Clarkson — Perhaps I didn’t make myself clear with my original short post. Yes, obviously the goal of Autopilot and all advanced safety systems is to avoid all collisions regardless of fault.

What I’m commenting on is the original BLOG this is referring to. In that blog, Heather Knight said Autopilot would “put biker lives at risk”: https://medium.com/startup-grind/tesla-autopilot-review-bikers-will-die-212a8be4d8e7

My point is that while it would be nice if we could avoid all accidents, our road laws and liability laws still exist for a reason. What these test results with actual bikes do is point to the fact that it is NOT Autopilot’s actions that put bikers’ lives at risk in this testing. It was the biker violating the law that put the biker’s life at risk.

Hopefully that clarifies my objection to the fake meme that Autopilot “puts biker lives at risk”, when these tests show it is the bikers’ own violations of the law that put their lives at risk. If Autopilot can save bikers from their own violations of the law, that’s great. But in the inevitable cases where it doesn’t work out that way, and a biker dies through their…
Scramjett

There is one key flaw in your logic. When drivers break the law, they are putting the lives of others, including cyclists and pedestrians, at risk. This is a key point so many people miss. When cyclists break the law, it’s their own lives at stake. Again, to emphasize the point: when drivers break the law, it’s not just their own lives, but the lives of ALL road users! PARTICULARLY cyclists and pedestrians who do not have several thousand pounds of metal, plastic and glass enveloping them.

Incidentally, I’ve observed that most (not all to be sure, but most) cyclists break the law because of the woefully inadequate cycling infrastructure in this country. Cars have their own ROW, pedestrians have their own ROW, cyclists are forced into one or the other. And no, bike lanes don’t count. Drivers are not allowed onto sidewalks, but they ARE allowed to drive into, and park in bike lanes. And usually with impunity.

Nix

Please go back and re-watch the video. At no time was any of the testing simulating a car driver breaking the law. However, they most certainly did simulate the bike rider breaking the law, and that was where they had mixed results in their testing. That is what I was commenting on.

Of course car drivers shouldn’t break the law. I don’t know what I said that you now object to, because at no time did I suggest it was OK for any car to break the law.

It seems like you are just using the words “bike”, “car” and “law” to make a huge leap into posting about your dissatisfaction about bike lanes. What does that have to do with anything I posted?

Are you trying to justify breaking the law on a bike, and trying to justify shifting the blame away from the person breaking the law, just because the biking infrastructure doesn’t suit your desires?

John in AA

Well it didn’t take too long for the cycle-hating and victim-blaming to begin.

Scramjett

+1

Nix

I love my bikes, but I’m not one of those idiots who blames cars if I break the law while riding my bike.

If you are among those who believe that the laws don’t apply to bikes, hang up your bikes and get out of the sport. Do you cut new virgin downhills at your local mountain bike trail? Do you ignore red lights on the streets? Do you listen to headphones in both ears on multi-use bike/walk paths in parks, while averaging 20+ mph? If so, you might be one of the people that everybody over on the IMBA forum hates having to apologize for when they work with local authorities to improve bike access.

As bikers, WE have to be part of the solution, not part of the problem.

John in AA

You forgot to ask if I’ve stopped beating my wife.

Nix

Silly Johnny — That’s a whole lot of faux outrage for somebody who falsely accused me of “cycle-hating and victim-blaming” when I did no such thing…..

Look in the mirror, buddy. And let me know when you want to apologize for your “cycle-hating and victim-blaming” bullpuckey and I’ll let you know exactly when it was that you quit beating your wife.

peetah

I took a picture as proof that her article was bogus this past Saturday: while driving on Autopilot (AP1), my car slammed on its brakes, showed a bike on the screen, and followed it at 17-20 mph until I took over…

Kosh

Well, to be fair: in California, that is now the LAW, and it’s what you’re supposed to do if you can’t give them 3 feet of space…

Scramjett

Interesting, didn’t know that was the law. Of course, that might be because almost no one puts it into practice, and it’s not as if cops will do anything about it. Unless it’s a bicycle cop, they are almost always on the side of the driver.

Eric Cote

Not a bad video, though some nit-picks:
– The spelling/grammar errors are annoying (then vs. than, it’s vs. its) but I digress.
– They seem unwilling to declare a test a failure, like the last test, even though the car was not successful in passing the test.

Still, a good review of Autopilot HW 1.0 capabilities/concerns.

Pushmi-Pullyu
“The Tesla Model S slowed down, and followed Mike. Once Mike moved over to the shoulder of the road, the car attempted to resume it’s [sic] course and speed. Unfortunately, at this point, the radar could no longer see the Bike and the Side Sonar did not appear to detect it quick enough before Driver took over, and could have resulted in a Minor hit.”

Thus illustrating the need for active scanning (radar or lidar) in 360°, not just forward-facing.

“Though, also note, We are testing using the old Autopilot 1.0 Hardware.”

Obviously (well, obvious to just about everyone except serial Tesla bashers) the functionality of Tesla’s Autopilot, AutoSteer, and other semi-autonomous driving features is improving over time. So the utility of testing with old equipment seems questionable here. I can see the utility of testing it to compare it with the current (Autopilot hardware 2.0) system, but to test a hardware 1.0 system alone seems to have rather limited usefulness. We know that Autopilot has moved on from what was tested here.

“Autopilot 1.0 Sonar has a maximum range of 4ft (1.22 meters), where as Autopilot 2.0 hardware (What is being used currently in Tesla Production for Model S, X,…”
John in AA

Nit, the short-range thingies are sonar, not radar. Sound waves, not microwaves.

Ben

If a cyclist is not riding in the middle of the road but as far to the side as possible (as you tend to do as a cyclist), it won’t overtake him with appropriate distance and instead might hit the cyclist in some situations. This is obvious from the video.
In my opinion this is the worst-case scenario: an autopiloted car overtaking a cyclist with no distance at all at full speed is in most cases deadly.

Nix

That absolutely is NOT what the test results showed. In that situation, the car slowed and followed the bicycle, and got an unqualified “pass”.

Asak

The tests were all done at very slow speeds. I wouldn’t get overconfident based on these results. At higher speeds you have a combination of less detection time and also longer slowing/stopping time.
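This is easy to quantify: total stopping distance is a reaction term that grows linearly with speed plus a braking term that grows with the square of speed. A quick sketch, with the reaction time and deceleration as assumed figures:

```python
# Stopping distance = reaction distance (v * t_react) + braking
# distance (v^2 / 2a). The reaction time and deceleration below are
# assumed figures for illustration, not measurements of Autopilot.

MPH_TO_MPS = 0.44704

def total_stopping_distance(speed_mph: float,
                            t_react_s: float = 0.5,
                            decel_mps2: float = 7.0) -> float:
    v = speed_mph * MPH_TO_MPS
    return v * t_react_s + v * v / (2 * decel_mps2)

for mph in (18, 30, 45, 60):
    print(f"{mph} mph -> {total_stopping_distance(mph):.1f} m to stop")
# Doubling speed from 30 to 60 mph roughly triples the distance needed.
```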

Scramjett

Bingo. 9 out of 10 car-vs-bike collisions where the car’s speed is over 40 mph result in the death of the cyclist, regardless of whose fault it is and whether or not the cyclist was wearing a helmet. They did not test speeds over 30 mph, likely to ensure Mike’s safety. I’m curious whether the Stanford study looked at high-speed scenarios and that’s why it deemed Autopilot deadly to cyclists.

John in AA

The Heather Knight article is almost physically painful to read; it gets so much wrong. I found myself wondering if she has ever previously rented a car. I am serious.

Hans

“could have resulted in a Minor hit”

It should be noted that a minor hit is insanely dangerous for a cyclist; they can lose balance, fall into the street, and get run over.

Simon

As a bicyclist, I think this test is a bit too easy: difficult conditions would be during rain or sleet, in darkness, or on a busy city street.

I also think most comments here are getting overly frustrated about the clickbait rhetoric of the Stanford blog post. This is a real problem, as you can read about here:

http://spectrum.ieee.org/transportation/self-driving/selfdriving-cars-have-a-bicycle-problem

(and which the video also shows in my opinion). Basically, it *is* harder to detect a bicycle than a big blocky car – and this isn’t in any way limited to the Tesla AP.

Now, as a bicycle commuter, am I worried about autonomous cars? No! Human drivers are notoriously unfocused, and if anything, I think the situation should improve with a computer behind the wheel. However, I think the study shows that more effort is needed to make sure autonomous cars actually detect cyclists and pedestrians as well as they detect cars.

Priusmaniac

Or with a cyclist passing at full speed over a pedestrian crossing, like I witness at least once a month.

Priusmaniac

How does this go with a car or a truck, or with unusual vehicles like a spraying tractor, a harvester, or a bulldozer? A collision with those vehicles would, this time, pose a mortal danger to the car’s occupants. What about a fallen tree or a roadwork gate?

Nix

Here is a very detailed analysis of what Heather Knight got wrong with her “test drive to the beach and back” analysis of the Model S:

This sums up where Knight went off the rails:

“Given that Autopilot IS NOT AN AUTONOMOUS DRIVING SYSTEM, calling it autonomous in the first paragraph and reviewing it as such is deeply irresponsible, if not foolish. For an academic in the field, it’s outrageous.”

http://www.thedrive.com/opinion/11068/when-stanford-roboticists-review-tesla-autopilot-they-dont-send-their-best

Sadly, Knight doesn’t have the academic honesty to simply withdraw her comments, and instead petulantly continues to defend her badly flawed analysis, while only correcting the most blatant and embarrassing of her multitude of errors.