UPDATE: Watch Tesla Autopilot Tested With Tape Covering Cameras

MAR 21 2018 BY STEVEN LOVEDAY

Which Tesla Autopilot cameras are activated, and can the car drive itself with multiple cameras rendered useless?

“Hold on while I blindfold my Tesla!”

***UPDATE – March 21: This post obviously piqued another Tesla owner’s interest and skepticism, so he ran his own experiment, which attempts to “debunk” this story. Check out the video at the bottom of this post; his results differ. Sadly, it’s not very easy to watch due to the camera work, but the point comes across just fine.

YouTuber and Tesla Model S owner Brian Jenkins (AKA i1Tesla) decided to have a little fun with Autopilot cameras and tape. Not knowing for sure which cameras are even activated at this point, or how the car functions (or fails to function) in Autopilot mode with various cameras incapacitated, he felt the need to perform an experiment.

RELATED: UPDATED TESLA AUTOPILOT NOW BEING BETA TESTED BY OWNERS

As you can see from the video, Jenkins used a rough system of covering and uncovering various cameras, then initiating Autopilot. Interestingly, the Model S will let Autopilot control the car with most of its cameras covered.

Unfortunately, he didn’t check to see what happens when every single camera is covered. We really wish he had, because there’s a chance the car would still have initiated the self-driving feature, although it would have driven very badly.

What does this all tell us?

Well, you can look at it in a few different ways. It’s nice to know that if rain or debris blocked a camera, your car would rely on the others and keep doing its thing. However, perhaps you wouldn’t want it to keep on keepin’ on, since there’s a chance it wouldn’t work as safely…

Moreover, if this is simply a glitch … meaning the car should not allow Autopilot to engage with all of these cameras covered, but the system doesn’t understand that this is a huge problem … then there’s big trouble here. As you can see near the end – when the Model S is driving itself and only the side-pillar cameras are “seeing” – it’s all over the road and might have crashed had Jenkins not taken over.
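If it is a glitch, the fix is conceptually simple. Here is a minimal sketch, in Python, of the stricter engagement check the paragraph above argues for. To be clear, everything in it (the camera names, the CameraStatus fields, the may_engage_autosteer gate) is a hypothetical illustration for this article, not Tesla’s actual Autopilot code:

```python
# Hypothetical engagement gate: refuse to hand over steering unless every
# camera the feature depends on is both electrically healthy and actually
# producing a usable image. All names and checks here are invented for
# illustration; this is not Tesla's code.

from dataclasses import dataclass

@dataclass
class CameraStatus:
    name: str
    operational: bool    # hardware self-test passed (no short, no fault)
    unobstructed: bool   # image passed a blockage/occlusion check

def may_engage_autosteer(cameras, required=frozenset({"main_forward"})):
    """Allow engagement only if every required camera is healthy AND
    unobstructed. The videos suggest the real system checks hardware
    health but tolerates taped-over lenses; this sketch adds the
    occlusion check the article says appears to be missing."""
    visible = {c.name for c in cameras if c.operational and c.unobstructed}
    return required <= visible

# The taped-up scenario from the video: pillar cameras clear, forward
# camera blocked. A stricter gate would refuse to engage.
cameras = [
    CameraStatus("main_forward", operational=True, unobstructed=False),
    CameraStatus("left_pillar",  operational=True, unobstructed=True),
    CameraStatus("right_pillar", operational=True, unobstructed=True),
]
print(may_engage_autosteer(cameras))  # False: forward view is blocked
```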

With all of these cameras activated and seemingly working as they’re supposed to, reports say the system is drastically improved. The newly activated cameras on the pillars have reportedly cut down on the “ping-ponging” issue.

Hopefully, the system is much safer now and continues to improve. However, we can’t say we’re not a bit concerned that Autopilot will still attempt to drive the car when it can’t see! We surely wouldn’t choose a blind chauffeur.

Video Description via i1Tesla on YouTube:

I taped up the cameras to see what would happen. Found out something very cool. All the cameras seem to be on and operational.

2017 Model S 75D with AP2

2018.10.4

Check out the new “debunking” video below (Hat tip to scottf200!):

Video Description via daudpechler on YouTube:

Quick edit. After watching a video in which someone covered up the front windscreen camera unit and side repeaters, leaving only the B-pillar cams totally uncovered on his AP2 Tesla, I felt like putting the line-detection capabilities of the AP2.0 hardware and software to the test myself. Debunked the assumption that Autosteer can be activated with the front unit covered, relying totally on the B-pillar cameras for line detection. However, Autosteer can still be activated when the three front cameras are partially obstructed, although the system’s output becomes very poor, as demonstrated in this video. Region: The Netherlands, EU

Source: Teslarati


17 Comments on "UPDATE: Watch Tesla Autopilot Tested With Tape Covering Cameras"


I had no doubt it has built in redundancy.

Impressive regardless.

The above video is not accurate. Other videos out there prove that if you cover the entire set of front cameras, AP cannot be engaged. Still, the other video showed that if you cover most of them, as in the above video, the system finds a way.

Other video with better testing — https://www.youtube.com/watch?v=yKgBakJc8ZY

Thanks for the share! Updated the post with this. Hat tip!

Better testing:
Tesla Autopilot 2, How many cameras does it use? Covering them with tape!
https://youtu.be/Pm1S49THsH4

Redundant means the failover system is as good as the first system. These aren’t redundant.

It’s scary that it allows autopilot to engage with only the pillar cameras activated. All it can see at that point is the sides of the road plus radar and ultrasonic data. That’s why the split in the road caused problems.

File this article under “Hold my Beer.” Tesla, even with all eyes open, is underpowered sensor-wise. As many in the industry are likely to find out painfully after what happened at Uber yesterday, it is not just redundancy or even safety percentages that they need.

The airline industry learned some painful lessons in the 1970s, when it realized that the impressions of a nervous flier counted for more than vague pronouncements about accident rates being lower than for cars. If a video surfaced of a pilot messing around with sensors, even on a private aircraft, the FAA would be after them in no time.

I was part of the first wave of autonomous tech in the 1990s that died on the vine when questions of liability could not be answered satisfactorily. It looks like the new crop of cowboys is risking yet another public backlash against AV tech. Please stop posting such videos and encouraging the nutjobs.

(⌐■_■) Trollnonymous

Didn’t that car use LIDAR?

It makes this worse. Even with the range capabilities of LIDAR, the software for such technology is only barely finding its footing.

The public is not convinced about the safety of AVs in general, yet. If these cowboys are out cruising the roads with essentially what is an open loop control circuit, the pitchforks will be out for everyone.

How exactly are we supposed to avoid someone jumping in front of the moving car? Robot or human, that situation plays out many times a day and the result is always the same no matter the driver.

Now, after seeing the crash video, which proves the lady was at fault, I have serious questions about Uber’s AP. I was under the impression that it uses lidar and is superior to a camera-based AP such as Tesla’s. It turns out lidar is as blind in the dark as we are… very confused by this accident.

I have doubts about this article. I’ve owned two Model S Teslas. At one point, one of the cameras had an intermittent short, and the symptom was the car refusing to engage AP.

A short is a different situation, since in your case the hardware wasn’t functioning correctly. Here the hardware worked fine; the image was just blocked.

I still remain very skeptical. My Tesla will refuse to engage AP if I am on an unlined road or a road where the lines are obscured and unreadable. So it’s very hard for me to believe that AP would engage with completely blacked-out cameras. Until this story is corroborated, I am strongly in the skeptics’ column.

How safe is this for people and vehicles around? It is illegal for a blind person to drive a car but perfectly legal for the blind autopilot. I hope the owners understand that according to the present legislation they will be held responsible for any sort of incidents they might cause.

It looks like Tesla is playing with fire here. By doing such irresponsible things, they risk attracting a lot of attention from the regulatory bodies (with a little help from competitors).

It is now possible to use Oranges, Tape, and Beavis buddies in AP Testing! Or, Tesla puts New Sensors (Extra Radar & Cameras) in More places, and a few ‘Dummies’, wired in, just for kicks, for Buttheads like this!