Tesla’s Autopilot Camera Captures Pre-Crash Images


Tesla’s Autopilot camera can apparently be used like a dashcam to capture footage leading up to an accident, as reported by Electrek. However, an aftermarket dashcam made specifically for that purpose would still do a much better job.

Tesla Autopilot Camera Footage

Jason Hughes, well-known in Tesla circles for hacking the tech in salvaged Tesla vehicles (and more recently for his massive solar home/ESS installation), posted video on Twitter showing Autopilot camera footage. The footage is black and white and not especially clear, making it difficult to tell exactly what is happening.

Hughes told Electrek that he was working with a salvaged Model S and found event data inside the vehicle’s Media Center Unit. The data included multiple frames saved by the forward-facing camera prior to the crash. Unfortunately, Hughes (or anyone else looking through this data) cannot access all of the information, but some details are decipherable.

In his investigation, Hughes determined that the Traffic-Aware Cruise Control and Autosteer features were not on prior to the crash. He was also able to figure out that the car was traveling about 57 mph before the crash, but had no way of knowing whether Automatic Emergency Braking kicked in.

Hughes concluded that the frames were likely saved due to a trigger set off by airbag deployment.

The average Tesla driver would not be able to easily access Autopilot footage. In the event of some emergency or legal battle, an owner could surely contact the company and request stored information, but it’s not practical. Maybe down the road, automakers could make such data readily available to vehicle owners.

Electrek also clarified that the Tesla Autopilot camera is capable of high-definition, color recording, but the footage is converted to a low-resolution, black-and-white format because the car can’t process the data quickly enough.
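As a rough illustration of the kind of conversion Electrek describes (the actual in-car pipeline is not public, so this is only a sketch with made-up parameters), here is how an HD color frame could be reduced to a low-resolution grayscale image:

```python
def to_lowres_grayscale(frame, factor=4):
    """Convert a color frame to a downsampled grayscale image.

    `frame` is a list of rows, each row a list of (r, g, b) tuples
    with 0-255 values. This is illustrative only -- Tesla's real
    conversion pipeline is not public.
    """
    # Standard luminance weights (ITU-R BT.601) for RGB -> gray.
    gray = [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in frame]
    # Naive downsampling: keep every `factor`-th pixel in each axis.
    return [row[::factor] for row in gray[::factor]]

# A tiny 8x8 stand-in for an HD frame, all white pixels.
frame = [[(255, 255, 255)] * 8 for _ in range(8)]
small = to_lowres_grayscale(frame)
print(len(small), len(small[0]))  # 2 2
```

Dropping color and resolution this way cuts the data per frame dramatically, which is presumably the point when processing power on the car is the bottleneck.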

Source: Twitter via Electrek

25 Comments on "Tesla’s Autopilot Camera Captures Pre-Crash Images"

Also, colors are not that useful when detecting objects. They add cost but not efficiency; that’s why pretty much all object-detection implementations will pick grayscale, plus more advanced computations, over color and weaker algorithms.
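To illustrate the commenter’s point, intensity alone is enough for the gradient-style features most detectors build on. Here is a minimal sketch (pure Python, a crude stand-in for a real edge detector, with an arbitrary threshold):

```python
def edge_map(gray, threshold=50):
    """Mark pixels where horizontal or vertical intensity change
    exceeds `threshold` -- a crude stand-in for the gradient
    features real detectors extract. `gray` is a list of lists
    of 0-255 intensity values."""
    h, w = len(gray), len(gray[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            dx = abs(gray[y][x + 1] - gray[y][x - 1])
            dy = abs(gray[y + 1][x] - gray[y - 1][x])
            if max(dx, dy) > threshold:
                edges[y][x] = 1
    return edges

# A dark square on a light background is found from intensity alone,
# with no color information at all.
img = [[200] * 8 for _ in range(8)]
for y in range(3, 6):
    for x in range(3, 6):
        img[y][x] = 30
edges = edge_map(img)
print(sum(map(sum, edges)) > 0)  # True
```

Note this only shows that contrast carries the signal; it does not settle the debate below about cases (like a white truck against a bright sky) where contrast itself is absent.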

Well, their RCC camera (grayscale in the red color channel) was very likely a contributing factor in the fatal Florida accident — as Tesla stated that the AP system couldn’t distinguish a white truck against a bright (blue) sky.

I would expect their next-gen (in-house) camera to be full color.

I would expect Tesla to stop trying to use the inferior system of optical object recognition to “map” obstacles for Autopilot, and move to an active scanning system such as radar or lidar. And when I say “radar”, I don’t mean the weak and extremely short-ranged radar systems in current use on Tesla cars. I mean something that can reliably detect moving cars and other obstacles out to at least 100 yards ahead of the vehicle, and 50 yards in other directions.

There is a reason why Google and Lyft self-driving cars use scanning lidar. Tesla’s current sensor systems just won’t cut it for truly autonomous driving.

Optical object recognition is what humans use for driving, and it is still superior to anything else out there, in any weather conditions. This is proof that it is only a matter of the right algorithms and data processing. Lidar would be defeated easily by snow, fog and rain, or bugs splattered on the lens. And while I am no tinfoil-hat loonie or treehugger, I am not sure I really want every single car on the road to beam powerful radar at me, in addition to all the other electromagnetic crap I am already exposed to.

Good points.

Actually a hybrid system would work best. Contrast AND phase-based optical detection systems are easily tricked by areas of low contrast like a white truck against a white sky. But if you could combine the best of both worlds…

mhpr262 said:

“Optical object recognition is what humans use for driving…”

Humans have better visual acuity than nearly all other animals on Earth. (I think at least some raptors (hawks, falcons, eagles) beat us.) If and when computers are equipped with visual image processors as sophisticated as ours, then what you said there will become important… and not until then. Speaking as a computer programmer: don’t hold your breath on that.

“…and it is still superior to anything else out there, in any weather conditions.”

Google’s engineers say the scanning lidar on their self-driving cars is significantly better at penetrating fog. I presume they know what they’re talking about.

Whoever was driving that Tesla sure deserved to lose it in an accident, rudely running a stale yellow light like that. Hopefully the person he hit did not have serious injuries. But I hope he sues the pants off of the a-hole Tesla owner, regardless.

There is no such thing under the law as a “stale yellow light”. It is entirely 100% legal for the Tesla driver to enter the intersection while the light is yellow.

People like you, who mistakenly believe they have the right to illegally enter an intersection when the driver with the yellow light has the right of way, are the cause of accidents like this.

Agreed. Yellow lights are perfectly legal (haven’t seen footage)

The way traffic lights work in the US is so inferior to the UK’s. Our lights have to turn fully red for one direction before one of the other routes begins to change – sometimes with an additional delay.

This delay does two things: firstly, it gives time for the junction to clear, and secondly, it gives vehicles approaching at speed plenty of time to either stop or run the junction while it is in the process of turning red.

Our lights also change to red + amber before going green which gives people waiting to enter the junction advance warning that the lights are about to change green and to check for hazards before proceeding.

In many parts of the US, that would just encourage even more left turning drivers to enter the intersection after the light turns red. In many places, people judge whether they can make a left turn based upon whether they think they can get through the intersection before traffic starts moving, not what color the light is.

It is sad, and illegal, but true. In many places, the reality is that the only time you can make a left-hand turn is after the light turns red. It is hard to explain.

Classic left turn collision.

The light turns yellow, and the person making the left turn across traffic assumes that oncoming traffic have all applied their brakes as soon as the light turns yellow.

Meanwhile oncoming traffic sees the yellow light and calculates that they can legally enter the intersection before the light turns red, and legally proceeds through the intersection. (you can see in the pictures that the Tesla driver indeed did enter the intersection legally while the light was yellow — center light is on.)

The driver making the left turn fails to yield the right-of-way to the oncoming traffic and illegally enters the intersection on the yellow light and causes a collision.

If it goes to court in a jurisdiction where there is comparative fault, and the oncoming car was speeding, a judge may assign 0% to 49% of the blame to the speeding driver. But short of proof of speeds high enough to count as “reckless”, the primary fault (51%+) will always lie with the driver who was making the illegal left turn.

I can’t imagine a judge doing anything but assigning 100% fault to the car making a left hand turn, at a red light, into oncoming traffic.

I wish I were technical enough to understand the tools that take advantage of what Jason has cracked. Hughes, himself, is at another level.

I need a camera.

Yeah, coming to the same unfortunate conclusion as well.

See my post above. A delay solves the problem. If you applied it to the video above, the person making a turn wouldn’t get a green light until AFTER the Tesla’s light had turned red, preventing the entire accident.

You are missing the point that the light turned from green to yellow in both directions at the same time. The Tesla driver had the right of way for as long as it was legal for him to enter the intersection. The driver coming from the opposite direction to make a left turn is obligated to wait until it is safe to do so. It’s not a situation where “all red” delays would have an impact.

Plenty could be said about the accident, but there’s very little else to say about the EV aspect of this other than what Loveday reported on the AP tech involved in the video (since AP tech isn’t known to have been involved in the crash), is there?

I wonder about the video records kept by other cars. I think the lane-keep feature of the Volt (and, presumably, other GM vehicles) uses some sort of optical analysis system.

Surprised automakers haven’t made “dashcams” a standard option at this point.

Would be nice to have something that was integrated into the car and would save the last 10 trips. Being able to save to a USB stick or transfer it over WiFi to a home computer would be nice.
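The “save the last 10 trips” idea amounts to a ring buffer that silently drops the oldest recording. A minimal sketch (the trip filenames and `TripRecorder` class are hypothetical, not any real dashcam API):

```python
from collections import deque

class TripRecorder:
    """Keep only the most recent `max_trips` recordings, discarding
    the oldest automatically -- the rolling-storage behavior the
    comment describes. Purely illustrative."""

    def __init__(self, max_trips=10):
        # deque with maxlen evicts the oldest entry on overflow.
        self.trips = deque(maxlen=max_trips)

    def save_trip(self, footage):
        self.trips.append(footage)

rec = TripRecorder(max_trips=10)
for i in range(15):
    rec.save_trip(f"trip_{i}.mp4")  # hypothetical filenames
print(rec.trips[0], rec.trips[-1])  # trip_5.mp4 trip_14.mp4
```

Exporting to a USB stick or over WiFi would then just be copying whatever the buffer currently holds.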

The brand new Citroen C3, coming in October, is going to have a dashcam as standard, hooked to social networking options.

Great, so when you have a crash your car uploads it to YouTube?

What’s a ‘standard option’? Do you mean just standard? Option implies it’s optional; standard implies it’s baseline.

I find it funny that Tesla spends so much time on Autopilot but they can’t just add a simple dashcam to all of their cars, which would actually be extremely useful, 110% of the time.

Agree. It would be a very simple and cheap option – just split the forward-facing camera feed and have an integrated but separate processing unit to store the video…