The result was obtained by running ADAS software on a video of the scene

The tragedy of the first pedestrian death by an "autonomous" vehicle is only a few days behind us. While many individual voices have been raised in discussion of the incident (we have an ongoing, wide-ranging discussion on the InsideEVs Forum), other suppliers of autonomous vehicle systems are now chiming in. One of them is Mobileye.

Images from a video feed watching a TV monitor showing the clip released by the police.


Once partnered with Tesla on its Autopilot program, the Intel-owned outfit has released an editorial (which you can read in full below) describing the result of an experiment it carried out. Using the advanced driver assistance system (ADAS) software already found in many modern cars, the company ran its detection software on the police video, played back on a TV monitor. Despite losing a fair amount of the high dynamic range information present in the actual scene, the system was still able to detect the pedestrian, Elaine Herzberg, a full second before impact.


With the car reportedly moving at 38 miles per hour, that one second would have given it roughly 55.7 feet in which to attempt to stop. It seems quite possible that, had such a system been active in this vehicle, Ms. Herzberg would be alive today. Of course, if Uber had not, reportedly, turned off the factory safety system in the Volvo XC90 PHEV, that system quite likely would also have detected the woman walking her bicycle across the road. It also goes without saying that if the car's operator had not looked away from the road for a full five seconds before impact, there is a chance this fatality could have been avoided.
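For readers who want to check the arithmetic: at 38 mph, a car covers about 55.7 feet per second, so a one-second warning buys roughly that much road. A minimal sketch of the conversion:

```python
# Sanity check of the numbers above: distance covered in the roughly
# one second between detection and impact, at the reported 38 mph.

MPH_TO_FPS = 5280 / 3600  # feet per mile divided by seconds per hour

speed_mph = 38.0                    # reported vehicle speed
speed_fps = speed_mph * MPH_TO_FPS  # ~55.7 feet per second

warning_time_s = 1.0                # detection lead time from the experiment
distance_ft = speed_fps * warning_time_s

print(f"{speed_mph} mph = {speed_fps:.1f} ft/s")
print(f"Distance covered in {warning_time_s:.0f} s: {distance_ft:.1f} ft")
```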

Besides discussing the importance of legacy ADAS software and how it works, the editorial, written by Amnon Shashua, senior vice president at Intel and CEO and CTO of Mobileye, also discusses the need for redundancy built into these systems, and outlines how Mobileye achieves "true redundancy" by building "a separate end-to-end camera-only system and a separate LIDAR and radar-only system."

While we lack the technical expertise to fully evaluate the quality of the Mobileye system, we can get behind the final portion of the editorial, in which Shashua calls for "automakers, technology companies in the field, regulators and other interested parties" to convene and discuss a safety validation framework for autonomous vehicles.

Here's the press release from Intel, owner of Mobileye:

Now Is the Time for Substantive Conversations about Safety for Autonomous Vehicles

Society expects autonomous vehicles to be held to a higher standard than human drivers. Following the tragic death of Elaine Herzberg after being hit last week by a self-driving Uber car operating in autonomous mode in Arizona, it feels like the right moment to make a few observations around the meaning of safety with respect to sensing and decision-making.

First, the challenge of interpreting sensor information. The video released by the police seems to demonstrate that even the most basic building block of an autonomous vehicle system, the ability to detect and classify objects, is a challenging task. Yet this capability is at the core of today’s advanced driver assistance systems (ADAS), which include features such as automatic emergency braking (AEB) and lane keeping support. It is the high-accuracy sensing systems inside ADAS that are saving lives today, proven over billions of miles driven. It is this same technology that is required, before tackling even tougher challenges, as a foundational element of fully autonomous vehicles of the future.

To demonstrate the power and sophistication of today’s ADAS technology, we ran our software on a video feed coming from a TV monitor running the police video of the incident. Despite the suboptimal conditions, where much of the high dynamic range data that would be present in the actual scene was likely lost, clear detection was achieved approximately one second before impact. The images below show three snapshots with bounding box detections on the bicycle and Ms. Herzberg. The detections come from two separate sources: pattern recognition, which generates the bounding boxes, and a “free-space” detection module, which generates the horizontal graph where the red color section indicates a “road user” is present above the line. A third module separates objects from the roadway using structure from motion – in technical terms: “plane + parallax.” This validates the 3D presence of the detected object that had a low confidence as depicted by “fcvValid: Low,” which is displayed in the upper left side of the screen. This low confidence occurred because of the missing information normally available in a production vehicle and the low-quality imaging setup from taking a video of a video from a dash-cam that was subjected to some unknown downsampling. The software being used for this experiment is the same as included in today’s ADAS-equipped vehicles, which have been proven over billions of miles in the hands of consumers.

Images from a video feed watching a TV monitor showing the clip released by the police. The overlaid graphics show the Mobileye ADAS system response. The green and white bounding boxes are outputs from the bicycle and pedestrian detection modules. The horizontal graph shows the boundary between the roadway and physical obstacles, which we call “free-space”.
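Mobileye has not published the internals of these modules, but the general pattern described above (independent cues voting on a detection, with a 3D-validation stage grading overall confidence) can be sketched in a few lines. The names, thresholds, and values below are purely illustrative assumptions, not the production system:

```python
# Toy version of the multi-module detection pattern described above.
# All names and thresholds are hypothetical, not Mobileye's API.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # e.g. "pedestrian", "bicycle"
    box: tuple        # (x, y, w, h) bounding box in pixels
    score: float      # pattern-recognition confidence, 0..1

def free_space_supports(det: Detection, road_user_columns: set) -> bool:
    """Does the free-space module flag a road user under this box?"""
    x, _, w, _ = det.box
    return any(col in road_user_columns for col in range(x, x + w))

def fuse(det: Detection, free_space_hit: bool, parallax_score: float) -> str:
    """Combine three independent cues into an overall validity grade."""
    votes = sum([det.score > 0.5, free_space_hit, parallax_score > 0.5])
    if votes == 3:
        return "fcvValid: High"
    if votes == 2:
        return "fcvValid: Low"   # detected, but with degraded 3D support
    return "rejected"

ped = Detection("pedestrian", (410, 220, 40, 90), score=0.74)
# Degraded video (a video of a video) weakens the structure-from-motion cue,
# which is why a detection can come through flagged as low confidence:
print(fuse(ped, free_space_supports(ped, set(range(400, 480))),
           parallax_score=0.3))  # -> "fcvValid: Low"
```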

Recent developments in artificial intelligence, like deep neural networks, have led many to believe that it is now easy to develop a highly accurate object detection system and that the decade-plus experience of incumbent computer vision experts should be discounted. This dynamic has led to many new entrants in the field. While these techniques are helpful, the legacy of identifying and closing hundreds of corner cases, annotating data sets of tens of millions of miles, and going through challenging preproduction validation tests on dozens of production ADAS programs, cannot be skipped. Experience counts, particularly in safety-critical areas.

The second observation is about transparency. Everyone says that “safety is our most important consideration,” but we believe that to gain public trust, we must be more transparent about the meaning of this statement. As I stated in October, when Mobileye released the formal model of Responsibility-Sensitive Safety (RSS), decision-making must comply with the common sense of human judgement. We laid out a mathematical formalism of common sense notions such as “dangerous situation” and “proper response” and built a system to mathematically guarantee compliance with these definitions.
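As a concrete example, the RSS paper ("On a Formal Model of Safe and Scalable Self-driving Cars," Shalev-Shwartz, Shammah and Shashua, 2017) derives a minimum safe longitudinal following distance from worst-case assumptions about both cars. A sketch of that formula, with illustrative parameter values rather than regulatory numbers:

```python
# Sketch of the RSS minimum safe longitudinal distance: the worst case is the
# rear car accelerating for its response time, then braking gently, while the
# front car brakes as hard as possible. Parameter values are illustrative.

def rss_safe_distance(v_rear: float, v_front: float, rho: float = 0.5,
                      a_accel: float = 3.0, a_brake_min: float = 4.0,
                      a_brake_max: float = 8.0) -> float:
    """Minimum safe gap in meters between a rear and a front car."""
    v_rear_worst = v_rear + rho * a_accel  # speed after the response time
    d = (v_rear * rho
         + 0.5 * a_accel * rho ** 2
         + v_rear_worst ** 2 / (2 * a_brake_min)
         - v_front ** 2 / (2 * a_brake_max))
    return max(d, 0.0)

# Two cars at 17 m/s (~38 mph):
print(f"Safe gap: {rss_safe_distance(17.0, 17.0):.1f} m")
```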

The third observation is about redundancy. True redundancy of the perception system must rely on independent sources of information: camera, radar and LIDAR. Fusing them together is good for comfort of driving but is bad for safety. At Mobileye, to really show that we obtain true redundancy, we build a separate end-to-end camera-only system and a separate LIDAR and radar-only system.
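One way to read "true redundancy" is that each channel must be sufficient on its own, so the safety-critical decision can be gated at the decision level rather than on fused raw sensor data, where one modality's blind spot can contaminate the whole pipeline. An illustrative sketch, with hypothetical stand-ins for the two channels:

```python
# Illustrative only: decision-level redundancy between two independent,
# end-to-end perception channels. The functions are hypothetical stand-ins.

def camera_channel(scene: dict) -> bool:
    """Stand-in for an end-to-end camera-only perception system."""
    return scene["pedestrian_visible_to_camera"]

def range_channel(scene: dict) -> bool:
    """Stand-in for a separate LIDAR-and-radar-only perception system."""
    return scene["pedestrian_visible_to_lidar_radar"]

def should_brake(scene: dict) -> bool:
    # Decision-level OR: either independent channel alone can trigger braking,
    # so a failure in one sensing modality cannot suppress the other.
    return camera_channel(scene) or range_channel(scene)

# Low light defeats the camera, but the range sensors still see the pedestrian:
night_scene = {"pedestrian_visible_to_camera": False,
               "pedestrian_visible_to_lidar_radar": True}
print(should_brake(night_scene))  # True
```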

More incidents like the one last week could do further harm to already fragile consumer trust and spur reactive regulation that could stifle this important work. As I stated during the introduction of RSS, I firmly believe the time to have a meaningful discussion on a safety validation framework for fully autonomous vehicles is now. We invite automakers, technology companies in the field, regulators and other interested parties to convene so we can solve these important issues together.

Professor Amnon Shashua is senior vice president at Intel Corporation and the chief executive officer and chief technology officer of Mobileye, an Intel company.

Source: Intel