Autonomous Driving Deserves Our Support Or At Least Understanding

Tesla Autopilot


It will be years before self-driving cars are consistent and viable, but in the meantime, we should support adoption of the technology.

Some History:

Speed control for automobiles has been in use since the early 1900s; James Watt and Matthew Boulton used the same principle as far back as 1788 to govern their steam engines. Modern cruise control was invented in 1948 by mechanical engineer Ralph Teetor, out of frustration at riding with his lawyer, who would speed up and slow down while talking. Teetor's system was first introduced in 1958 with a speed dial on the dash and was called, wait for it, "Auto pilot."

We've long been aware of the advantages of cruise control. It helps prevent driver fatigue and maintains a constant speed, which also aids fuel efficiency. A big disadvantage is that it can increase accidents by leading the driver to pay less attention. Yet we have used the tool for 50+ years, and it is either standard or an option on every vehicle today.

Vehicles with adaptive cruise control are considered Level 1 autonomous driving. The technology was first introduced in 1992 by Mitsubishi as a lidar-based distance control marketed as a "distance warning." Adaptive cruise control comes in many variants, which differ in their application. In the most general terms, it sets the vehicle's maximum speed and slows the vehicle based either on that set speed or on the distance to the vehicle being followed. Lane detection is a gray area as to whether a system moves into Level 2. At Level 1, the vehicle cannot steer and control speed at the same time.

Tesla Autopilot

When adaptive cruise control is combined with steering control, lane detection, and in some cases features capable of certain maneuvers such as lane changes, it is considered Level 2 autonomous driving. There are, and will be, many versions offered by many manufacturers, achieved with many different technologies. The common thread is that they are all driver-assist tools that require constant awareness from the driver. Though Level 2 may have the appearance of a hands-free experience in which the vehicle is finally in control, this is never the case. Control of the vehicle remains the full responsibility of the individual, and the system should be treated as an advanced form of cruise control.

Level 3 is the point where the vehicle begins to monitor its environment. At this level, the vehicle is STILL not in charge, although it becomes adaptive to an increasingly broad, but not complete, range of driving conditions.

Level 4 is the first point where the vehicle is in control of its environment, but it will be restricted to geofenced or otherwise limited areas.

Level 5 is the point where the autonomous system can finally handle every situation a human driver could.
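For quick reference, the level descriptions above can be collapsed into a small lookup table. This is an informal paraphrase of the SAE J3016 taxonomy as described here, not the standard's official wording, and the `responsible_party` helper is a hypothetical illustration:

```python
# Informal paraphrase of the SAE driving-automation levels described above.
# See SAE J3016 for the authoritative definitions; wording here is a sketch.
SAE_LEVELS = {
    0: ("No automation", "driver"),
    1: ("Steering OR speed assist (e.g. adaptive cruise control)", "driver"),
    2: ("Steering AND speed assist; driver must constantly supervise", "driver"),
    3: ("Vehicle monitors environment; driver must take over on request", "driver"),
    4: ("Full self-driving within geofenced or restricted areas", "system"),
    5: ("Full self-driving in every situation a human could handle", "system"),
}

def responsible_party(level: int) -> str:
    """Who is ultimately responsible for the driving task at a given level."""
    return SAE_LEVELS[level][1]

print(responsible_party(2))  # → driver: Level 2 is still a driver-assist tool
```

The table makes the article's central point mechanical: responsibility does not move from "driver" to "system" until Level 4.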

Waymo Jaguar I-PACE

As the technology advances from driver-assist tools to full autonomy, responsibility shifts from the driver to, in many cases, the owner or manufacturer. What some seem eager to change aggressively is that line of responsibility. That's fine, as long as it is done from an informed position. With this base understanding, and with fatalities continuing to make headlines, it is time to address the issue head-on.

We need to be Okay with autonomous vehicles that crash

According to the RAND Corporation's model of automated vehicle safety, the results suggest that more lives will be saved the sooner autonomous vehicles are deployed. The report compared two scenarios: in the first, HAVs (highly automated vehicles) are released when they are only 10% safer than humans; in the second, deployment waits until HAVs are 75%–90% safer than humans. The results show that more cumulative lives are saved with an early release, and the study suggests the cumulative lives saved could be as high as half a million on US roads alone. The final recommendation:

This evidence could help decisionmakers to better balance public skepticism with evidence about the human cost of different choices and to set policies that save more lives overall. Deploying HAVs when their safety performance is just better than that of the average human driver may be too permissive given social expectations about the safety of robots, machines, and other automated systems, but waiting for improvements many times over or waiting for perfection may be too costly. Instead, a middle ground of HAV performance requirements may prove to save the most lives overall.
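The RAND result can be illustrated with a toy calculation. The sketch below is NOT RAND's actual model; every number in it (baseline deaths, learning rate, instant fleet conversion) is an illustrative assumption, chosen only to show the mechanism: a fleet deployed early at a modest safety gain accumulates years of improvement that a later, safer fleet never gets back.

```python
# Toy sketch of the early-vs-late deployment trade-off. All parameters are
# illustrative assumptions, not figures from the RAND report.

BASELINE_DEATHS = 37_000   # assumed constant US annual road deaths
YEARS = 30                 # horizon of the comparison

def cumulative_deaths(deploy_year, initial_gain, learning_rate=0.05, cap=0.90):
    """Total deaths over the horizon. From `deploy_year` on, HAV miles are
    `initial_gain` (0..1) less deadly than human miles, improving by
    `learning_rate` per deployed year (fleet learning) up to `cap`.
    Fleet conversion is treated as instantaneous, a deliberate simplification."""
    total = 0.0
    for year in range(YEARS):
        if year < deploy_year:
            gain = 0.0  # still all human drivers
        else:
            gain = min(cap, initial_gain + learning_rate * (year - deploy_year))
        total += BASELINE_DEATHS * (1 - gain)
    return total

early = cumulative_deaths(deploy_year=0, initial_gain=0.10)   # 10% safer, now
late = cumulative_deaths(deploy_year=15, initial_gain=0.90)   # 90% safer, later
print(f"early: {early:,.0f}  late: {late:,.0f}  saved: {late - early:,.0f}")
```

With these made-up inputs, the early scenario ends up roughly 250,000 cumulative deaths ahead over the 30-year horizon, echoing the study's qualitative conclusion; changing the assumptions changes the size, and potentially the sign, of the effect.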

GM Produces Self-Driving Chevy Bolt EV Test Vehicles

If you have a fear of flying or play the lottery, then you probably don't separate probability from possibility. For many, this may well add to public skepticism when it comes to autonomous vehicles. That is why it is important to get the studies and data out front. On the other hand, more and more vehicles equipped with autonomous tools are entering the highways whose drivers have no idea what level of autonomous driving they have actually purchased. This should come as no surprise, for if we don't understand the differences between levels, how can one possibly know what to expect from the driving experience? We needn't argue over the difference between Level 3 and Level 2+, if that is even a thing; we simply need to be aware of the specific tools available in each vehicle and be acutely aware of what they do and how to control them.

Approximately 1.25 million people die each year in automobile accidents, an average of more than 3,400 per day. Preventing those deaths would be the equivalent of preventing a 9/11 attack every day, as well as preventing all future such attacks, if we can solve the problem. Additionally, 20–50 million people are injured or disabled in vehicle accidents each year, and all accidents together exceed an annual cost of $500 billion. For these reasons, the RAND recommendation deserves our attention.

While it was tragic to learn last week of yet another person killed when their car collided with a tractor-trailer, we should not lose sight of the fact that they were one of the thousands who died that day on roads globally. Even if Autopilot was engaged, Autopilot is still a Level 2 system and not in charge of its environment. No one knows exactly how many years we are from Level 5 autonomous driving. We do know that the next five years will see great advancements in Levels 1 through 3, and that, when these tools are used appropriately, studies suggest accidents will be prevented.

There will be skepticism as long as computer-related accidents occur, but we should not lose focus on the end result of lives saved. The exact number will be hard to calculate, but eliminating the pedestrian lives taken, the lives taken by speeding, the lives taken by drunk drivers, and the lives taken by human distraction deserves our support. At a minimum, autonomous driving deserves our awareness of the data and our understanding of the technology.

Categories: Crashed EVs, EV Education, Tesla

33 Comments on "Autonomous Driving Deserves Our Support Or At Least Understanding"

By the end of 2021, Tesla should be able to make 500,000 vehicles annualized at each of Giga 1, Giga 3, and Fremont, and therefore make 1.5 million robotaxis, replacing about 10 million sales of ICE vehicles, since each robo Tesla can drive 100k miles per year instead of the usual 15,000 miles for non-self-driving vehicles.

Your timeline sounds way too optimistic. True autonomy without a driver present is probably still a decade away.
I would be happy if we get actual SAE Level 3 vehicles in customer hands by 2021.

Teslas won’t be robotaxis in 2021.

Musk expects feature-completeness by the end of 2019 and much-better-than-human driving by the end of 2020. After that you need one more year to document that it is superhuman, and that is 2021. Thereafter, authorities should approve by early 2022, and Tesla can make vehicles without any human driver controls that can only be operated fully autonomously.

Clearly you are talking about Elon time, which is when Elon wants it completed, not the actual time it takes to complete.

Tesla still hasn’t even done the cross country drive yet…

Musk has made self-driving promises since 2013. Here's a well-reasoned article by a Tesla owner who also happens to know a lot more about self-driving than you, me, or Musk. Be sure to read the update he links to as well:

Yes, this is a fantastic article and worth the read. My two critiques are that he purchased Enhanced Autopilot, not Full Self-Driving. He compares it to advanced cruise control, which is correct, but you can hear on multiple occasions that he has studied self-driving and speaks as if he is waiting for this to turn into self-driving. He clearly knows the difference, and makes reference to people who want more out of it than it will ever be. I am of course referring to Enhanced Autopilot.

He makes another statement with which I agree, questioning whether Tesla FSD will ever live up to true Level 5. I have my doubts about this as well, but we have to stop referring to Enhanced Autopilot as full self-driving. You will never get an automatic update from Tesla to do this without paying the difference. His entire review is of Enhanced Autopilot, which is advanced cruise control. As a regular user of the product, it really is advanced cruise control and a joy to use, but you are kidding yourself if you are treating it as hands-free, and we hurt the advancement of…

Not so sure a Tesla robotaxi makes sense… It'll be just an "appliance," so having a Chinese-built passenger van makes more sense…

Robotaxi needs easy ingress/egress, package loading, swivel seating and easy-clean surfaces.

I believe that fully autonomous cars are close to impossible. It is true that autonomy can do better in areas where humans are weak. It is also true that autonomy can do poorly in areas where humans are strong.
Also, we have to remember that computers sometimes act up with no warning, freezing up or glitching. This can happen in dangerous situations.

With driving there are too many variables to deal with, but I think the technological boundaries should still be pushed. This will result in a better product to assist drivers. I think a combination of the two (driver and autonomy) will be more practical, with each applied in the areas where it works best.

Full autonomy, I believe, is possible with transportation like airplanes and ships, where there are few or no variables.

Just my opinion

I could not care less about autonomy in my car; I find the concept mostly stupid. For trucks or Uber, that would be another matter.

I won't dismiss your opinion; I think it's valid, and if we don't appreciate that many folks have an aversion to something they've never used, then autonomy will feel like it's being shoved down the throats of the masses. That said, if you think about all the times you wish you didn't have to worry about driving, like when you're wiped out on a long road trip, or when you'd otherwise have to tie someone up to drop you off or pick you up at a destination, then the option of autonomy may sound pretty attractive. At least pretty nice to have as a back-up option. Just a thought.

I have no use for it, and I'm pretty sure I would not be comfortable in a fully autonomous car, as I am quite uneasy in a plane if I am not piloting it.
Anyway, so far FSD (especially Tesla's) has been a scam.

It will be life-changing for many people who cannot drive anymore but still want to participate in society without having to completely depend on others.

FSD enhances safety whether used or not. The vehicle could still give warnings or even take over in dangerous situations. AEB already does this.

Personally, I look forward to the day when I don’t have to drive. I already don’t own an encyclopedia on my shelf or a land-line phone.

I have a 2014 Nissan/Infiniti Q50 with the Level 2 driving features; it's already amazing just to have the driver assist. I already rely on it for my daily commute, and I wouldn't ever buy another car without it. These features will get better and better until they are indistinguishable from self-driving. It may never get to claim that it's full self-driving, but soon enough cars will be capable of everything but taking responsibility!

“We need to be Okay with autonomous vehicles that crash”

No, we don't. If FAD were introduced tomorrow, most of the accidents would be caused by construction, poorly marked roads, and other such issues. Right now, construction crews rely on cars being driven by humans. FAD would require road crews to have access to computer update systems that can mark problem areas and warn people in advance that they will have to take control. It would also require that automated vehicles tour streets and check for proper markings, just as rail companies today run "checker" cars that measure the quality of the rails. FAD cars are going to have to be limited to properly checked streets.

We are a whole infrastructure away from FAD. No computer is going to be able to solve all the issues.

That is an important point, because autonomous drive systems seem to depend heavily on reading road markings, signals, and signs. Yet there is very little discussion on the markings and maintenance aspects of roads when it comes to autonomy.

Perhaps we will come to a point where autonomy is only enabled on roads that meet marking standards based on known inspections. This will have a major impact on state and municipal road maintenance. It could also have a dramatic impact on the successful implementation of autonomous driving.

Also, if you look at the data over 90% of fatalities are in low to middle income countries. I’m guessing much of that has to do with the poor infrastructure. If you have ever visited one of these countries you’ll see what I mean.

I disagree. AI has already surpassed human performance in narrow cases. It is only a matter of time until an AI can drive a car competently, using existing infrastructure.

First, robocars will be geofenced to narrow the calculations. Eventually they will be better than the best human drivers in all situations.

Looking at the source for this data, I found this tidbit:
"Over 90% of all road fatalities occur in low- and middle-income countries, which have less than half of the world's vehicles."

Shouldn't the focus be there, if reducing deaths is the purpose of automated driving?

Much more work needs to be done on cars interactively communicating with each other, not just on automated control of one's own vehicle.

Airplanes are a good example of this: autopilot works well, but the addition of TCAS (collision avoidance) has saved hundreds of lives in the air.

I have no problem with self-driving being developed and worked on.

I do have a problem with overly aggressive claims and misrepresentation of the data.

The entire spin of using one or two systems to compare against the overall accident or death rate is simply foolish.

There are at least 9 models with a zero fatality rate according to an IIHS study. If we used those 9 models to spin the rate, then we could say that driving those models/brands is FAR SAFER than any self-driving system out there, so we don't need to develop self-driving at all; we just need to drive those 9 models. That would be a fallacy. So please don't use such selective data to justify whether the system is ready or not. Let us look at its true capability in all situations as the determining factor.

Climate discussions are proof enough that studies and data will not change the actions or beliefs of a very wide swath of the public, and in many cases challenging them on it has the reverse impact… forcing them to dig their heels in deeper.

Self-driving cars right now are a mere curiosity. If/when the discussion changes to that of widespread adoption… or even worse, forced adoption… I expect the backlash of negative publicity and comments to become wildly overwhelming.

The WHOLE purpose of driving, for all true petrolheads, is to hurtle yourself over the surface of our Earth and put Isaac Newton in the driving seat. THAT is the joy of driving.

An AI-unit will not allow for that individual freedom and expression. If you’re into autonomy for cars, you should ride the bus as a daily driver.

Until there is a bus that goes where I go, it won’t happen.

Driving in a commuter situation rarely reaches the point of testing Newtonian physics, unless you crash. Driving a high-horsepower car is exceedingly frustrating under these conditions.

A point and shoot car would be handy.

Are there any studies on the possible negative effects of autonomous cars with respect to extra traffic? If an autonomous car relieves the driver of needing to focus on the road, and they can do work instead, or watch TV, or whatever, then I can absolutely see a scenario where more cars are on the road because traffic isn't as much of an issue. While nobody enjoys sitting in it, if you can leave work a half hour earlier, or leave for work a half hour later, because you're doing work in the car, I could definitely see people opting for their own car versus, say, one of the corporate shuttles or train options.

Single-occupancy cars, even if they're EVs, are still wildly inefficient, and we should not be doing things to encourage them, in my opinion. It still takes energy to fuel a car and to make it, and until decades from now when all cars are EVs, it makes the gas cars sit in traffic longer too. So while you are not causing local pollution, you and the 100 others like you in single-occupancy cars versus a train car are making things net worse, because the gas cars are generating…