Autonomous Chevrolet Bolt Keeps Getting Hit By Human-Operated Vehicles

General Motors Chairman and CEO Mary Barra shows an autonomous Chevrolet Bolt outfitted by Cruise Automation.

According to GM, its fleet of self-driving Chevrolet Bolts has more than doubled, which means more accidents … but fortunately, the Bolt has not been at fault.

GM’s Cruise Automation has increased the number of autonomous Bolts testing on California roads to 100 over the last three months. Prior to this ramp-up, the company was testing only 30 to 40 self-driving units. Now that there are so many robo-Bolts on the road, there have been more reports of minor crashes, all of which were caused by humans operating cars and bicycles. GM Cruise spokeswoman Rebecca Mark said:

“All our incidents this year were caused by the other vehicle.”

Autonomous Chevy Bolt EV out testing in San Francisco (via Glenn L)

Just over the course of September, the Bolts have been involved in six minor incidents, none of which they caused. The tests are taking place on the busy roads of San Francisco in order to prepare the self-driving vehicles for real-world situations and urban stop-and-go traffic.

The accident situation is something we also saw early on when Google was testing prototypes. Just because these cars use artificial intelligence and are programmed not to “hit” people, cars, or bikes, among other things, doesn’t mean they are accident-free. In fact, since many humans don’t obey traffic laws, aren’t used to the robo-vehicles, and often make errors, accidents are likely. Most, though, have been minor in nature, and no one has been hurt.

The crashes have all been described as the result of other drivers running into the self-driving Bolts while they were slowing for stop signs or pedestrians. People are impatient with the autonomous cars and try to hurry around them, or don’t want to come to a complete stop. This is not unlike the way we see drivers act around driver-training cars. Other accidents were the product of people using their devices and not paying attention to the road in front of them. A final incident involved the Bolt stopping for a drunk bicyclist; the biker still ran into the Bolt’s bumper while it was stationary. GM concluded in a recent statement:

“While we look forward to the day when autonomous vehicles are commonplace, the streets we drive on today are not so simple, and we will continue to learn how humans drive and improve how we share the road together.” 

For all of 2017, the Cruise Bolts have been in 13 reported collisions (California requires that all crashes be reported, regardless of severity). None have resulted in serious damage to the vehicles or human injuries. Meanwhile, Alphabet’s Waymo self-driving cars have been involved in three crashes. Nonetheless, Automotive News says that investors are watching GM’s progress and following the reports, and they must be impressed, because shares have increased by 17 percent in the past month.

Source: Automotive News

Categories: Chevrolet


56 Comments on "Autonomous Chevrolet Bolt Keeps Getting Hit By Human-Operated Vehicles"


Drinking and biking…I guess that’s actually a thing. Lol

A decade ago in NH, someone was arrested for it. It received a lot of press at the time because, I think, he was an anchor or weatherman for one of the NH TV news stations.

I guess that’s a BWI? 🙂

“All our incidents were caused by other people”: this is like all the other “truths” GM, The Stealership & Fossil Fuel Advocate, has told the public in the past. If you believe this, you’ll believe anything.

And of course the TSLA fanboi/GM trolls eventually show up.

L’amata — All the accidents were well documented. The Bolt was at a stop and wasn’t at fault. You are a relatively new poster, so you might not have had a chance to read all the stories, but they are all in the archives if you want to confirm.

I don’t think he’s a relatively new poster. I think he’s “EV nut” under the latest of his many pseudonyms.

So you think this is fake news, eh?

Judging from your strong reaction, I’d say you were probably the drunk guy on the bike. 😉

In SF? Your car will get hit if you simply park there. People here cannot drive.

Some of the streets are just impossible to drive on, so I can see how that could happen.

No reason for GM to lie about this as they are still learning about the tech…and this has nothing to do with their distribution network.

Drinking and everything is a thing.

In the Netherlands we also call a bike “the beer car”.
That’s how we got to and from the bars.

Actually, a lot of people have an old beer bike especially for this reason: one you don’t mind too much if it gets stolen or if you crash it.
Me and my mates all had one in our teenage drinking years.

Tesla & others’ self-driving AI software, internal analysis (conversation):

“What’s that Wobbly thing ahead?”

“I don’t know! It doesn’t fit my object look up table!”

“I know, it’s a ‘Beer Bike’! I remember reading a blog about that, in the comments!”

“You read it? Who said you could do that?”

“I just read it! Nobody told me not to!”

“Auto 5 …. Is Alive!”

Letting your fledgling AI read the comments is how you get a Skynet. You don’t want a Skynet, do you? Don’t let your fledgling AI read the comments.

You are so right. AI and the internet don’t mix:

😆 😆 😆

Robert, thanks for that; you made my day!

“investors are watching GM’s progress and following reports and they must be impressed because shares have increased by 17 percent in the past month.” I’d take a Vegas bet that the overwhelming majority of people who invest in GM stock don’t even know a semi-self-driving Bolt exists…

Only need a few big-time hedge fund investors to buy into the stock to get it moving.

Well, I’m sure you know far more about manipulating stock prices than most of us. After all, that’s the only reason you’re here.

How are human operated Bolts doing? It seems 6% accident rate is incredibly high.

You can always play games with “legal driving” to cause accidents that aren’t your fault. This is how scammers used to sucker people into hitting the scammers’ crappy car, only to bilk insurance for large sums of money. Some driving habits, while legal, can promote more accidents. That’s true whether the driver is autonomous or human.

You would also need to take data only from highly congested areas like San Francisco, otherwise your control data isn’t legitimate.

Some controls are fine, though I’d like to see all data. If you tighten the controls enough that the human is driving exactly like the robot, of course they’ll have the same stat.

I’m saying you need to compare apple to apples.

True, but unlike apples, you can tune your criteria to any degree, thus making them identical and meaningless. One has to be careful in deciding what is a reasonable level of control.

The control needs to be the EXACT same experiment but with just one variable different.
That’s the definition of a control in an experiment. If you’re going to look at crash data of autonomous Bolt EV driving around San Francisco, then you need to compare it to human-driven Bolt EV around San Francisco. That is the only comparison that makes sense. Comparing it to drivers in BF-nowhere Montana would be pointless.

If only the real world were so simple, we would have perfect autonomous driving by now. Trying to identify the EXACT parameters is going to take you the age of the universe. At some point, you have to decide what is good enough for comparison’s sake.

I believe these are being driven a LOT more than a standard personal vehicle, so what you need to compare is the number of accidents per mile of city driving.

“How are human operated Bolts doing? It seems 6% accident rate is incredibly high.”

I have had my Bolt for 10 months (since Jan). Got rear-ended in the Bay Area this week by an idiot. Not hard to believe at all.

“How are human operated Bolts doing? It seems 6% accident rate is incredibly high.”

That’s a meaningless statistic, completely lacking in context. We’d need to know the accident rate over X number of miles to have any basis for comparison. However, that said, it does seem to suggest an accident rate higher than what Waymo is getting.

Reading between the lines of the article, I wonder if GM’s cars are slowing and stopping too suddenly when they approach a stop light or stop sign. They could legally be “in the right” by slowing or stopping suddenly, yet still not be driving safely.

What’s the 6% figure? Are you saying 6% of the vehicles got in a wreck in a month or something?

If you want meaningful figures you have to go by operating hours or miles driven. Otherwise vehicles which are driven more will appear to be driven less safely.
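A quick sketch of the normalization this comment is asking for. The 13 collisions and ~100 cars come from the article; the per-car mileage is a made-up assumption purely for illustration, not a figure GM has published:

```python
def crashes_per_million_miles(crashes, vehicles, miles_per_vehicle):
    """Normalize a raw crash count to a rate per million vehicle-miles."""
    total_miles = vehicles * miles_per_vehicle
    return crashes / total_miles * 1_000_000

# 13 reported collisions across ~100 cars (from the article);
# 5,000 miles per car is a hypothetical placeholder.
rate = crashes_per_million_miles(crashes=13, vehicles=100, miles_per_vehicle=5_000)
print(f"{rate:.1f} crashes per million miles")  # 26.0 crashes per million miles
```

The point being: the resulting rate swings entirely with the mileage assumption, which is why a raw per-vehicle percentage says so little.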

Will autonomous cars ever be capable of questionable evasive maneuvers?
Say a drunk SUV driver illegally turns the wrong way down a one-way street at night.
The SUV moving fast towards an autonomous BoltEV. Will the BoltEV be able to get out of the way by driving into someone’s driveway or lawn?

Volt#671 + BoltEV

Well, someday that drunk will also be in a self-driving car and the situation won’t happen.


??????. No, it will stop and wait to get hit. AI cars are not good.

“The SUV moving fast towards an autonomous BoltEV. Will the BoltEV be able to get out of the way by driving into someone’s driveway or lawn?”

There will always be unusual situations where a self-driving car would not react as creatively as a human driver. Fortunately, software engineers are not going to be required to program their cars to act that creatively.

“The thing to keep in mind is that self-driving cars don’t have to be perfect to change the world. They just have to be better than human beings.” — Deepak Ahuja, CFO of Tesla Inc.

But to address your specific case: Perhaps the autonomous car would do better to avoid the reckless driver by veering to the other side, rather than off the street, so long as there is no other oncoming traffic at the spot.

Makes me think of the movie “Total Recall”, where the cabs are all robots. “You’re in a Johnny Cab!”

It may not be too much longer before all Uber, Lyft, etc. vehicles are autonomous.

Soooo, when will autonomous driving software acquire the capability of “defensive driving”?

You mean with missiles? 🙂

(⌐■_■) Trollnonymous

No silly, with a MAC10 or UZI!

They don’t work well against cars, the calibers being so small (10mm and 9mm respectively). Also, the recoil will slow you down, making even more demands on the car’s AI. Best is “set it and forget it” missiles.

Just the other day, I wanted to practice such defensive driving, but unfortunately, missiles did not come with SparkEV.

Reminds me of Alan Dean Foster’s short story “Why Johnny Can’t Speed”, the story that inspired the game “Car Wars”.

The punch line is “Be safe. Drive Offensively.”


Your insurance rates will rise no matter who is at fault.

I’m curious to see whether, and how much, autonomous cars will be programmed to do something illegal when it is the only option, or the safer or more socially acceptable one.

Imagine for example a highway with a speed limit of 30mph but everyone is driving 60. Or (in Europe, where passing is only allowed on the left) when a car to your left is driving dangerously slow, yet you are by law not allowed to overtake him …

Plenty of situations can be thought of where other people force you to make illegal calls.

Instead of doing something illegal to address the issue of a misbehaving vehicle, when will they be able to alert the appropriate police (local, regional) to come intervene? That could be interesting.

This highlights the issue of mixing human drivers w/autonomous drivers. Humans suck at driving, so until that part is removed, I think we will still need the ability to override the autonomous features to get by crappy human drivers.

GM operates their cars in the chaos of San Francisco. That explains most of these accidents. GM went hardcore, and they will reap the benefits of this kind of advanced training.

I was about to type something similar.

These are being used in a major city center and are being used by actual employees as shuttles not only for work but for grocery trips, nights on the town, etc.

Rather than traditional self driving test cars that might be tested in a corporate campus, around quiet neighborhoods, or in purpose built mock cities intended to replicate actual roads.

I know how they feel. My Bolt got rear-ended on Monday while stopped to turn right.

It’s like I always say to people: “Look in the direction you are driving.” Arrggg.

“A final incident involved the Bolt stopping for a drunk bicyclist, but the biker still ran into the Bolt’s bumper while it was stationary.”

There, that’s all the proof we need that self-driving cars are too dangerous to be let out in the wild!
😀 😀 😀

Here’s a very relevant and interesting blog post that Cruise’s CEO posted last week. It includes some interesting statistic tables and some cool short videos showing the autonomous Bolts navigating by themselves through 6-way SF intersections with broken traffic lights and through a construction zone following sign and hand signal directions from humans.

Very interesting article.

JFK would be proud that they are doing testing in SF “not because it is easy… but because it is hard!”

Impatience, ignorance, imprudence: the 3 deadly “I’s” on American roads. Not a new thing but a constant human action as to anything that moves from skates to planes. Cause? Driving Anxiety Syndrome (DAS). How to tell if you have DAS? Easy. Driving without a schedule, speeding, tailgating and superb selfish attitude. Most have it but are not aware. Insidious and evil, the cause of all accidents! Best watch out self driving car programmers!

That seems like a lot of accidents for 60-100 cars. I mean, if that same ratio applied to all cars… well, that’s about 6 percent of the autonomous Bolts. There are about 253 million cars on the roads in America. So if the Robot Bolt car accident ratio applied to all cars, that would mean over 15 million accidents. Something’s not right.
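For what it’s worth, the arithmetic in this comment does check out; a quick sketch (the 253 million figure is the commenter’s, not verified here):

```python
fleet = 100                 # approximate number of autonomous Bolts on the road
crashes = 6                 # minor incidents in September, per the article
us_vehicles = 253_000_000   # US registered vehicles, per the comment above

rate = crashes / fleet              # 0.06, i.e. the "6 percent" figure
extrapolated = rate * us_vehicles   # what that rate would imply nationwide
print(f"{rate:.0%} of the fleet -> {extrapolated / 1e6:.2f} million accidents")
# 6% of the fleet -> 15.18 million accidents
```

Which is the commenter’s "over 15 million" — though, as others note above, a one-month rate on 100 heavily driven test cars isn’t directly comparable to a national fleet average.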

Up until last August, when I lived in Mountain View, I noticed that the Google cars would drive very slowly, well below the speed limit, and if you turned towards them, say, to start a left-hand turn (as they were coming in the opposite lane), they would just stop right in their tracks. It seemed to me that they were more a cause of driver frustration, and I would imagine they cause more of these accidents than this article is reporting.