First US Public Self-Driving Shuttle Launches In Las Vegas, In Accident Shortly Thereafter – Video

1 week ago by Mark Kane 47

Probably not the best kick-off news cycle for an autonomous debut.

America’s first public self-driving shuttle has launched in downtown Las Vegas, using a NAVYA Arma vehicle supplied from Michigan, where 25 are to be built for North American customers by the end of 2017.

AAA And Keolis Launch Nation’s First Public Self-Driving NAVYA Arma Shuttle In Downtown Las Vegas

The shuttle pilot project will give a quarter-million Las Vegas residents and visitors first-hand experience with an autonomous vehicle over the course of a year.

Unfortunately, the shuttle was involved in a minor accident about an hour after it launched, when a truck backed into it (the shuttle was not found at fault, but probably could have done a better job avoiding this accident – more/video on that below).


“Look Ma, No Driver”


It’s pretty cool to see the first all-electric, autonomous vehicles on the road (in this case on a fixed route and at low speed for now), although it will take some time for the technology to match a human driver – at a reasonable cost.

“In addition to studying how the shuttle interacts in a live traffic environment in downtown Las Vegas’ busy Innovation District, AAA will survey riders on their experience in order to understand why a large percentage of consumers remain wary of driverless technology, and whether a personal experience changes their perception. AAA partnered with the city of Las Vegas, the Regional Transportation Commission of Southern Nevada (RTC) and Keolis North America (Keolis), which will operate and maintain the NAVYA Arma fully electric shuttle.”

“The shuttle is manufactured by NAVYA, comes equipped with LiDAR technology, GPS, cameras, and will seat eight passengers with seatbelts. Safety features include the ability to automatically and immediately brake in the event of a pedestrian crossing in the path of the vehicle. In addition to surveying the shuttle’s riders, AAA will examine how others sharing the streets react to it — including pedestrians and cyclists.  AAA chose Las Vegas for the launch because of the state’s progressive regulations on autonomous vehicles, heavy investment in innovation, the high volume of visitors and a sunny, dry climate that’s favorable for testing new driving technology.”

As noted, the autonomous shuttle had a minor accident on its first day of use, when a semi-truck backed into it. It was reported that the shuttle stopped when it detected the truck, but the semi continued to reverse…ultimately right into the side of the passenger vehicle, unaware it was in its path.

While the truck was cited at fault, the first question one thinks of in this situation (as did one of the passengers recounting the event in the news video below) is: “Would an aware human driver not only have noticed the truck, but taken further corrective action to avoid it? Or at least beeped the horn?”  The answer, of course, being: yes.

A human operator was also in the vehicle to oversee operations (and perhaps give a live tour narration), but was unable to intervene to avoid the fender-bender.  Clearly some further kinks need to be worked out.

Video (below): Further details, and an ‘on the scene’ response from a representative of the shuttle service from Keolis

How the Self-Driving Shuttle Pilot Program Works

Covering a 0.6-mile loop in the Fremont East “Innovation District” of downtown Las Vegas, the all-electric, self-driving shuttle offers free rides for people to experience autonomous transportation in a real-world environment. The shuttle is the country’s first autonomous shuttle to be fully integrated with “smart-city” infrastructure, communicating with traffic signals to improve safety and traffic flow. The shuttle is operated and maintained by Keolis, which also led the efforts to integrate its vehicle into the smart-city infrastructure, in partnership with the city of Las Vegas and NAVYA.

The shuttle can be boarded at any of the autonomous-vehicle shuttle’s three stops located on Fremont Street and Carson Street between Las Vegas Boulevard and 8th Street.

AAA is proud to work with the Las Vegas community on this program. The auto club will donate $1 per passenger during the pilot program, for a minimum donation of $100,000, to the Las Vegas Victims’ Fund and its efforts to support the needs of people impacted by the Las Vegas mass shooting of October 1, 2017.


47 responses to "First US Public Self-Driving Shuttle Launches In Las Vegas, In Accident Shortly Thereafter – Video"

  1. God/Bacardi says:

    There was a driver who was present who could take over…

    1. Jay Cole says:

      Yet didn’t. You can see in the first video the ‘driver’ is just hanging out with the passengers giving a speaking tour while the vehicle is in operation…he isn’t in the front seat actively monitoring the road and driving as a full-time operator would. It would be kinda like being in the back seat of a Model X having a conversation with a passenger – not exactly ‘present’ in the driving environment.

      This is kinda the fundamental flaw/debate with any high-functioning autonomous vehicle that has a disclaimer/requirement for a human driver to be present and interact with the system. Over time, one gets used to the autonomy doing its job, which builds an expectation of operation…until it doesn’t. At which point your normal ‘human reactions/awareness’ are not active, or are lessened to a degree (or several).

      As a result, we see a lot of incidents where the autonomous vehicle isn’t at “fault” legally, but it isn’t as qualified to drive competently, as compared to a human driver, in these “aw, crap” situations.

      So, the debate is (as always), should ‘high functioning’ autonomous vehicles that can do “99.9% of the driving” not be certified for road use until they are “100%” functional and are approved to be driver-less…because the human intervention success factor in those .1% of times really isn’t so great. Does the ‘better’ driving in the other 99.9% situation negate the .1% of ‘moron’ driving?

      It’s not that advanced autonomous vehicles should not be used, or that they are not the future…but should they be used by the general public before they are 100% certified/capable of at least duplicating and reacting to normal/expected human operation – because when they hit the road, that is the environment they are entering.

      In other words, is the ‘text disclaimer’ asking a driver to not be a human being (and stay focused-up) when using 99.9% autonomy adequate? Does having an unengaged ‘human nanny’ present really alleviate those .1% issues where the AI needs to ‘think outside the box’ to avoid an accident or harm to others?

      I don’t have the answer, nor am I qualified to make that kind of determination. But personally, I’d like my autonomous vehicle to have the skills (and competency) of Jason Statham in The Transporter in its back pocket if it needs it – that is ultimately what we are going for here. Way more convenience? Sure. But way more safety too…and not necessarily in that order.

      I want to sit in the car and be impressed, as in “wow, this thing drives way better, more intuitively than me” when it encounters a complex/active situation; not thinking “well, that was kinda weird/slow/dangerous” when an unusual situation pops up. So far, I’ve only experienced the latter.

      1. DougB says:

        Do these vehicles have to pass a test? Would seem that such systems should have some state mandated on the road test outside of the proving grounds and on unfamiliar roads.

        1. God/Bacardi says:

          Stated they tested the tech for two weeks on closed roads this past Jan…

          https://www.theverge.com/2017/1/11/14244732/las-vegas-navya-autonomous-self-driving-shuttles-test

        2. Pushmi-Pullyu says:

          Government-mandated pass/fail tests of a system under active development are pointless. By the time the test is developed and implemented, the system being tested will be far different from the one the test was designed for.

          The kind of testing you’re describing always follows a technological development; it never precedes it.

      2. pjwood1 says:

        Jay, I completely get your point, but you say “99.9%” a number of times, when this vehicle couldn’t even make it through its first day.

        Maybe somebody else will start “InsideAVs”. Like, some kind of AI-bot.

      3. ffbj says:

        That’s why they shut down the Google car – because drivers were falling asleep.

        I think they will now put a proximity alert on the van so it beeps/honks if it detects another vehicle about to hit it.
        It detected the truck and stopped; it should honk as soon as it detects that stopping is not going to matter, because the approaching vehicle will still hit it.
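The rule described here amounts to a time-to-impact check. A purely illustrative sketch – the function, speeds, and thresholds below are invented, not anything from NAVYA’s actual system:

```python
# Hypothetical sketch of a "honk when stopping won't help" rule.
# All values and thresholds are invented for illustration.

def should_honk(own_speed: float, closing_speed: float, gap_m: float) -> bool:
    """For a stopped shuttle: honk if an approaching vehicle will close
    the remaining gap within a short warning window."""
    stopped = own_speed < 0.1            # m/s: effectively stationary
    if not stopped or closing_speed <= 0:
        return False                     # still moving, or obstacle not approaching
    time_to_impact = gap_m / closing_speed
    return time_to_impact < 3.0          # seconds: stopping alone won't prevent contact

# The Las Vegas incident, roughly: shuttle stopped, truck reversing
# toward it at ~1 m/s with about a 2 m gap left.
print(should_honk(own_speed=0.0, closing_speed=1.0, gap_m=2.0))  # True
```

The point of the threshold is exactly ffbj’s: the horn should sound the moment stopping is predicted to be insufficient, not after contact.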

      4. Doggydogworld says:

        Jay wrote: “Over time, one gets used to the autonomy doing its job, which builds an expectation of operation…..”

        Over very little time, typically a couple of weeks. Google was shocked at how quickly their people started trusting their early cars, e.g. reaching into the back seat for stuff while the car was driving itself. Same with Ford, Volvo and a few others.

        Speaking of big “first days” for Autonomous EVs, Jay, why no article about Waymo’s huge milestone? Their Pacifica PHEVs are driving all over Chandler with no one in the driver’s seat.

        We get articles about minor Tesla tweaks to Autopilot 2.whatever but complete radio silence when Waymo makes history.

    2. Brave Lil' Toaster says:

      I don’t think anyone got the point.

      The driver present was the one driving the semi. In my jurisdiction, this means that as a professional driver, it’s *automatically* his fault in most cases, even if the other driver did something stupid.

  2. Tim Miser says:

    No horn? Isn’t a horn required in all vehicles?

    1. Kdawg says:

      Sounds like it needs an autonomous horn.

      1. DJ says:

        One that plays La Cucaracha! I don’t see what the big deal is here. The truck was at fault and was the one cited. Unrelated, but if Vegas built the monorail to the airport, there really wouldn’t be a need for as many cabs and vehicles like this!

        1. Kdawg says:

          I think a monorail to the airport would mean more shuttles would be needed. Visitors would not have rental cars, so they would have to take these shuttles to get around town. Currently, this autonomous shuttle only picks up passengers along a 0.6 mile, three-stop route, up and down the Fremont East Innovation District.

    2. TomThumb says:

      No. Only headlights, indicators, and tail lights are required for a road vehicle.

  3. Will says:

    Doesn’t have a driver. It’s like a rubber-tire tram, but driverless. No autonomy for me. Driver assist, yes, but no driverless cars. The car might get into Apple Maps and drive you off a cliff.

  4. Kdawg says:

    I’ll say it; if the truck had also been an autonomous vehicle, this accident wouldn’t have happened.

    1. Nick says:

      Would they have been stuck, waiting for the obstruction to clear?

      1. Kdawg says:

        What obstruction, the truck?

        The truck would not have backed up in the first place, as it would have seen the shuttle. The shuttle could have just gone on its merry way.

      2. Nebula1701 says:

        This is where V2V would be helpful… oh wait…

        1. fotomoto says:

          And that is what it will ultimately take to make autonomous vehicles truly workable (think of the background traffic in any Star Wars movie or other sci-fi film).

    2. pjwood1 says:

      Give that man a job, at MobilEye 😉

      That’s the thesis. Autonomous only works if we ban human drivers, and make people pay for V2V technology in every car:
      https://www.eetimes.com/author.asp?doc_id=1332471

      1. Kdawg says:

        Even “pilot assist” or some simple safety features would have prevented that truck from backing into something.

        Sorry, but humans just suck at driving.

    3. ffbj says:

      Probably, but backing up a semi is not like backing up a car – it’s much more complex. Can autonomous trucks even do that yet?
      I have not seen an example of it; it’s probably the most complex of the tasks that would be required of an autonomous truck.

      1. Kdawg says:

        Backing up a semi truck is just math/geometry. I don’t think it would be much of a task for a computer. Automated parallel parking comes to mind, though there’s a *twist* when it comes to a trailer. 🙂
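For what it’s worth, the geometry Kdawg alludes to is well studied: the standard kinematic tractor-trailer model shows why reversing is the hard direction. A toy simulation of a simplified version of that model (lengths, speeds, and function names here are illustrative assumptions):

```python
import math

def hitch_angle_step(phi, v, delta, L_tractor=5.0, L_trailer=10.0, dt=0.1):
    """One Euler step of a simplified kinematic tractor-trailer model.
    phi: hitch angle (rad), v: speed (m/s, negative = reversing),
    delta: steering angle (rad). Lengths in meters."""
    dphi = -(v / L_trailer) * math.sin(phi) - (v / L_tractor) * math.tan(delta)
    return phi + dphi * dt

# Driving forward, a small hitch angle dies out on its own...
phi = 0.2
for _ in range(100):
    phi = hitch_angle_step(phi, v=2.0, delta=0.0)
print(phi < 0.2)   # True: forward motion is self-stabilizing

# ...but in reverse the same perturbation grows (the jackknife tendency
# a controller, or a driver, must fight with continuous steering corrections).
phi = 0.05
for _ in range(100):
    phi = hitch_angle_step(phi, v=-2.0, delta=0.0)
print(phi > 0.05)  # True: reversing is unstable without active control
```

So Kdawg is right that it’s “just math/geometry” – but the math says reverse is an unstable control problem, which is why automated trailer backing gets treated as a feature in its own right.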

        1. ffbj says:

          I suppose we will need some semis first to see, though like Jay said of Autopilot, it will probably be laboriously slow.

          1. Kdawg says:

            Maybe this month w/the Tesla semi. Though I don’t think they mentioned autonomy.

        2. Loboc says:

          Ford has TV commercials where their trucks back up trailers using some kind of joystick controller.

  5. ffbj says:

    I saw a semi back into a lady in a little Honda. She was beeping like crazy (cheap little horn), and the driver could not hear her. When they called me to report it, I told them he was totally at fault.

    Then the opposite extreme is the guy who put a train horn on his truck.

  6. JBA says:

    Considering the abilities of many of the drivers on our streets today who are more interested in their cell phones than driving skills, I would put the probability of a driver reacting to prevent the collision more at 1% than 99%.

  7. SCOTT says:

    I can’t wait until self-driving everything is mandatory. People are incredibly stupid, and driving is terrifying because of this simple fact. Driving is statistically the most dangerous thing most people do.

  8. Mikael says:

    In the first video, did anyone notice it didn’t let the pedestrian cross, although she had priority (and a stroller…)? 0:25 to 0:29.
    That’s quite bad.

    1. (⌐■_■) Trollnonymous says:

      Good catch!
      That’s worthy of a Ticket.

      Didn’t the one the truck ran into have any evasive maneuvers or at least lay on the horn????
      So it just sat there like a stunned fainting goat in the known dangerous position and let itself get hit?

    2. SteveSeattle says:

      Yes. She had to pause because it did not yield.
      Improvement needed.

    3. fotomoto says:

      “Good catch!
      That’s worthy of a Ticket.”

      Folks didn’t catch that she’s the violator. See the blinking red “don’t walk” sign? She wasn’t in danger so the bus continued on its journey.

      If these vehicles were to pause, yield, or give way in all of these safe scenarios, then humans would quickly learn to ignore them (call their bluff) and walk right in front of them all the time, ’cuz ya know they’ll stop. I’ve read reports of drivers doing the same, like taking the right of way from an AV at a four-way stop.

  9. Mark C says:

    I’d say the autonomous vehicle wasn’t programmed to allow enough distance between where it needs to stop and the clearance a truck would have to have to back into an alley or loading dock. The AV was not programmed to look at the dimension of the intersecting roadway to calculate where it needed to stop.

    So, we now have a truck driver with a moving violation for hitting a vehicle that had no driver. He goes down in history today.

    1. Pushmi-Pullyu says:

      I’d say the autonomous driving system was not designed to take active measures to maneuver to avoid an accident.

      Speaking as a programmer, that’s perfectly understandable. A program to avoid maneuvering the vehicle into a collision course isn’t all that difficult; many computer games these days perform that routinely.

      But programming the car to drive out of its way (in this case, backing up) to avoid an accident is a much more difficult programming challenge, by orders of magnitude. You are now calling on the vehicle to react as though it was making a judgement about what actions will most likely lead to the safest outcome, which means there will be a risk that the vehicle will actually be the cause of an accident.

      Any active measures to avoid collision must be things which computer programmers have planned for in advance, and programmed into the autonomous driving system. I expect autonomous driving systems to improve to the point that they won’t commonly have accidents under the conditions described in this article. If a moving obstacle approaches on an intersect course and there is sufficient room to back up, then the vehicle should back up to maintain a safe distance. That behavior shouldn’t be that hard to program, if the vehicle is already capable of self-driving.

      But there will always be rare cases where the autonomous vehicle won’t respond like a human driver (or even an animal on the road trying to avoid being run over by a car), because the car does not have any “situational awareness”… not even the situational awareness of an insect. The autonomous vehicle will do what it’s programmed to do and/or what the laws of physics (and inertia) dictate for a moving object… and nothing more.
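The evasive behavior this comment calls for – back up if an obstacle approaches and the rear is clear, otherwise just warn – can be summarized as a small decision rule. A hypothetical sketch only; the function and thresholds are invented here, not taken from any shipping system:

```python
# Hypothetical action selector for a stopped autonomous shuttle facing
# an approaching obstacle. Thresholds are illustrative, not from NAVYA.

def plan_action(gap_ahead_m: float, closing_speed: float,
                clear_behind_m: float, safe_gap_m: float = 5.0) -> str:
    """Return 'hold', 'reverse', or 'honk'."""
    if closing_speed <= 0 or gap_ahead_m >= safe_gap_m:
        return "hold"      # obstacle receding, or gap already safe
    if clear_behind_m > safe_gap_m - gap_ahead_m:
        return "reverse"   # enough room behind to restore the safe gap
    return "honk"          # boxed in: a warning is the only option left

# The Las Vegas scenario: truck closing, ~2 m gap, open street behind.
print(plan_action(gap_ahead_m=2.0, closing_speed=1.0, clear_behind_m=10.0))
# reverse
```

The hard part isn’t this rule; it’s everything it takes for granted – knowing the rear really is clear, and predicting the truck’s path, which is exactly the “situational awareness” gap described above.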

  10. Pushmi-Pullyu says:

    I guess I’m far more optimistic than Jay. We can’t expect autonomous vehicles to react better than a human in 100% of situations. That’s an impossible goal, and requiring that level of performance would be about the worst possible case of “The perfect driving out the good.” For example, Tesla Autopilot + AutoSteer is already saving lives, today, despite the fact that it’s far from perfected.

    To delay having autonomous or even semi-autonomous vehicles saving lives just because there are some rare “edge cases” where they don’t react as safely as human drivers, would be as foolish as telling everyone to remove the air bags from their cars because, rarely, one of them explodes.

    “The thing to keep in mind is that self-driving cars don’t have to be perfect to change the world. They just have to be better than human beings.” — Deepak Ahuja, CFO of Tesla Inc.

    1. EndResult says:

      This accident occurred on day one – how is that an edge case?

      1. fotomoto says:

        Because it’s not an “accident”; rather two vehicles barely touched.

        Anyone care to do a comparison ratio to the number of accidents humans have on their first day of driving? 🙂

        1. fotomoto says:

          Check out what a human in a Lexus can do. Now THIS is some driving skillz!!!!

  11. kees says:

    https://www.zelfrijdendvervoer.nl/autopilot/2016/10/03/zelfrijdende-parkshuttle-op-rivium-is-verborgen-pareltje/
    This has already been working for years in the Netherlands: the Rivium park shuttle in Capelle a/d IJssel.
    290,000 passengers per year in 2016.

  12. kees says:

    https://www.2getthere.eu/successful-climate-test/
    This is the same company that built the ParkShuttle in Capelle a/d IJssel.

  13. EndResult says:

    When a semi is backing up, there is a huge blind spot due to the angle of the cab to the trailer. A driver in a car heading towards the maneuvering truck would have recognized that the truck was preparing for (or in the process of) a back-up maneuver, and would have stopped significantly further back (assuming they were courteous, which in itself is a rarity today). I disagree that this was the truck driver’s fault. This incident illustrates how primitive autonomous technology/programming is today.

    1. nom de plume says:

      You might want to consider this eyewitness account before letting the truck driver completely off the hook:
      https://www.digitaltrends.com/cars/self-driving-bus-crash-vegas-account/

      As a person who’s driven a lot of miles in semis, I’m with Nix (below) on this one.

  14. Nix says:

    I’ve seen this before, where a truck driver operates on the theory that he’s the biggest thing on the road, so people should stay out of his way. It looks like he was expecting other drivers on the road to yield to him even when lawfully they don’t have to.

    I’ve seen this play out over and over. Trucks blocking multiple lanes of traffic because the company they are dropping off at doesn’t have proper ingress/egress for a vehicle that size. Cars having to stop or change lanes even though they have the right of way. Cars having to reverse from a stop sign because a truck can’t clear a turn without going into their lane of traffic. Etc.

    I’m not trying to be harsh on the truck driver. Often city streets just were never designed to accommodate this type of truck. The answer for years really has been for car drivers to steer clear of trucks trying to make these maneuvers out of understanding that the world doesn’t always operate on 100% strict adherence to the letter of every law 100% of the time. And sometimes you have to yield your right of way to others, even when they don’t have the right of way in order to accommodate others for altruistic reasons. That is going to be hard to program.

  15. Kosh says:

    At 00:36, was that Paul Teutul Sr. from American Chopper riding through?

  16. Steven says:

    In theory, there is no difference between theory and practice.
    But, in practice, there is.
