Tesla Releases Self-Driving Demonstration With Recognition Feed – Video

NOV 20 2016 BY ERIC LOVEDAY

Tesla Model X Self-Driving Video

Tesla previously revealed it was adding self-driving hardware to all new vehicles. At that time, the automaker released a short video showing the capabilities of the system.

Now, Tesla has released a longer demonstration of the tech (see above).

Almost immediately after the long version came up, an adept YouTuber slowed down the video to “real-speed” so that we can more accurately see what’s going on (real-speed video below).

Check out both videos (the real-speed version shows us that there are definitely still some quirks in the system…let us know in Comments how many you spot) to see self-driving in action. Note that the driver is there for legal reasons only.

Oh…there’s this too if you’d rather have a different background audio track.

Source: Tesla

Categories: Tesla, Videos



35 Comments on "Tesla Releases Self-Driving Demonstration With Recognition Feed – Video"


So does this mean “look ma, no hands!” is now officially sanctioned by Tesla?

What it means is it’s learning and it will get better with time.
No point in drawing any conclusions just yet.

The driver in the video didn’t have his hands on the wheel throughout the video. What conclusion do you draw from that?

This certainly makes it hopeful that I can take a nap on my commute by the time I get my Tesla Model 3. The sooner the better, since there will be more beta testers!

In the most recent official Tesla Autopilot video before this one, the driver was seen with his hands open, underneath the wheel, not holding on but in a position to grab it instantly if necessary.

I do find it notable that the (non-)driver’s hands in this video are flat in his lap. That certainly suggests Tesla’s level of confidence in the reliability of autonomous driving has increased, at least under the carefully selected conditions of this demonstration.

So yeah, it does look like “Look Ma, no hands!” will soon be officially sanctioned by Tesla. But then, you can say the same about an actual airplane’s autopilot; it’s hands-off once activated. However, I think the official line from Tesla will still be that you need to keep your eyes on the road, in the way that pilots using an autopilot are admonished to “Keep watching the skies!”*

*Fans of ’50’s science fiction movies will no doubt recognize the reference. 🙂

Hi Pushi,

I agree that the way the non-driver holds his hands suggests a high degree of confidence in the system.

Furthermore, I was really pleased to see that the software seems to be tuned to drive more defensively around pedestrians on the sidewalk than most human drivers would.

I hope that this feature will not vanish once the software engineers get more confident with the detection rate of the sensor system. Safety first has to be the rule.

Hopefully the system will classify children as a subgroup of pedestrians/cyclists, since they behave differently from adults.

On the contrary, I hope that autonomous vehicles will use simple collision avoidance routines to avoid accidents, rather than trying to predict the erratic behavior of children, animals, and adults using their smartphones while walking*. Attempting to predict that sort of behavior will inevitably fail a certain percentage of the time. Observing the actual motion of moving obstacles in real time, and reacting to those movements in real time where necessary, is what autonomous cars should be doing.

Recognizing a class of moving obstacles labeled "children", as distinct from other moving obstacles, would get into the sort of programming nightmare that would make developing reliable software for autonomous driving impossible, or nearly so. In fact, reliably recognizing a "human" would be nearly impossible. Can the software recognize a human inside a vehicle? Or an amputee in a wheelchair? How about someone wearing a costume? Would an autonomous car recognize that a mascot wearing an enormous fake head as part of a costume was actually human? If you want it to identify "children", can it distinguish between a normal child and an adult midget?

Best to avoid all this. KISS: Keep It Simple, Stupid.

*Or perhaps I should say: animals, including children…
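The purely reactive approach the commenter describes can be sketched in a few lines: react to the measured relative motion of an obstacle, with no attempt to classify or predict what it is. This is an illustrative toy, not Tesla's (or anyone's) actual control logic; the function names and the thresholds are hypothetical.

```python
# Toy reactive collision check (illustrative only, not any real system's logic):
# brake when the obstacle's closest approach happens soon AND comes too close.
# Obstacle type is never classified; only its relative motion matters.

def time_to_closest_approach(rel_pos, rel_vel):
    """Time until closest approach, given obstacle position and velocity
    relative to the car (meters, meters/second). None if not approaching."""
    px, py = rel_pos
    vx, vy = rel_vel
    speed_sq = vx * vx + vy * vy
    if speed_sq == 0:
        return None
    # Distance is minimized at t = -(p . v) / |v|^2
    t = -(px * vx + py * vy) / speed_sq
    return t if t > 0 else None

def should_brake(rel_pos, rel_vel, time_threshold=2.0, miss_distance=1.5):
    """Brake if closest approach is within `time_threshold` seconds and
    the obstacle would pass within `miss_distance` meters."""
    t = time_to_closest_approach(rel_pos, rel_vel)
    if t is None or t > time_threshold:
        return False
    px, py = rel_pos
    vx, vy = rel_vel
    # Position of the obstacle (relative to the car) at closest approach
    dx, dy = px + vx * t, py + vy * t
    return (dx * dx + dy * dy) ** 0.5 < miss_distance

# Obstacle 10 m ahead, closing head-on at 8 m/s: brake.
print(should_brake((10.0, 0.0), (-8.0, 0.0)))  # True
# Same obstacle moving away: no brake.
print(should_brake((10.0, 0.0), (8.0, 0.0)))   # False
```

The point of the sketch is that nothing in it cares whether the obstacle is a child, a dog, or a shopping cart; the same geometry handles all of them.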

Meanwhile at a major automaker:

“What the hell! Have u guys seen this Tesla self driving demo video? Fronz, you told me last month there is no fu**en way Tesla can pull this off within next 3 years….look here…the fu**ers doing it now and shoving this video in our face just for the fun of it….what the fu**! …Explain to me again why we can’t today also be doing this…”

You missed about 5+ meetings of top-mid management 🙂

Not much drama there… upper management quickly talked themselves into agreeing the video is not a "real world" demo…that it's just standard Tesla marketing hype & Elon overreaching what can be done.

It's the smart front-line dev & engineer guys that are starting to get spooked. They're updating their LinkedIn profiles in hopes of getting picked up by Tesla…that's what the video is really about…a recruitment tool for Tesla.

is not a "real world" demo….
and in case it is, by the time Tesla has autonomous driving on the market, those managers will have already pocketed their bonuses; there is no way they can lose money.

The first artificial sentient being, will likely be a car…

Thanks Elon! 😉

Pretty impressive. But this is daytime, good weather, and the road markings are all well painted. Also, the environment is quite uniform.

I’ll be convinced when this thing works in the snow, driving into New York City at night from the north, as I did just this past winter.

Heck even driving in the rain at night would be really impressive. How do the cameras fare when the glass is wet? How do they handle seeing reflected headlights on the road instead of lane markings?

Maybe this is ready for prime time, but this video doesn’t prove it at all.

Even Google’s team says their self-driving cars can’t handle a road covered with snow.

I don’t agree with those who say that self-driving cars will have to wait until they can handle all conditions of traffic and weather. I think we’ll start seeing other auto makers do what Tesla has done; start rolling out semi-autonomous driving controls that will work only under certain conditions, and will turn over control of the car to a human driver in other conditions.

I found it strange to read a report that Google’s self-driving cars can’t cross bridges! But according to the article linked below, that means only certain types of bridges. Surely they can handle a typical freeway overpass?

http://www.businessinsider.com/autonomous-cars-bridges-2016-8

That's an Uber-only problem.

No it's not. If my autonomous car can't drive on its own all the time, it's only 30% helpful.

Imagine I drank some beer in the afternoon/evening with colleagues, since my car can drive me home. Half an hour before we leave, it starts to rain. –> My car can't drive, I can't drive.

What about sending my car home (so my wife can go shopping) after it drops me at work? Somewhere on the way, the car encounters a situation it can't handle and safely stops at the side of the road. How is a human supposed to take over control if there is no one inside? The car is stranded in the middle of nowhere and I have to walk 8 km to get it back? Not very comfortable.

A car that can only handle some situations is only good for long vacation trips, where all the people are inside and someone can take over if needed.

For me, a car is either autonomous, meaning it can handle 99.997%* of all conditions, or it cannot be considered autonomous.

*Only snowstorms where the wheels get stuck in 1 m of snowfall, and similarly hard scenarios, are excluded.

You have some good points there, but a self-driving car that can't use certain areas of the roadway (like some bridges) would still be useful so long as it can find a path from A to B that doesn't pass through those areas. With experience, you'll know whether the car can get home by itself (to pick up your wife) from where you are. You may need to explore a route in advance, trying a drive with you in the driver's seat to see if the car can handle it by itself.

Extreme weather will be a harder problem to solve. I don't see that rain should be an especially hard problem, so long as the car is using active scanning. Now, if it's relying on optical object recognition, then rain at night will be a nightmare. (In fact, that situation was always stressful and sometimes dangerous for me when driving, as I have very poor depth perception.) But despite Tesla's current refusal to move to a roof-mounted, rotating active scanner, I suspect they'll move to that before long.

But I think snow will continue to remain a problem. The car needs to be able to "see" the…

I think a 70% solution, or so, is fine. I’m not saying it’s a bad thing.

However, in terms of business models it changes nothing. You still need drivers (since weather conditions change). You still can’t send the car off by itself reliably.

In effect it’s like a greatly enhanced cruise control; a convenience. Possibly also a safety feature. But not a self-driving car in the true sense.

There are large sections of the USA, and many countries, where “Can’t drive when there is a blanket of snow on the roads” won’t be any real obstacle to selling and using self-driving cars. For example, most of California, much or most of Texas, and Florida will all be just fine. Snow happens so rarely in those places, and disappears so quickly when it does, that the brief periods when self-driving cars can’t move won’t be anything more than a minor inconvenience.

Plug-in EVs aren’t for everyone in this “early adopter” stage of the EV revolution. The same will be true in the early years of self-driving cars.

Good to see that Tesla is rapidly improving its self-driving tech.

But I note a large number of objects to the side of the road which were incorrectly, if briefly, identified as “in-path” when they were definitely off the path the car was following. I presume that shows the limitation of using video camera images and optical object recognition software, rather than more reliable active scanning using radar or lidar.

And of course, what Tesla is showing in this video is the best-case scenario. And even here, correct me if I’m wrong, the car never encounters a traffic light. Clearly it can recognize and obey stop lights, but perhaps traffic lights are something Tesla cars still can’t deal with. If that’s true, then Tesla still has a way to go before their semi-autonomous car tech is as good as what Google has developed.

There are traffic lights at 50 seconds and 1 minute in the longer video.

Oops! Right you are, darth. Thank you for the correction.

I wouldn't call active scanning more reliable in general… All sensor systems have their pros and cons. I bet you are well informed enough to know the setbacks of lidar, so I can spare you the details 😉

The carefully chosen situations in the video, of course, contain quite a lot of "nightmares" for the lidar people.

Don't get me wrong. My opinion is clear: we should use as many different types of sensors as is viable from a tech and cost standpoint. I guess (hope) the Tesla sensor suite will improve maybe every two or three years.

There is so much more to come…

If I wasn’t so afraid of robocalypse I would be really happy 😉

Heisiberghiht said: "I wouldn't call active scanning more reliable in general… All sensor systems have their pros and cons. I bet you are well informed enough to know the setbacks of lidar, so I can spare you the details 😉"

Well, don't spare me the details. Let's just say that at least one of us isn't as well-informed as you think we are.

I can't imagine any condition under which software trying to use proven-to-be-unreliable optical object recognition from video camera images would be as reliable, let alone more reliable, than analyzing reflections from a roof-mounted rotating active scanner. Even seeing the lane markings painted on the road should be more reliable if lidar is scanning them. Scanning lidar wouldn't have confused the side of a semi trailer painted white with a "brightly lit sky", as happened in the (so far as I know) one-and-only confirmed case of death in a Tesla car controlled by Autopilot/AutoSteer.

I do know that lidar can briefly show false returns, similar to radar "hash", but then video cameras can be blinded by glare, so I don't see any advantage to video cameras over lidar there. You're suggesting there may be some advantage to…

It’s important to distinguish between object recognition and obstacle recognition.

That's true. The self-driving car doesn't need to be able to recognize an obstacle; it just needs to avoid colliding with it. But just watch the slowed-down video above, and note the very great number of stationary objects well to the side of the road which are briefly highlighted with a green rectangle, which means the car "thinks" each is an in-path obstacle.

My point is that images from video cameras cannot be reliably used to detect obstacles which an autonomous car will need to avoid. Those paying attention to this issue should already have known this; that's why, in the one-and-only confirmed fatality in a Tesla car controlled by Autopilot/AutoSteer, the car confused the side of a semi trailer painted white with, in Tesla's words, "a brightly lit sky". Would active scanning by either lidar or radar have confused the two? Not likely!

Now, that's not to say that optical object recognition doesn't have any place here. In recognizing road signs, I would guess that optical object recognition is what the car relies on. And in the case of Tesla cars, it would appear that the car is using video images to "see" the lane markings. I don't know if Google's self-driving cars also…
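The brief false "in-path" flags described above are a classic single-frame false-positive problem. One generic mitigation, a standard technique and not something Tesla has documented for Autopilot, is to require a detection to persist across several consecutive frames before treating it as a real obstacle. The object IDs and frame format below are hypothetical.

```python
# Generic temporal persistence filter for per-frame detections (illustrative
# sketch, not Tesla's actual pipeline): an object is only confirmed as an
# in-path obstacle after being flagged for `min_frames` consecutive frames.

from collections import defaultdict

class PersistenceFilter:
    def __init__(self, min_frames=3):
        self.min_frames = min_frames
        # object id -> count of consecutive frames flagged in-path
        self.streak = defaultdict(int)

    def update(self, frame_detections):
        """frame_detections: dict of object id -> bool (flagged in-path
        this frame). Ids absent from the frame keep their previous streak
        (a simplification). Returns the set of confirmed in-path ids."""
        for obj_id, in_path in frame_detections.items():
            self.streak[obj_id] = self.streak[obj_id] + 1 if in_path else 0
        return {i for i, n in self.streak.items() if n >= self.min_frames}

f = PersistenceFilter(min_frames=3)
# A roadside sign flagged for one frame is never confirmed; a pedestrian
# flagged every frame is confirmed on the third frame.
print(f.update({"sign": True, "pedestrian": True}))   # set()
print(f.update({"sign": False, "pedestrian": True}))  # set()
print(f.update({"sign": False, "pedestrian": True}))  # {'pedestrian'}
```

The trade-off is latency: with a 30 fps camera and `min_frames=3`, confirmation costs roughly 100 ms, which is why real systems tune this against braking distance rather than picking a fixed count.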

I love gadgets but I ENJOY driving a car. If I want to be driven somewhere, I’ll hire a cab or Uber. Frankly, this stuff scares the s**t out of me. Even if I wanted the autopilot feature to stretch or shake out muscles, why not just pull off the road, get out, and really stretch? Just because something is possible, doesn’t make it the right thing to do.

You enjoy being stuck in traffic, crawling along at 5 MPH in stop-and-go? Most people hate that. Driving on an "interesting road" is great, but most driving for most people is boring tedium. If self-driving cars had been available, I would've gotten my PhD just by using my commute time!

Besides, you can still pull over to the side of the road to stretch. But with self-driving cars, you can even take a nap in the car. I guess I still probably wouldn't have gotten my PhD, since I'd be napping in the car instead of studying. 😉

Waiting said:

“Frankly, this stuff scares the s**t out of me… Just because something is possible, doesn’t make it the right thing to do.”

It’s human nature to find giving up control to be scary. Some reporters allowed to “drive” a Google self-driving car experienced the same thing; anxiety over giving up control. But at least some of them said it took only a few minutes to get used to the experience, and after that it was surprisingly relaxing.

So be aware your reaction has more to do with human psychology than with the technology involved.

I can assure you, most drivers on this planet (it's Earth, to make sure we are talking about the same issue) will love this feature.

Self-driving isn't fun when you have to deal with 1.2 billion cars on the road worldwide.

Haters will embrace the technology very soon, just like they did with smartphones in the early years.

Good progress. Notable dislikes:

1) Stopped way before a stop sign.
2) Stopped because it falsely identified the jogging ladies as in-path objects.

If I was behind that slow Tesla, I’d be very pissed. Will it know to speed up a bit in the 35mph zone if the road ahead is totally clear and if there’s another car behind it getting real close?

Totally agree. The driver behind the Tesla would probably have rear-ended the car during these sudden and random stops. This is a no-go.

Impressive demo, but the fine-tuning still needs to be done. Unfortunately, the general agreement is that the first 99% of cases are easy to work out, but from then on it's a case of diminishing returns. Don't expect anything soon.

Stimpacker said:

“Notable dislikes:
[…]
2) Stopped because it falsely identified the jogging ladies as in-path objects.”

Wow. The car slowing when it detected pedestrians so close to the road that there was a real danger of them wandering into it, was absolutely what every responsible person would want a self-driving car to do.

People with your rather cavalier attitude toward safe driving shouldn’t be allowed to drive. So that is a pretty strong argument in favor of fully autonomous vehicles!

I saw quite a few instances where the car did not behave like a human driver:
1) The two incidents Stimpacker identified.
2) The vehicle did not seem to recognize traffic cones; what if one was lying in the path?
3) Dog walker with small white dog: the dog was not recognized even though it was on the edge of the road.
4) Right turn where the car stopped in the middle of the turn, apparently recognizing a parked car as an in-road obstacle.

I agree with Pushmi-pullyu that you want the car to be cautious; however, the Tesla is at risk of being hit from behind by a human driver. No fun, even though the human would be at fault. Also, as a cyclist/jogger, it seemed the car passed the joggers/cyclist/dogwalker very close, with a chance the side mirror could clip them. A human driver (almost always) would edge over into the oncoming lane (if safe) to leave a safety margin.

I’m a big proponent of autonomous driving, however after watching that video I would not use the feature in the demo scenario, at the very least I’d always have my hands on the wheel.

Webdbbt said: "…as a cyclist/jogger it seemed the car passed joggers/cyclist/dogwalker very close with chance the side mirror could clip them. A human driver (almost always) would edge over into the oncoming lane (if safe) to leave a safety margin."

Yes, this is one area where I think Tesla AutoSteer needs significant improvement. It is quite noticeable, in this video as well as others showing Tesla cars under control of AutoSteer, that the car will never wander out of its lane, even when the other lanes are completely clear of traffic. In previous videos you can see what happens when another car suddenly veers into the Tesla car's lane: the Tesla car steers closer to the edge of its own lane while slowing down. Never does the Tesla car change lanes, or veer out of its own lane, not even to avoid an accident.

Yes, in the video above, it would have been better for the Tesla car to have veered to the left, out of its lane, to give a wide margin of safety to the pedestrians jogging at the roadside, instead of slowing down and almost stopping at one point. But then, in cases where there was an…

Well done Tesla! The AutoPilot system is getting better and better.

I noticed a few instances where the system did not behave like the "typical" driver, but that's a good thing, because it took safer, less risky options: slowing for pedestrians, caution when making turns, letting pedestrians cross the roadway instead of trying to get by them.

One thing to keep in mind when watching the video is that everything is at a faster pace, so the stops and starts are not as quick/abrupt as they appear. And looking at the narrow, winding roadways the Tesla was traveling, lots of slowdowns and stops could be the norm for a human driver as well.