Check Out Crystal Clear View Of Tesla Autopilot Line Detection: Video


As Tesla moves toward full self-driving, we see its Autopilot tech can discern stop lines.

Tesla constantly updates its vehicles via over-the-air software updates. The company has said on numerous occasions that it will incrementally improve Autopilot until it's eventually ready for Full Self-Driving.

Just recently, during a podcast with ARK Invest, CEO Elon Musk said that Full Self-Driving-optioned Tesla vehicles will be feature-complete by the end of 2019. However, he also said he thinks the vehicles will be able to self-drive without human intervention by the end of 2020.

We’ve shared videos before via Tesla hacker and YouTuber greentheonly. Essentially, he’s able to get inside Tesla’s computer system and show us what Autopilot actually “sees.” Now, in his latest video, he reveals that Tesla added stop line detection. He notes that the front camera seems to “see” the stop line, as the process does not appear to be related to any GPS mapping information. In addition, he points out that it doesn’t seem to calculate distance at this point.

The hacker also shares that Tesla Autopilot's cameras and software are now capable of identifying the vehicles in the area and classifying each one by type.

The fact that a recent update makes Autopilot this much more aware is a testament to the automaker's steady progress toward full self-driving technology.

Check out the additional video below and let us know your thoughts in the comment section.

Video Description via greentheonly on YouTube (above):

Tesla autopilot detecting stoplines during day

Only main camera since this is where all the action is anyway

The video below shows the same type of footage in a night-time setting:

Video Description via greentheonly on YouTube:

Tesla autopilot detecting stoplines during night

Only main camera since this is where all the action is anyway

Categories: Tesla



12 Comments on "Check Out Crystal Clear View Of Tesla Autopilot Line Detection: Video"


I can’t even detect stop lines in my area this time of year. The garbage they paint the street with lasts about 3 weeks, and they do it once a year. Half the lines are not even visible, covered with snow, ice, and grime. I wonder how the cars will handle this? I suppose eventually they will do a better job than humans, but in the meantime we will need cars like Teslas that can still easily be driven both ways. 🙂

There are many residential intersections with no stop or yield signs or pavement markings at all. In these cases, the car would have to come to a near stop regardless of any sign or pavement marking to make sure it is safe to cross or turn. I am guessing one of the AI learning areas is navigating difficult intersections. With proper software and training, I don’t see this as a problem, but I think when they implement it, people will be frustrated by how slowly these cars operate.

The quality of road markings (and their maintenance) seems to be critical to autonomous operation, yet it doesn’t receive much attention or discussion. There are plenty of terrible roads out there, which are never seen in autonomous driving examples.

Can’t help but think roads will eventually have to be divided into at least two categories: approved for autonomous driving based on quality and maintenance, and unapproved, which means the car will default to assist or manual mode. That also implies some entity has to make that decision.

I find that autopilot generally does a better job of detecting lane lines than I do.

The point where Autopilot has issues is when the lane lines simply aren’t there – it doesn’t seem to be programmed to just follow the road and pretend the lane lines are there when they’re entirely missing. It’s missing the common sense of “this road is ~16 feet wide and unmarked, so I should keep to the right half of the road to avoid oncoming traffic.”

Easy enough to add later on. Just a matter of making it a priority.

I wonder how much this system can rely on previously learned lane and stop line positions correlated with GPS data. I could see this helping quite a bit when the road is covered in snow, but of course it would also have to be able to work when no data is available beyond the camera input.

I love how it classifies all the SUVs as Minivans. 🙂

It even classifies trucks as Minivans (0:31), but that was because the truck was dark and the background was dark.

NTSB is sending a team of three to conduct a safety investigation, while NHTSA is sending a field team.

“Some Tesla drivers say they are able to avoid putting their hands on the wheel for extended periods when using Autopilot, while Tesla advises drivers that they must keep their hands on the steering wheel and pay attention while using Autopilot.”

It would be interesting to poll the Tesla drivers here and see if this statement is true. Quote is from this article:

I have found that to be true only when the car is going very slowly, as in stop-and-go traffic.

Those are heavily edited videos. Do all the stops at lights happen while following a vehicle in front? Are there any clean stops at a light in any of those videos? How about a complete stop at a red light before a right turn?

A human is driving. The videos show how AP interprets the scene.