How Does Tesla Autopilot Fare Driving Directly Into The Sun? Video


Tesla Autopilot cameras have pros and cons when it comes to direct sunlight.

Tesla Autopilot still has plenty of quirks. This is precisely why Tesla didn’t attempt to launch its upcoming Full Self-Driving technology prematurely. With each update, we see plenty of positives. However, there are also issues that the electric automaker must resolve.

Because the Autopilot system relies only on cameras and radar, there have been issues with varying light levels, objects that blend into their surroundings, and the like. The facts continue to show that while the feature may be the best on the market to date, it’s far from perfect.

Tesla owner and YouTuber Scott Kubo has a habit of providing Autopilot update videos. In fact, it’s just about all he focuses on. He’s uploaded a handful of other content, but Autopilot is clearly his thing.

In Kubo’s latest share, he addresses issues related to Autopilot dealing with direct sunlight. Fortunately, he is impressed with the car’s lane-keeping feature. However, he does note that automatic lane changes caused some problems.

The automatic lane-change feature is a nice bonus rather than something to be relied upon at this point. If the car attempts a lane change and fails, you simply take over and change lanes yourself. Because Tesla Autopilot is still a “hands-on” system, the expectation is that drivers remain aware and in control at all times. That means if a feature isn’t initiating properly under some circumstance, it’s your job to handle it.

We’re confident that Tesla is aware of such issues and will continue to make necessary updates and adjustments for improvement.

What are your issues with Autopilot? Let us know in the comment section below.

Video Description via Scott Kubo and YouTube:

Tesla Autopilot Driving Directly Into the Sun

How well does Tesla’s camera-based autopilot perform when driving directly into the sun? Lane-keeping was pretty good, but auto-lane changes were problematic.

This was tested on v9 2018.48 and v9 2019.5.6

Categories: Tesla, Videos



21 Comments on "How Does Tesla Autopilot Fare Driving Directly Into The Sun? Video"


And yet FSD is supposedly ready by the end of the year?

Feature complete and ready are two different things. I think the basic capability to do everything will be in place by the end of the year (stopping for stop signs, lights, and pedestrians, navigating streets, etc.), but you are still going to have to monitor it for another year according to Musk; my guess is more like two or three years. Maybe longer in areas with poor markings, snow, and such. On my Honda at this time of year, the radar and cameras tend to be hopelessly covered with road-salt grime within about ten minutes of driving when it is really wet and slushy out.

Feature-complete is a meaningless concept if it’s just a randomly selected set of “things it can do, in some ideal circumstances”. In any case, Elon’s indication that the cars will basically be Level 5 by the end of next year is precisely the sort of hype that’s been coming from that direction since he started talking about it! When was the first time a Model S was supposed to drive, with zero intervention, from coast to coast? I think it was 2016, but it might have been earlier. Then it would happen in 2017. And then in 2018. So far we haven’t heard about it for 2019, but it sure seems reasonable to expect it’ll happen by the time Autopilot is “feature complete”! I am all for developing autonomous cars, and I believe they will completely transform transportation. But I also know a bit about computers, machine learning, and AI. It’s one thing to have the car recognize some of the nearest objects, but quite another to have it generally, reliably, tackle any situation that may arise at least as well as an average human driver, even on roads with poor or no markings, in the rain and snow, and…

One thing to remember: the other ICE manufacturers are ten (give or take) years behind Tesla on most things, so why not Autopilot tech as well?
Also, Elon is personally testing the next-level versions of Autopilot, so he is a much more reliable source than a lot of other “experts” who have no clue what Tesla’s new AI chip is capable of; he does, and he is actively testing its capabilities. I would love to be a fly on the wall in those engineering discussions.

Elon is great at selling a feature that isn’t complete or ready…

Suckers are buying 🙂

I love using Tesla’s Level 2 features. I do treat them as Level 2 acknowledging that I am responsible at all times. One day we will see FSD, but for the moment they are just driver assist. They are well integrated into the driver experience and I would not give them up for anything.

“set the controls for the heart of the sun”

It’s not only direct sunlight that can blind the camera. Try the same thing right after rain: the reflection from the wet road combined with direct sunlight can blind the camera big time. This has been an issue for years, and I can’t see how Tesla can handle this extreme condition for FSD without additional sensors that can measure things better than cameras alone.

I agree. Maybe it’s possible, or will be one day, to make a computer drive better than humans using only the same senses we have. But it just doesn’t seem possible that it wouldn’t drive better still if it took advantage of a few extra senses in addition. Lidar seems like a good idea, especially since it gives accurate depth information, something that’s very useful for figuring out how things are moving (and thus predicting their paths) and also very hard to do accurately using just cameras. But even if it’s not lidar, *something* surely should supplement cameras. Btw, I can’t recall having ever heard of anyone using microphones at all. Which is odd, since we definitely know hearing is useful for human drivers, albeit much less so than seeing. I think cars will get steadily more capable of assisting with driving, but not get to Level 4 for many years still (where a driver must be in the car, but the car can be relied upon to know its own limitations and be left to drive; it will stop and ask for help if a situation arises that it can’t deal with). Level 5, where the car can drive the…

> very hard to do accurately using just cameras.

But you could have cameras that are extremely high resolution, with very high frame rates, far greater dynamic range (key!), and/or multiple cameras (visible light, infrared, ultraviolet, high contrast, etc.) that would be superior to any human eye, plus computer software that can (how far in the future?) interpret the data far better than any human brain, to get a system that far outperforms a human being, without having to use a laser-based system (lidar).

Lidar is great for detecting 3D objects in the environment, but it can’t see road markings, so better cameras are needed.

I think 20 years from now, we will look back at the current AP and laugh at how primitive its cameras and computer were.

The cameras going into cell phones are driving innovation at a remarkable rate.

“Lidar is great for 3D objects in the environment, but can’t see road markings, so better cameras are needed.”

Real-time mapping of the environment immediately around the car, coupled with a SLAM 3D mapping system, would enable the car to “know” its exact position in relation to stored mapping data, which would include lane positions, speed limit zones, etc. In fact, with a proper lidar scanning system, a proper SLAM, and proper 3D maps, the car would only need cameras to read traffic lights and to watch for any variances from the stored maps, such as road construction and detours. Furthermore, we can easily envision that road crews might set out markers specifically designed as lidar reflectors, to alert self-driving cars that they should ignore stored navigational data and follow the path of reflectors set out in a construction zone.

We don’t at all need better cameras. It would be a foolish waste of money to equip cars with both visible-light and infrared cameras, which would require twice as much development of visual image processing, since things look far different in infrared light vs. visible light. Current cameras are perfectly adequate to read traffic signs…
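[Ed. note: to make the commenter’s “know its exact position in relation to stored mapping data” idea concrete, here is a deliberately tiny, hypothetical sketch. It assumes known landmark correspondences and no rotation or noise; real systems (ICP, pose-graph SLAM) handle all of that, and nothing here reflects any actual Tesla or Waymo implementation.]

```python
# Toy illustration of map-matching localization: the stored map gives each
# landmark's global position, the lidar reports where the car sees that same
# landmark relative to itself, and the difference reveals the car's position.
# Assumes matched pairs, no rotation, no noise -- a sketch, not a real SLAM.

def localize(map_landmarks, lidar_hits):
    """Estimate the car's global (x, y) from matched landmark pairs.

    map_landmarks: list of (x, y) global positions from the stored map.
    lidar_hits:    list of (x, y) positions of the SAME landmarks as seen
                   by lidar, relative to the car (heading assumed aligned
                   with the map's x-axis).
    """
    n = len(map_landmarks)
    # Each pair implies: car position = map position - relative measurement.
    # Averaging over all pairs smooths out per-landmark measurement noise.
    dx = sum(m[0] - h[0] for m, h in zip(map_landmarks, lidar_hits)) / n
    dy = sum(m[1] - h[1] for m, h in zip(map_landmarks, lidar_hits)) / n
    return (dx, dy)

# Car actually at (10, 5): a landmark mapped at (12, 9) appears at (2, 4)
# relative to the car, and one mapped at (7, 5) appears at (-3, 0).
print(localize([(12, 9), (7, 5)], [(2, 4), (-3, 0)]))  # -> (10.0, 5.0)
```

With a pose estimate like this, the car can look up lane positions and speed zones directly from the stored map, which is why the commenter argues cameras would mainly be needed for traffic lights and for spotting deviations from the map.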

Yup. This is one of the reasons why Waymo’s self-driving cars use cameras, lidar, and radar. Trying to rely on just one type of sensor isn’t the best strategy.

But the problem of “seeing” at night is an even bigger reason why the primary sensors should be either lidar or radar, not visible light cameras. It astonishes me that Elon Musk is being so adamant about trying to rely on cameras for almost everything. It’s just not going to work; it’s not physically possible. This is holding back Tesla’s progress toward Level 4/5 autonomy.

I could be wrong, but I think it takes more concentration to supervise Autopilot than to drive the car yourself, especially if Autopilot calls your attention to a fault on its part with split-second reaction time in a dangerous situation.

Not the case if you have both hands on the wheel. It is so sensitive to a nudge that it gives control back to you. In fact, many complain that it returns control to you too easily. That is by design and should be that way. It really is no different than the command you should wield over cruise control for many years now. The difference is our willingness to wield control. It is still assist. Damn fine assist, but still assist driving.

Sorry, hit the wrong thumb button.

I felt like I was wrestling back control of the steering wheel on an AP1 Model S that I rented. Is AP2 different? This was just after the 8.something nagging update last March.

I completely agree! I don’t think I’d be able to “relax” — I’d have to monitor everything I otherwise must, but in addition have a less predictable car to contend with! Looking at that lane-change attempt, and the way the car pretty rapidly swerves back when it suddenly decides it doesn’t want to change lanes after all, only reinforces my conviction. No way that’s easier than changing lanes yourself! The only effort involved is monitoring the other traffic around you to keep a good overview, and he must still do that. And if you watch closely, the last time it happened in the video, another car was closing in on his left as the car decided to swerve back. Although he caught it with no drama, I don’t think he intended to move as far back to the left as the car already had by the time he was again in control. I’m kind of glad people are shelling out for this, because Tesla would have a very difficult time turning a profit if everyone were like me and you, and didn’t want to pay top dollar for these features. At least as long as they continue to pursue this strategy…

Hi Pet, You should try it 🙂 It is waaaaay easier with AP than without.

In Navigate on Autopilot, you’re supposed to confirm a suggested lane change by activating the turn signal, not the gear selector. See p. 77 of the Model 3 Owner’s Manual. Seems to me like operator error rather than a problem with Autopilot.

Yup… sigh. People would rather spend countless hours filming and editing themselves misunderstanding how things work than spend a few minutes with the owner’s manual learning how things actually work.

How strange for this article to single out Tesla Autopilot.

A much more interesting and informative article could be written on the performance of all types of sensors used for driver assist in cars, not just video cameras, when “looking” directly into the sun. As I understand it, lidar also has significant problems looking into the sun. I don’t know about radar; perhaps this is one area where radar is superior?