
'All This Just For a Waymo To Drive Better': Man Says Tesla Struggles When the 'Sun Is Low.' Then He Proves It

'That's HW3 for ya.'

Tesla struggles in low sun
Photo by: @supvrron/TikTok

Tesla’s Full Self-Driving system is powered by some of the world’s most advanced AI.
So why does it still stumble when the sun hits just right? A viral video has reignited a troubling question: Can Tesla Vision handle the real-world messiness of sunlight, glare, and imperfect roads?

In a clip from EV enthusiast creator Arron (@supvrron), we see inside a Tesla driving in self-driving mode on a sunny day, only for the system to fail, forcing the driver to take control of the vehicle. While there’s no apparent safety issue in the video, it still raises concerns for Tesla drivers and riders.

What HW3 and Tesla Vision Entail


The clip notes that the featured car uses Hardware 3 technology, which is increasingly outdated compared with Tesla’s newer hardware.

Hardware 3, installed in vehicles since 2019, powers the Full Self‑Driving (FSD) system using a custom FSD computer chip designed for high‑speed neural network inference. Starting in 2021, Tesla transitioned to its “Tesla Vision” strategy, dropping radar and later ultrasonic sensors in favor of a strictly camera‑based system, even in adverse weather conditions. That leaves FSD reliant solely on visual feeds, with no redundancy to handle poor visibility scenarios.

Tesla’s official owner’s manual explicitly warns drivers that bright light, such as direct sunlight, can interfere with camera vision and may degrade the system’s ability to detect objects or lane markings. Despite the advanced AI under the hood, Tesla vehicles operating in FSD mode remain classified as SAE Level 2 automation, meaning driver supervision is mandatory at all times.

Sunlight‑Related Failures

Beyond this TikTok clip, Tesla owners have reported FSD failures attributed to glare and sunlight. On Reddit, one owner described FSD misinterpreting a yellow traffic light when sunlight glare hit the signal directly, a mistake that could significantly increase risk.

YouTube walkthroughs and troubleshooting videos also reference scenarios where direct sunlight on B‑pillar cameras triggers error messages like “check camera” even when the lenses are clean, indicating the algorithm struggles with washed‑out or high‑contrast frames.

In October 2024, the National Highway Traffic Safety Administration (NHTSA) opened a preliminary evaluation into Tesla’s FSD system across roughly 2.4 million vehicles, focusing on incidents tied to sun glare, fog, and airborne dust. Among the reported collisions was a fatal pedestrian crash in November 2023 near Rimrock, Arizona, where sunlight was explicitly cited as a contributing factor.



This probe marks a shift in focus: Regulators are evaluating whether Tesla’s camera‑only approach is inherently insufficient in reduced visibility, and whether software updates have adequately mitigated those risks. The investigation is especially notable given Tesla’s removal of radar in favor of vision-only systems, while many competitors maintain sensor fusion with lidar or radar backup for resilience in edge cases.

How Sunlight Disrupts FSD

Tesla’s Full Self-Driving system, built on the camera-only Tesla Vision platform, can struggle in direct sunlight, particularly in vehicles equipped with the older Hardware 3 platform. Glare can wash out images from forward-facing cameras, making it difficult to accurately interpret traffic signals, lane markings, or vehicles ahead. 

These interruptions often stem from the limitations of Tesla’s neural networks, which may falter when confronted with high-contrast lighting conditions not well represented in training data. Factors like smudged lenses, windshield glare, or early morning dew can compound the problem, causing the system to either misclassify its environment or err on the side of caution by handing control back to the driver.
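The washout effect described above can be sketched in a few lines. This is a toy model, not Tesla’s actual pipeline: the `add_glare` and `lane_contrast` helpers below are hypothetical, and the model assumes an 8‑bit camera sensor where glare acts as stray light added uniformly across the frame. Because bright features sit closer to the sensor’s ceiling, they clip first, and the contrast that makes a lane marking detectable collapses.

```python
import numpy as np

def add_glare(frame, veiling):
    """Model veiling glare as uniform stray light that clips
    at the sensor's maximum value (8-bit: 255)."""
    washed = frame.astype(np.int64) + veiling
    return np.clip(washed, 0, 255).astype(np.uint8)

def lane_contrast(frame):
    """Brightness gap between the lane marking (column 4) and the
    asphalt (column 0) -- a stand-in for detectability."""
    return int(frame[:, 4].mean()) - int(frame[:, 0].mean())

# Toy "road" frame: dark asphalt (value 60) with one bright
# lane marking (value 200) down the middle.
frame = np.full((4, 9), 60, dtype=np.uint8)
frame[:, 4] = 200

print(lane_contrast(frame))                 # 140: easy to detect
print(lane_contrast(add_glare(frame, 100))) # 95: marking clips first
print(lane_contrast(add_glare(frame, 200))) # 0: both clip to white
```

At high glare levels the marking and the asphalt saturate to the same value, so no downstream network, however well trained, can recover the distinction from that frame alone.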

Without a redundant sensor suite like those used in other autonomous platforms that rely on radar, lidar, or sensor fusion, Tesla’s system lacks an alternative method to navigate when vision falters. A momentary camera failure can result in immediate system disengagement, raising questions about how ready the technology is for true autonomy.
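The fallback logic at issue can be illustrated with a deliberately simplified sketch. None of this reflects any real autonomy stack: the `Reading` type, the confidence threshold, and both decision functions are hypothetical, meant only to show why a blinded camera forces a vision-only system to disengage while a fused system can keep an estimate alive.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Reading:
    distance_m: Optional[float]  # None = no usable data this frame
    confidence: float            # 0.0 .. 1.0

MIN_CONFIDENCE = 0.5  # hypothetical acceptance threshold

def camera_only(camera: Reading) -> Optional[float]:
    """Vision-only stack: if the camera is blinded, there is
    nothing to fall back on -- the system must disengage."""
    if camera.distance_m is None or camera.confidence < MIN_CONFIDENCE:
        return None  # hand control back to the driver
    return camera.distance_m

def fused(camera: Reading, radar: Reading) -> Optional[float]:
    """Fusion stack: a degraded camera frame can be backstopped
    by radar, which is largely unaffected by glare."""
    estimate = camera_only(camera)
    if estimate is not None:
        return estimate
    if radar.distance_m is not None and radar.confidence >= MIN_CONFIDENCE:
        return radar.distance_m
    return None

# Low sun washes out the camera; radar still sees the car ahead.
blinded = Reading(distance_m=None, confidence=0.1)
radar_ok = Reading(distance_m=42.0, confidence=0.9)

print(camera_only(blinded))      # None -> immediate disengagement
print(fused(blinded, radar_ok))  # 42.0 -> keeps tracking the lead car
```

In the vision-only case a single washed-out frame leaves the function with nothing to return, which is exactly the momentary disengagement drivers report in low sun.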

Tesla began deploying Hardware 4 in 2023, with improved processing power and upgraded cameras, but these vehicles still operate on software models trained for HW3. In parallel, regulatory scrutiny is increasing. In August 2025, a Florida jury awarded $243 million in a wrongful death case involving Autopilot, highlighting the stakes of Tesla’s vision-only strategy as it pushes toward a future of robotaxis.

InsideEVs reached out to Arron via direct message for comment. We’ll update the story if we hear back.

 