‘Take Over Immediately’: Man Updates His Tesla. Then He Puts It Into ‘Mad Max’ Mode

Part comedy, part cautionary tale

Photo by: @saintzabi/TikTok

Tesla calls it Mad Max mode, but it’s the humans who end up sounding terrified. In a viral TikTok clip, a driver and his friend test the Autopilot feature, which is supposed to handle traffic “aggressively.” What follows is part comedy, part cautionary tale about just how confident people have become in letting their cars do the driving.

TikTokker Zeb Shefa (@saintzabi) goes from curious to rattled moments after trying his Tesla’s Mad Max mode, which tunes the car’s driver-assistance system to its most aggressive setting. Zooming around other motorists on the freeway, Zeb and his passenger wonder aloud whether the car’s new option goes too far.

“Look at this [expletive] cut off at 95 miles an hour,” he says in the clip, which has been viewed more than 52,000 times. “I'm about to take this down a notch.”

What Tesla’s ‘Mad Max’ Actually Does

But what exactly is Mad Max mode? And what does it say about the state of advanced driver-assistance systems in the EV world? As Tesla quietly reintroduced this aggressive profile in its latest update, the carmaker has drawn fresh scrutiny from regulators and safety experts who wonder whether “fun upgrade” and “public risk” have overlapped.

According to reporting, Tesla’s new “Mad Max” driving profile applies a more assertive lane-change and overtaking style than the existing “Standard” or “Hurry” modes, allowing tighter gaps, more frequent maneuvers and higher relative speeds. A profile once buried in early Tesla Autopilot betas around 2018 has now re-emerged in the latest Full Self-Driving (FSD) software update.

For Zeb and his friend, the experience was half thrill ride, half wake-up call. In the clip, the car flashes its own warning, “Take over immediately,” just as the driver yells the exact same phrase in disbelief. That reveal underscores the reality: Even in “aggressive” mode, the system still expects human oversight.

The regulatory backdrop is already evolving. The National Highway Traffic Safety Administration has opened multiple investigations into Tesla’s Autopilot and FSD systems. One current probe covers roughly 2.4 million vehicles after reports of collisions involving Tesla’s FSD software. Earlier investigations flagged hundreds of crashes, including at least 13 deadly ones, in which Teslas operating with Autopilot engaged struck first-responder vehicles or crashed in otherwise avoidable circumstances. Among the NHTSA’s concerns are whether Tesla’s systems properly ensure that drivers remain engaged and whether certain automated modes may invite misuse.

Against that regulatory backdrop, Mad Max mode raises new questions about marketing, perception and real-world deployment for a company whose brand is built on automation and “future” promises. The name itself signals high confidence and high aggression.

Reports suggest the mode retains the requirement that the driver keep hands on the wheel and eyes on the road, but retunes the system’s lane-change logic to behave more assertively. That raises a question: When the car drives more like an impatient human commuter, does the driver relax instead of staying vigilant?

Safety, Oversight, And The Human Handoff

Tesla drivers and enthusiasts treat the experience as two things at once: part early-adopter experiment, part entertainment. In the TikTok clip, the passenger laughs, the driver shouts, and the car switches lanes rapidly. It’s a spectacle that’s equal parts tech demo and thrill ride. But for safety experts, it’s also emblematic of the “handoff” challenge in advanced driver assistance: A system that increases capability may paradoxically decrease vigilance. Academic research into Tesla’s Autopilot shows drivers may disengage automation when they sense unusual behavior, or conversely, they may over-trust it when it behaves well, escalating risk.

Still, Tesla maintains in its vehicle safety report that when Autopilot is active, the collision rate is “even lower” than when it is not, citing global mileage data. The key phrase is “when active,” which leaves open the larger question of whether users are ready for the responsibility that remains while the system is automated.

For those watching the TikTok clip, the fun lies in the absurdity of a car weaving through traffic at 95 mph while the human occupants yell in panic. For EV-interested readers, the takeaway is more layered: This is a moment where marketing (“Mad Max”), capability (assertive lane changes) and human behavior (distraction, thrill-seeking, hands-off comfort) converge. The question is whether the framework around it, from supervision requirements and warnings to driver training and safe deployment, holds up.

As Tesla continues to push its automation agenda and roll out new software versions, what happens on roads like Zeb’s highway clip may serve as a public test bed. The laughter and fear in the video are real, as are the technology and the stakes.

InsideEVs reached out to Zeb via email and direct message. We’ll update this story if he responds.
