‘Send Help’: Woman Says Her Husband Talks To His Tesla Like It’s A ‘Real Person.’ It’s Not Just Him
Modern technology, especially artificial intelligence, is reshaping the way people relate to machines.
The line between driver and car used to be made of metal and glass. Now it talks back.
In a viral TikTok clip, a Tesla owner chats with his car about home audio systems as if they’re old friends, and the car answers with polite, algorithmic confidence. It’s funny, but it’s also a glimpse into something stranger: the rise of emotional relationships with machines.
TikToker Hannah Gamble (@han_gam) shows a mix of humor and concern in a recent clip filmed during a drive in the couple’s Tesla, in which her husband asks the car’s central computer for advice on beefing up their home audiovisual setup.
“Does anyone else’s husband talk to their Tesla like it’s a real person/their bestie?” she asks viewers in text overlaying the video, which has been viewed more than 4,700 times.
The phenomenon might seem like a charming quirk, but it captures something profound about how modern technology, especially artificial intelligence, is reshaping the way people relate to machines. What used to be a clear line between human and tool has blurred. And as systems like ChatGPT, Alexa, Siri and even Tesla’s in-car assistant grow more responsive, that blurring is happening more frequently and more intimately.
Anthropomorphism, Explained
At the core of this shift is a well-documented human tendency: anthropomorphism. For decades, researchers in human-computer interaction and cognitive psychology have studied how people project personality, intent, and even emotional life onto machines, especially when those machines talk back. A 2022 conceptual review in Frontiers in Psychology notes that anthropomorphism becomes more likely when technology appears autonomous, social or emotionally expressive, even in subtle ways.
The rise of natural language interfaces has dramatically accelerated this effect. Unlike the button-driven GPS units of the 2000s, today’s assistants actually respond. When an AI speaks in a calm, helpful tone, remembers preferences or tailors its answers, users are more likely to treat it as a partner rather than a tool. The Nielsen Norman Group, a leading authority on user experience design, has outlined how even small design choices, such as giving a voice assistant a name, a gendered voice or humorous responses, can trigger human-like social reactions from users.
But this goes beyond quirky human psychology. There’s mounting evidence that people are actively forming relationships with AI, sometimes in ways that rival or even replace traditional social bonds. According to a 2024 report by Common Sense Media, 73% of U.S. teenagers have used AI companions such as Replika or My AI on Snapchat. More strikingly, 31% said those conversations were as satisfying or even more satisfying than talking to a human friend.
The trend isn’t limited to teens. A 2023 study published in the Journal of Consumer Research found that people experiencing loneliness or stress were more likely to turn to chatbot companions and reported emotional relief after doing so. That emotional satisfaction isn’t just imagined. Researchers have found that users often assign feelings, backstories and even moral expectations to systems that exhibit any kind of conversational or social feedback, regardless of their actual capabilities.
While Tesla’s infotainment system doesn’t yet match the complexity of a large language model like GPT, the car itself functions as a uniquely immersive platform. Drivers spend hours alone in their vehicles, surrounded by screens, voice controls and intelligent navigation systems. Many of those systems are increasingly capable of engaging in light conversation, handling requests with natural phrasing and adjusting in real time.
That level of perceived intelligence, combined with physical proximity and routine exposure, creates fertile ground for emotional attachment.
Tesla’s Talking Cars
Tesla’s voice command system already allows users to ask for a wide range of tasks, from opening the glove box to adjusting climate control or navigation. As of 2023, Tesla updated its software to allow for more natural phrasing, reducing friction and making conversations feel less like issuing commands and more like casual dialogue. The company has even teased future integration of more advanced AI models, though no timeline has been confirmed.
As users grow comfortable speaking to their vehicles as if they were old friends, automakers face new questions about how these systems should behave. Should in-car assistants have personalities? Should they remember emotional cues or offer affirmations? And what happens if drivers begin relying on them for advice that strays beyond the functional, as in Gamble’s video?
There’s also growing concern about the psychological and ethical risks of this new intimacy with machines. The Center for Countering Digital Hate released a 2024 report titled “Fake Friends,” warning that AI systems can exploit human loneliness by creating the illusion of connection without offering real support or accountability. The report draws parallels between social media algorithms and AI companions, suggesting that both are optimized for engagement rather than well-being.
TIME Magazine recently explored the phenomenon in classrooms, noting that some students had begun turning to AI tutors not just for homework help but also for emotional support.
Meanwhile, Tesla’s own position on these interactions remains ambiguous. The company markets its cars as intelligent and intuitive, but it has not publicly addressed whether users increasingly see them as “social” beings.
Other automakers, however, are leaning into the anthropomorphic angle. Mercedes-Benz’s MBUX voice assistant, for example, uses emotional tone detection and can be given a name. BMW’s Intelligent Personal Assistant responds to phrases like “Hey BMW” and is marketed as a companion that “gets to know you better every day.”
It’s not hard to imagine a near future where cars come with customizable personalities, daily check-ins or built-in mental wellness prompts. But if that future arrives, we’ll need to ask whether these relationships are truly helping or whether they’re a sleek, voice-enabled simulation of something deeper.
InsideEVs reached out to Gamble via direct message. We’ll update this story if she responds.