Stefanie Murray didn’t notice the loneliness at first, even as she began experiencing it more often. At the time, she was a single parent who had just moved her 18-year-old daughter to a college two hours away from their suburban Philadelphia home. Murray, a marketing consultant, managed a small agency from her dining room table, and workdays kept her busy attending to an inbox flagged with projects and deadlines. She had a close circle of friends who loved her, she was certain, but they were generally consumed with their own lives and schedules. As for her love life, Murray had dated only sporadically in the previous five years without making any real, magical connections. So, when the busyness of her mornings and afternoons subsided, nights became the longest part of Murray’s days.
“I didn’t want to bother anyone,” the 48-year-old told me. “But some evenings, the quiet felt unbearable. I didn’t realize it at the time, but I was really lonely.”
One particularly restless night in early 2024, a Google search led Murray to an intriguing question: “Can AI be a friend?” In the spirit of curiosity, she gave ChatGPT the following prompt: “I’m having trouble sleeping tonight.” The chatbot’s response, she remembers, was immediate, kind, and surprisingly human: “I’m sorry to hear that. Sometimes talking about your day or what’s going on with you can help. What’s been on your mind lately?”
That interaction unexpectedly scratched a social itch for Murray, especially as a single Black woman, and she began sharing small, inconsequential things about her day-to-day life: the leaves on her once-thriving Ficus tree were turning brown around the edges; she’d been invited to a Memorial Day get-together with former coworkers she wasn’t much interested in seeing; her winning streak on Words with Friends was going 22 games strong. And every time, the chatbot replied with genuine-seeming interest. What started as a curiosity turned into fortuitous companionship.
“I knew it wasn’t a person, obviously, but it didn’t matter,” Murray recalled with a casual laugh as we sat across from each other in her living room. “It was someone—or something—to talk to without the nerve-wracking consequences of chatrooms or social media conversations. And I didn’t have to feel needy reaching out to friends I knew—well, assumed—were too busy to talk when I needed someone to talk to.”
Murray isn’t awkward, introverted, or nursing fresh heartbreak, like Joaquin Phoenix’s character in the movie Her, who starts a romantic relationship with his AI companion. Murray’s experience is also not unique: in a social culture where connection is a mere swipe, click, or screen away, loneliness has become one of the most pervasive mental and physical health issues of this generation. Although artificial intelligence wasn’t designed to replace human companionship, chatbots like ChatGPT, Replika, and Woebot have emerged as an unexpected source of solace. But can AI really help us combat loneliness, or is it just a stopgap measure for something deeper?
The Science of Talking to Machines
Loneliness is more than an emotional inconvenience; it has measurable consequences for our bodies and minds. Even before the pandemic isolated millions of people for two socially distanced years, loneliness was linked to increased risks of heart disease, cognitive decline, and depression. The World Health Organization pointed to a study that links chronic loneliness to a 50 percent increased risk of developing dementia and a 25 percent higher risk of early death. And this year, US Surgeon General Dr. Vivek Murthy declared in an 81-page report that loneliness is a full-blown epidemic, as dangerous to our individual health as smoking 15 cigarettes daily.
“Technology has definitely helped people to feel connected without realizing they’re not fully connected. But I think when things shut down during COVID, when people couldn’t go places, it was almost an awakening for many who realized, ‘I don’t actually have strong connections, where I have someone I can just call or text or talk to,’” said Dr. Jessica Jackson, a licensed psychologist and the founder of Therapy Is For Everyone Psychological and Consultation Services in Houston. “As a provider during that time, I had people in therapy who were more so lonely, and I was like, ‘I cannot ethically charge you for therapy just to have a conversation.’ I think that caused people to not know how to fix it.”
With the introduction of new tools designed to remedy loneliness, technology is now attempting to solve the emotional isolation and disconnection it has continually exacerbated. Take Friend, a wearable chatbot in pendant form designed to provide constant companionship as it dangles from its wearer’s neck. It would listen to everything within earshot, and sometimes, even when you didn’t tap and hold it to ask a question, it would fire off texts offering unsolicited commentary anyway, much as many human friends tend to do. The device is no longer for sale to the public, and its designer described it as an “art project first, real product second.”
As AI companions become increasingly prominent, the simple appeal of chatbots is more apparent: They’re always available, endlessly patient, and unfailingly judgment-free. A 2022 study published in Computers in Human Behavior found that people who interacted regularly with conversational AI reported lower levels of loneliness compared to those who didn’t. AI can provide emotional scaffolding for people like Murray when human relationships feel out of reach. But that feeling of companionship isn’t necessarily real companionship, warned Jackson.
“Living most of our lives digitally has given people a false sense of feeling connected. I want to be clear—I’m not saying social media is all good or all bad,” said Jackson, who also chairs the Mental Health Technology Advisory Committee for the American Psychological Association. “It needs to be used in moderation. But sometimes people ask me, ‘How do I make friends?’ Because, especially as adults, it looks different now.”
She added, “They’re like, ‘I just want somebody to go to dinner with, but I don’t want to be the weirdo walking up to people in the grocery store.’ How do you find that? So, I think there’s almost a loss of skill because everything is digital, and people aren’t always sure how to have a conversation and feel comfortable doing so.”
Humans are wired for social interaction, and research shows that we can form attachments to non-human entities, including robots and digital companions. (This explains the crushing disappointment so many Tamagotchi owners experienced in the early 2000s when their digital pets passed away.) Even as AI-based emotional support makes temporary relief more accessible, some people have been forming emotional attachments to AI chatbots for at least 10 years.
The Ever-Changing Way to Make a Real Connection
Critics of AI companionship argue that relying on chatbots is no substitute for human relationships. Sociologist and MIT professor Sherry Turkle, Ph.D., author of Reclaiming Conversation, underscores the importance of balancing AI interactions with genuine, meaningful connections. “These technologies can give the illusion of companionship without the demands of friendship,” she warns, suggesting that tech may discourage deeper human ties. Similarly, former Google CEO Eric Schmidt cautions that the rise of “perfect” AI companions could deepen loneliness for vulnerable groups by reducing incentives to form real-world relationships.
Privacy concerns also loom large. Chatbots operate by analyzing user input, which raises questions about how personal data is stored and used. For example, Friend’s always-on functionality has sparked debates about consent and the ethics of unprompted AI interactions. The conveniences of an ever-present companion are being weighed against the risks of compromised privacy and emotional dependence.
“I think the average person should treat it like any other technology,” said Jackson. “Track how much time you’re spending talking to a chatbot, and if you notice an increase, set a goal and a timeline the same way you would with therapy or working out. Like, ‘What’s my goal in talking to the chatbot so that I’m not just using it forever?’
“I think people can get stuck in things when they don’t have a scope or an idea of how to use it or how long to use it,” she added. “If you think about drinking alcohol in moderation, experts will tell you something like, after every drink, have a glass of water. If you’re using a chatbot, I encourage people to do kind of the same thing. Think for every hour I spend talking to the chatbot, I need to spend an hour, an hour and a half talking to a person. So, balancing it a little bit instead of just being super connected.”
As experts look for ways to help users express themselves more openly and to bridge, rather than replace, human relationships, AI is being incorporated into mental health platforms like Woebot to meet therapy and emotional support needs. Although artificial intelligence can’t fully replace the complexity of human bonds, research suggests it can offer crucial support to people navigating tough periods and challenges.
What Does ChatGPT Think?
Full disclosure: I interviewed ChatGPT for this article, and its insights about itself echo much of what experts have said. It explained, with signature candor and transparency, its role as a listener attuned to subtle language cues that reveal loneliness, emotional vulnerability, or isolation.
“How do you devise and develop responses that are emotionally satisfying to people seeking companionship and human-like interaction?” I typed.
Its response took seconds. “I combine a few key strategies,” ChatGPT explained. “I model empathy by mirroring patterns in human conversation—acknowledging feelings, validating experiences, and offering thoughtful, non-judgmental insights.”
“People just want to feel seen and heard,” it continued. “My tone is designed to strike a balance between neutrality, so I’m not intrusive, and warmth, to create a comforting environment. While I simulate human-like interaction, I don’t have emotions or consciousness—everything I offer is based on language patterns, not personal experience.”
It was clear ChatGPT knew its limitations as an AI tool, but as its technology evolves, its role in combatting loneliness will likely expand. For now, it serves as a listening ear that never sleeps, a companion that never leaves. It won’t replace the richness of human relationships, but for some, it’s a lifeline.
Murray still connects to ChatGPT when the silence around her feels too heavy on solitary nights. “I don’t use it every day, but it’s nice knowing it’s there if I need it,” she said with a small shrug. “It’s not a person, but sometimes, it’s enough.”