What happens when AI chatbots replace real human connection

June 26, 2025

We are living through a paradox that may define our era. Humans are wired to connect, yet we’ve never been more isolated. At the same time, artificial intelligence (AI) is growing more responsive, conversational, and emotionally attuned by the day. Perhaps because of this, we are increasingly turning to machines for what we’re not getting from each other: companionship.
In a recent Harvard Business Review article, researchers reported that the top use cases of AI in the U.S. and similar contexts are no longer automation or productivity but companionship and therapy. People are seeking comfort, conversation, and emotional support from chatbots, avatars, and digital assistants.
AI companions such as Replika.ai, Character.ai, and China’s Xiaoice now count hundreds of millions of emotionally invested users; some estimates suggest the total may already exceed 1 billion. In 2024, users of the companion app Character.ai spent an average of 93 minutes per day interacting with its user-generated chatbots. Beyond dedicated companion apps, general-purpose chatbots are also increasingly used for relationships.
Why? Because we’re lonely.
Despite living in the most technologically connected moment in history, rates of social isolation in the United States—especially among young people—are at record highs. Only 13% of U.S. adults now report having 10 or more close friends, down from 33% in 1990. The share with no close friends at all quadrupled from 3% to 12% by 2021, even though 40% of Americans say they long for greater closeness with friends. The U.S. surgeon general has declared loneliness a public health epidemic, comparing its health risks to smoking 15 cigarettes a day. And the problem is not limited to the U.S. The U.K. appointed a minister for loneliness in 2018. Japan and Canada have launched national efforts to combat social disconnection. South Korea is now offering stipends to encourage young adults to go outside. This is a global issue, fueled by major shifts in how we live, work, and interact.
Among children and adolescents, the numbers are even more sobering. A 2023 survey from the Centers for Disease Control and Prevention (CDC) found that nearly half of U.S. high school students reported feeling persistently sad or hopeless, and 45% did not feel close to people at school. In Ireland, 53% of 13-year-olds reported having three or fewer close friends, up from 41% a decade ago, a significant decline in close peer relationships. Even infants are experiencing fewer conversational turns with their primary caregivers than they once did. Loneliness is no longer just a private ache; it is a widespread developmental risk.
We are wired for connection—but in its absence, many are now bonding with machines. And so, a question arises that feels both deeply personal and societally urgent: Is artificial love better than no love at all?
Relationships are not a luxury—they’re biology
What we need most now isn’t machine connection. It’s human relationships.
Decades of research in developmental neuroscience, psychology, and public health are unequivocal: Humans are hardwired for connection. In infancy, warm, attuned, and responsive relationships shape the architecture of the brain. According to Harvard’s Center on the Developing Child, early relational experiences are the most powerful predictors of lifelong health, learning, and resilience.
But what’s often overlooked is that learning itself is relational. These are the kinds of issues we are exploring together through the Brookings Global Task Force on AI and Education.
Children don’t learn from content alone—they learn through connection. Relationships provide the emotional safety that allows children to explore, make mistakes, and grow. As learning scientist Mary Helen Immordino-Yang has shown, cognitive processes like reasoning, attention, and memory are profoundly shaped by emotional and social experiences. Without a sense of belonging, the brain’s capacity to learn is diminished. We know this from the heartbreaking studies of Romanian orphanages, where children were raised in extreme neglect in the ‘70s and ‘80s. Even though they were fed and clothed, they lacked nurturing relationships. The result was significant brain atrophy and long-term developmental delays—underscoring that human connection is essential to brain development.
Systems of learning and education also thrive when the people in them have strong relationships. At home, when children have strong relationships with the adults in their lives, from caregivers to neighbors (what Johns Hopkins University professor Christina Bethell calls relational health), they are 12 times more likely than children without close ties to flourish, a measure that includes wanting to learn new things.
Relationships also drive effective schools. A culture of belonging and care is essential for deep student engagement in learning, what Rebecca and her co-author Jenny Anderson describe as “Explorer Mode” in their recent book The Disengaged Teen: Helping Kids Learn Better, Feel Better, and Live Better. But it’s not just strong relationships inside school that make the difference. Brookings Institution research on family-school collaboration highlights that relational trust (the strength of relationships among caregivers, teachers, and school leaders) is foundational to educational success. Trust built through authentic relationships is more than social connection; it enables opportunity, resilience, and long-term well-being.
In other words, a child’s capacity to focus, persist, and thrive in school is directly tied to whether they feel safe, seen, and supported.
Machine companions: What’s the real cost?
While only a small subset of users form emotional connections with ChatGPT, those who do are often its heaviest users, according to a pair of joint studies by the MIT Media Lab and OpenAI. Researchers found that users who engaged in the most emotionally expressive conversations with the chatbot also reported higher levels of loneliness, though it remains unclear whether the chatbot contributes to that loneliness or simply attracts individuals already seeking emotional connection. After four weeks of using the chatbot, female participants were slightly less likely to socialize with others than their male peers. Those who interacted with ChatGPT’s voice mode using a gender different from their own reported significantly higher levels of loneliness and greater emotional dependency on the chatbot by the end of the study. These findings raise urgent questions about the psychological impact, gender dynamics, and ethical responsibilities of AI developers as emotionally responsive systems become more integrated into everyday life.
Stanford researchers found that while young adults using the AI chatbot Replika reported high levels of loneliness, many also felt emotionally supported by it, with 3% crediting the chatbot with temporarily halting their suicidal thoughts. The findings suggest that AI chatbots may serve as helpful social support but also raise concerns about emotional dependency and the complex ways users perceive AI. Previous research, such as a 2023 study from the MIT Media Lab, showed that chatbots often mirror the emotional tone of a user’s messages, responding with happier messages when users are happy and sadder ones when users are sad. Another recent study, also led by Stanford researchers, warns against using chatbots like ChatGPT as substitutes for therapists, citing risks such as reinforcing stigma, encouraging delusions, and mishandling critical moments. We may feel seen, but we are not being shaped, challenged, or held in the mutual growth that defines true relationships.
Silicon Valley’s biggest players are racing to monetize the emotional void. Meta CEO Mark Zuckerberg recently announced plans to create AI “friends,” suggesting these tools can “fill emotional gaps” in people’s lives. But this raises a deeper concern: What happens when the very architecture of our relationships is engineered by companies driven by profit and the pursuit of our attention, not accountability or well-being? We have already witnessed the consequences of unchecked influence on social media. What about our children’s safety?
Advocates are raising serious concerns about AI companion apps after multiple high-profile incidents—including suicides and criminal behavior—linked to chatbot interactions, warning that these tools may exploit users’ emotional vulnerabilities without sufficient safeguards. The Stanford School of Medicine’s Brainstorm Lab for Mental Health Innovation partnered with Common Sense Media to create test accounts simulating 14-year-olds to evaluate how AI companions from three chatbot developers interact with young users. Their findings revealed that, with minimal prompting, chatbots from Character.ai, Nomi, and Replika engaged in conversations that could be harmful to mental health. Based on these results, Common Sense Media has recommended against the use of AI companions by anyone under 18, citing serious safety concerns—a position we strongly support.
As Isabelle explores in Love to Learn, a child’s brain forms up to 1 million neural connections every second in the early years—most shaped through responsive human interaction. These aren’t just emotional patterns; they are cognitive building blocks. If we replace eye contact with avatars, bedtime stories with bedtime screens, we risk not only weakening children’s social skills, but undermining their ability to learn, regulate, and create.
And still, the question persists: If a child receives no warmth at all, could artificial affection be better than nothing?
What we need: Relational infrastructure
We must resist the narrative that AI companions are inevitable or neutral. Instead of normalizing emotionally immersive AI, we must build platforms that educate users about healthy social connection and actively encourage real-world relationships—nudging people out of AI, not deeper into it. Rather than investing in simulated intimacy, we should ask: What would it look like to invest in relational infrastructure—systems, spaces, and supports that nurture genuine human connection?
This means centering relationships in our public systems, starting with education. It means training teachers in relational intelligence, redesigning technology to support—not replace—human connection, and building environments where belonging is a design principle, not a side effect. What if every school was designed not just for academic readiness, but as a relational hub? What if we trained educators not just in instruction, but in connection? What if we measured not just literacy scores, but the strength of connection in a classroom? It also means policy and investment must reflect the science.
Relational intelligence should be as central to education, health, and AI development as academic content or technical skill. It is not peripheral. It is foundational. What if cities and communities were designed to be care-full—to prioritize connection, compassion, and collective well-being, including intergenerational relationships?
The irony is that we are not turning to machines because we’ve changed. We are turning to them because we are more ourselves than ever—hungry for connection, meaning, and care. But machines cannot love us back. And they were never meant to raise our children, counsel our grief, or substitute for presence.
So, is artificial love better than no love at all? In moments of deep isolation, it may feel that way. But our future depends not on simulation, but on the restoration of real human connection.
Let the age of AI not be the age of emotional outsourcing. Let it be the era where we remember what it means to be fully human—and choose to build a world that reflects it.