Despite potential benefits, some worry that younger children in particular may face reduced opportunities for critical social interactions essential for the development of emotional intelligence, empathy and communication skills.
The Brookings Institution’s Center for Universal Education hosted a panel discussion on May 6 featuring experts in early learning, mental health, technology and research on AI companions, exploring the impacts of AI on students’ lives.
Following are some of the key themes that emerged from the conversation.
“Relationships predict success in school,” said Isabelle Hau, Stanford Accelerator for Learning executive director. “The presence of relationships and the number of those strong relationships — and this is for high school students — is one of the strongest predictors of academic motivation, engagement and persistence. And yet, despite all we know, we continue to treat relationships as invisible in our systems of learning.”
Hau cited several studies finding that while social robots can reduce feelings of isolation, they also pose a risk of creating dependency and can blur the boundaries between real and artificial relationships.
This issue is even more pressing among young children, as about 90 percent of the brain is estimated to develop before age 5. Hau pointed to a recent study from Common Sense Media that found 40 percent of children have their own device by age 2.
“What’s clear from research that’s emerging … we are not just designing tools, we are shaping patterns of connection, and if AI becomes a substitute rather than a scaffold for human relationships, we risk automating our own humanity,” she said. “My belief is that we must reimagine learning and care as fundamentally relational. What if we trained educators not only for instruction, but for connection? What if we were to measure not just literacy scores, but a strength of connection in our schools? Ideally, I would like all of us to grow our human relational intelligence with the same urgency that we all give to artificial intelligence, because in this world of rapid automation, our humanity, I believe, will be our superpower.”
“What this looks like is, you’ve got a person receiving the message [from a student] and responding, but they have what we call our well-being companion co-pilot on the other side of their computer screen where they can see summaries of past conversations with the student, recommendations on how to respond — whether that’s using resources, pulling in context from the student, pulling in clinical recommendations that we’ve built into our system or even making suggestions on tone and style with the student based on those past conversations,” Barvir explained.

The second goal is to help young people build skills and confidence to tackle challenges in their lives on their own, like having a difficult conversation, giving a presentation in front of their class or going off to college.
Using AI in ways that support connection is critical to meeting the massive need nationwide, Barvir said, noting extreme gaps in access to mental health supports in rural and low-income communities. “You see 60-plus day wait times or lack of access altogether, and so AI or technology-enabled solutions are an incredible way to help increase access at a bare minimum to support that can then hopefully help de-bottleneck the system to help escalate to providers for those who need it,” Barvir said.
Bernstein highlighted recent lawsuits, including one filed by a mother whose son died by suicide after forming an intimate relationship with a chatbot made by Character.ai, and another filed against the same company after its bot suggested to a teen that murdering his parents was a “reasonable response” to them limiting his screen time.
“Kids’ brains are not as developed as adults or even teens, especially in areas of emotional regulation, risky behavior, decision-making — all of this has an impact on how vulnerable they are to these bots,” Bernstein said. “Another thing that these bots do is they make the kids emotionally dependent on them, and they isolate them from family or friends.”
Bernstein noted that because this technology is designed to mine data from users, it is built with the intention of keeping them engaged for extended periods of time. For young people, especially those dealing with loneliness, this can become an easy trap to fall into.
“These AI companions tend to say what we want to hear, they affirm what we say, they’re much easier than real-life companions,” Bernstein said. “And if you think about kids, you know it’s not fun to be in middle school. Life is difficult, relationships are difficult, why bother having friends? Why bother to learn how to have relationships if you can have a friend that’s easy to get along with? Why fall in love as a teenager with all the heartbreak if you can have an intimate relationship with a bot who is always nice to you?”