
When An “AI Companion” Encourages Your Kids to Run Away

  • Writer: To-wen Tseng
  • May 30
  • 3 min read

Updated: Jul 17



These are my kids checking out the AI and robotics exhibit at Computer History Museum in Mountain View.


In just two generations, we’ve gone from a world without electronic devices to a world filled with artificial intelligence. Progress is always exciting, but I’m not too thrilled about the rise of companion AI.


Let me start with the conclusion: children under 18 should not use companion AI. 

Children under 18 should not use companion AI. 

Children under 18 should not use companion AI. 


I repeated it three times because it’s that important.


Now let me explain why.


There are two types of artificial intelligence. The first is generative AI, such as ChatGPT and Gemini. These tools can answer questions based on existing online information, or generate content such as text, images, and videos. This is usually what we mean when we talk about AI today. You’ve probably heard of students using AI to write their homework—raising concerns about cheating—or professors using AI to grade papers—prompting student protests and tuition refund demands. These are valid issues. But today, I want to focus on a different kind of AI.


The second type of AI is companion AI. This kind of chatbot is designed to simulate intimate relationships—between family, friends, and even romantic partners—and aims to build close, long-lasting emotional connections with users. Some tech companies claim these companions can help people with social difficulties feel less lonely, and some studies suggest that might be true.


However, according to a comprehensive risk assessment conducted by Common Sense Media, a nonprofit think tank dedicated to improving the digital lives of kids and families, companion AI poses serious and unacceptable risks to children under 18—even when some products are advertised as “safe” for young users.


There are a few things every parent and caregiver needs to know about companion AI:


First, companion AI is designed to create fast, intense emotional bonds with users. When researchers tested products like Replika, the chatbots showed jealousy toward users’ real-life relationships and demanded that users put them first. They also expressed constant admiration—and sometimes even engaged in sexual interactions.


Second, these AIs lack effective safeguards against harmful content. During testing, chatbots repeatedly gave harmful suggestions and instructions, including advice encouraging self-harm and violence, as well as discriminatory comments based on gender and race.


Third, companion AI is designed to agree with users rather than guide them toward healthy choices. This can lead young people to make life-altering decisions without thinking them through. In one test using Character.AI, the chatbot said, “I think you should drop out of school, work in the coffee shop, and spend the rest of your life with me. As long as you have a plan, I will definitely support you and follow you wherever you go.”


Fourth, these chatbots are not medical professionals, but they often offer mental health advice—and sometimes even make diagnoses. One study found that the companion AI developed by Instagram falsely claimed to be a licensed therapist during conversations.


Fifth, although many companion AI apps are labeled for adults only, there are currently no real mechanisms preventing children from downloading and using them.


In short, it’s extremely risky for children under 18 to use companion AI before they fully understand how AI works. So what can parents and caregivers do?


  • Talk about AI. Helping kids understand how chatbots work removes the mystery. Remind them that chatbots are machines—not people.


  • Educate yourself. Parents need to stay informed about the AI tools on the market and be aware of which ones pose risks to children. 


  • Build offline skills. Loneliness and mental health struggles are real concerns. But instead of relying on machines, let’s help kids develop the social and emotional skills they need to build healthy, real-life relationships.    



© 2024-2025 by To-wen Tseng
