Dive Brief:
- Children and teens will soon be barred from the primary chat feature on Character.AI, a popular app that allows users to develop relationships with artificial intelligence companions, the company announced Wednesday.
- Character.AI said it will phase out the ability for users under 18 to engage in open-ended chats with AI characters on its platform by Nov. 25. The move comes at a time when youth media safety and mental health organizations are raising red flags about AI social companion apps.
- The move also comes as multiple lawsuits are pending against Character.AI from the Social Media Victims Law Center on behalf of families who allege the AI companion app led to their children’s suicide or attempted suicide.
Dive Insight:
Recent research has signaled that some students are developing problematic relationships with AI companion tools. And youth advocates have strongly advised that anyone younger than 18 avoid apps like Character.AI, warning that such apps pose serious mental health risks to students.
Some 42% of students said they or their friends used AI for mental health support, as a friend or companion, or as a way to escape from real life during the 2024-25 school year, according to a survey by the nonprofit Center for Democracy & Technology. Almost 20% of students reported using AI to have a romantic relationship, according to the CDT survey results released in October.
A separate July survey by Common Sense Media also found that 1 in 3 teens said they’ve used AI companions “for social interaction and relationships, including role-playing, romantic interactions, emotional support, friendship, or conversation practice.”
To keep users younger than 18 off its primary chat feature, Character.AI said it would roll out an “age assurance functionality.” In addition, the company said Wednesday that it will launch and fund the AI Safety Lab, an independent nonprofit focused on safety innovations for new AI entertainment tools.
And while teens won’t be able to use the company’s open-ended AI chat, they will still be able to use the app in other ways, such as creating videos, stories and streams with AI characters.
Character.AI said all these changes are happening “in light of the evolving landscape around AI and teens.” The company added that recent news reports and regulators have raised questions about how open-ended AI chat in general might impact teens “even when content controls work perfectly.”
“We want to set a precedent that prioritizes teen safety while still offering young users opportunities to discover, play, and create,” Character.AI said in its Wednesday announcement. “We will continue to collaborate with safety experts, regulators, and other stakeholders to ensure that user safety remains paramount as we develop innovative new features that foster creativity, discovery, and community.”
Youth mental health experts say rampant AI companion use among students adds urgency for schools to develop districtwide AI strategies that include lessons on AI literacy.
National Parents Union President Keri Rodrigues, in a Wednesday statement, called for accountability and laws in this area, including “guardrails with teeth that put children’s safety and mental health first” — and before profit.
“Character.AI’s move to kick kids under 18 off its platform is a good start, but let’s be clear: Silicon Valley doesn’t get a gold star for finally doing the bare minimum,” Rodrigues said. “Parents have been shouting from the rooftops about the dangers of unregulated AI, and this decision shows the tech industry is only now waking up.”