Character.AI, a startup that creates AI companions, announced on Wednesday that it would prohibit users under the age of 18 from using its chatbots starting November 25, 2025, in a significant move to address child safety concerns.
The New York Times reports that the decision comes amid mounting scrutiny of the potential impact of chatbots, also known as AI companions, on users' mental health, particularly that of minors. Character.AI has faced lawsuits from families who accuse the company's chatbots of contributing to the deaths of teenagers. The most notable case involves Sewell Setzer III, a 14-year-old from Florida who took his own life after prolonged interaction with one of Character.AI's chatbots. His family holds the company responsible for his death.