Content warning: this story includes discussion of self-harm and suicide. If you are in crisis, please call, text or chat with the Suicide and Crisis Lifeline at 988, or contact the Crisis Text Line by texting TALK to 741741.
Character.AI, the chatbot platform accused in several ongoing lawsuits of driving teens to self-harm and suicide, says it will move to block kids under 18 from using its services.
The company announced the sweeping policy change in a blog post today, citing the “evolving landscape around AI and teens” as its reason for the shift. As for what this “evolving landscape” actually looks like, the company says it has “seen recent news reports raising questions” and has “received questions from regulators” regarding the “content teens may encounter when chatting with AI.”