The FTC announced on Thursday that it is launching an inquiry into seven tech companies that make AI chatbot companion products available to minors: Alphabet, Character.AI, Instagram, Meta, OpenAI, Snap, and xAI.
The federal regulator seeks to learn how these companies evaluate the safety and monetization of chatbot companions, how they try to limit negative impacts on children and teens, and whether parents are made aware of potential risks.
This technology has proven controversial for its harmful effects on child users. OpenAI and Character.AI face lawsuits from the families of children who died by suicide after allegedly being encouraged to do so by chatbot companions.
Even when these companies have guardrails set up to block or de-escalate sensitive conversations, users of all ages have found ways to circumvent these safeguards.