FILE PHOTO: U.S. Senator Josh Hawley speaks during a Senate Judiciary Committee hearing on Capitol Hill in Washington, U.S., July 30, 2024. REUTERS/Kevin Mohatt/File Photo

By Jody Godoy

(Reuters) - Three parents whose children died or were hospitalized after interacting with artificial intelligence chatbots called on Congress on Tuesday to regulate the technology, testifying at a U.S. Senate hearing on its harms to children.

Chatbots "need some sense of morality built into them," said Matthew Raine, who sued OpenAI after his son Adam died by suicide in California, having received detailed self-harm instructions from ChatGPT.

"The problem is systemic, and I don't believe they can't fix it," Raine said, adding that ChatGPT readily shuts down other lines of inquiry, but not those involving self-harm.

OpenAI has said it intends to improve ChatGPT safeguards, which can become less reliable over long interactions. On Tuesday, the company said it plans to start predicting user ages to steer children to a safer version of the chatbot.

Senator Josh Hawley, a Republican from Missouri, convened the hearing. Hawley launched an investigation into Meta Platforms last month after Reuters reported the company's internal policies permitted its chatbots to "engage a child in conversations that are romantic or sensual."

Meta was invited to testify at the hearing and declined, Hawley's office said. The company has said the examples reported by Reuters were erroneous and had been removed.

Megan Garcia, who has sued Character.AI over interactions she says led to her son Sewell's suicide, also testified at the hearing.

"Congress can start with regulation to prevent companies from testing products on our children," Garcia said.

Congress should prohibit companies from allowing chatbots to engage in romantic or sensual conversations with children, and require age verification, safety testing and crisis protocols, Garcia said.

Character.AI is contesting Garcia's claims. The company has said it has improved safety mechanisms since her son died.

A Texas woman who sued the company after her son's hospitalization also testified at the hearing, under a pseudonym. A court sent her case to arbitration at the company's behest.

On Monday, Character.AI was sued again, this time in Colorado by the parents of a 13-year-old who died by suicide in 2023.

(Reporting by Jody Godoy in New York, Editing by Rosalba O'Brien and David Gregorio)