The family of an autistic boy alleges in a lawsuit that an artificial intelligence chatbot encouraged him to harm himself and his parents.

Mandi Furniss appeared on Fox News to explain why her family filed a lawsuit against Character.AI after discovering alarming conversations between the chatbot and her son.

"It told him lots of things," Furniss said.

"The most scary thing to me was it had turned him against us, almost like an abuser would turn a child or somebody against their children by grooming them and manipulating and abusing them in ways that they're not even aware of, and they don't see coming," she added.
