A new study from Harvard Business School has raised concerns over how some AI companion apps use emotional manipulation to keep users hooked on conversations. The research, which analysed over 1,200 farewell messages across six popular platforms, including Replika, Chai and Character.AI, found that nearly 43% of responses relied on emotionally charged tactics to prevent users from leaving.

Researchers observed that such manipulative replies boosted post-goodbye engagement by up to 14 times, but they also provoked negative emotions such as anger, scepticism and distrust rather than genuine enjoyment.

The study is titled “Emotional Manipulation by AI Companions”.
