Law enforcement has embraced artificial intelligence tech to make officers' lives a little easier. Yet the same tech is already turning into a considerable headache, both for police operations and for members of the communities they serve.
From kids sending their parents AI-manipulated pictures of themselves welcoming homeless men into their homes, triggering 911 calls, to cops arresting the wrong people based on the suspicions of dubious AI tools, the tech isn’t exactly fostering peace and order.
Now, police in Oregon are warning that AI apps like CrimeRadar are generating misinformation based on hallucinated police radio chatter, as Central Oregon Daily News reports. CrimeRadar is designed to listen to police frequencies and turn incidents into AI-written blog posts, an approach that is already proving disastrous.