The UK’s data protection watchdog has asked the Home Office for “urgent clarity” over racial bias in police facial recognition technology before considering its next steps.
The Home Office has admitted that the technology was “more likely to incorrectly include some demographic groups in its search results”, after testing by the National Physical Laboratory (NPL) of its application within the police national database.
The report revealed that the technology, which is intended to be used to catch serious offenders, is more likely to incorrectly match black and Asian people than their white counterparts.
In a statement responding to the report, Emily Keaney, the deputy commissioner for the Information Commissioner’s Office, said the ICO had asked the Home Office “for urgent clarity” on the findings before considering its next steps.

The Guardian
