The UK’s Court of Appeal recently ruled that a law enforcement use of facial recognition technology was unlawful, citing breaches of privacy rights and data protection law, as well as biases within the technology that may discriminate on the grounds of race or sex. According to industry experts, the technology poses a distinct risk in crime fighting: members of minority groups are more likely to be flagged as threats because of racial biases present in the data used to train the machine learning systems.
The Court of Appeal ruled that the automatic facial recognition technology violated UK data protection and equality laws and interfered with citizens’ privacy rights. South Wales Police had reportedly failed to verify that the software in use does not contain bias on the basis of race or sex. Critics describe the technology in question as a “dystopian surveillance tool” that is biased against people of color and violates UK citizens’ rights.
Read More: Police Facial Recognition Use Unlawful—U.K. Court Of Appeal Makes Landmark Ruling