A new artificial intelligence system developed in the UK to predict gun and knife violence contains serious flaws that render the technology unusable, according to researchers. Local police have admitted that a coding error led to a large drop in the system's accuracy, and the tool was ultimately shelved after expert reviews found it had serious ethical problems. The system, known as Most Serious Violence (MSV), is part of the UK's National Data Analytics Solution project.
The project has received at least $13 million in funding over the past two years to build machine learning systems that help police address and prevent crime across England and Wales. However, experts' rejection of the system has led police forces to decline to use the prediction tool in its current form. According to the agencies involved, the technology was never used in policing operations and did not progress beyond the trial stages. The tool's potential for bias against minority groups was also questioned early in its development.
Read More: A British AI Tool to Predict Violent Crime Is Too Flawed to Use