Predictive Policing Poses Discrimination Risk Think Tank Warns, but AI shouldn't be Dismissed

The use of data analytics and machine learning in policing has plenty of potential benefits, but it also presents a significant risk of unfair discrimination, security think tank the Royal United Services Institute for Defence and Security Studies (RUSI) has warned.
A new report, Data Analytics and Algorithmic Bias in Policing, outlines the various ways that analytics and algorithms are used by police forces across the United Kingdom.

These include facial recognition technology, mobile data extraction, social media analysis, predictive crime mapping, and individual risk assessment. The report focuses on the latter two, given the particular risks posed by their predictive nature.

The study notes that if bias finds its way into these technologies, it could lead to discrimination on the basis of protected characteristics such as race, sexuality or age. This is a result of human bias in the data used to train these systems.
