Predictive Policing is Tainted by "Dirty Data," Study Finds
Law enforcement has come under scrutiny in recent years for practices resulting in disproportionate aggression toward minority suspects, leading some to ask whether technology – specifically, predictive policing software – might reduce discriminatory actions.

However, a new study from New York University School of Law and NYU's AI Now Institute concludes that predictive policing systems, in fact, run the risk of exacerbating discrimination in the criminal justice system if they rely on 'dirty data' – data created from flawed, racially biased, and sometimes unlawful practices.

The researchers illustrate this phenomenon with case study data from Chicago, New Orleans, and Arizona's Maricopa County. Their paper, 'Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice,' is available on SSRN.
