Artificial Intelligence and Predictive Policing: A Roadmap for Research

Link to Report

"In this report, we present the initial findings from a three-year project to investigate the ethical implications of predictive policing and develop ethically sensitive and empirically informed best practices for both those developing these technologies and the police departments using them."

To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada

Link to Executive Summary

Link to Full Report

"This report examines algorithmic technologies that are designed for use in criminal law enforcement systems. Algorithmic policing is an area of technological development that, in theory, is designed to enable law enforcement agencies to either automate surveillance or to draw inferences through the use of mass data processing in the hopes of predicting potential criminal activity....

In order to guide public dialogue and the development of law and policy in Canada, the report focuses on the human rights and constitutional law implications of the use of algorithmic policing technologies by law enforcement authorities."

Big Data and Criminal Justice - What Canadians Should Know

"On its surface, the term ‘big data’ refers both to very large data sets, as well as the tools used to manipulate and analyze them. This concept, however, does not just refer to the harvested information – it also refers to the motivations behind what harvesting that information is supposed to achieve. When data is collected en masse, and algorithms (a series of instructions that tell a computer what to do) cross reference data both within and between datasets, the computational software processing the data identifies patterns within them. It is this notion of “identifying patterns” that serves as the backbone of predictive justice.

Predictive justice uses data on past occurrences or behaviours to make decisions about the future, such as who and where will be policed, how an individual should be sentenced given the risk they pose to others, and when someone should be released from prison....

Unfortunately, there has been a lack of both awareness and scholarship regarding how this technology is being employed across Canadian police departments, justice agencies, and courts....

These predictive technologies are appealing because they claim to make justice a speedier, more egalitarian affair; they take complicated and potentially biased discretionary decisions – such as who to police and who to assess as “higher-risk offenders” – and reduce these decisions to scores, numbers, or dots on a map. In so doing, these technologies can incur greater costs than benefits."

Read on...
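The scoring-and-allocation pattern described above can be illustrated with a minimal, hypothetical sketch (all district names and numbers here are invented, not drawn from any of the cited reports): a naive "hotspot" model trained on historical arrest counts reproduces past enforcement intensity rather than measuring underlying offending.

```python
# Toy sketch of predictive-justice allocation (hypothetical data).
# Two districts are assumed to have the SAME true offence rate, but
# district A was historically patrolled twice as heavily, so it logged
# twice as many arrests.
historical_arrests = {"district_A": 200, "district_B": 100}

def predict_hotspots(arrests, patrol_budget):
    """Allocate patrols proportionally to past arrest counts."""
    total = sum(arrests.values())
    return {d: patrol_budget * n / total for d, n in arrests.items()}

allocation = predict_hotspots(historical_arrests, patrol_budget=30)
# The model sends twice the patrols to district A purely because of past
# enforcement intensity; heavier patrols there generate more recorded
# arrests, which the next training cycle treats as confirmation.
print(allocation)  # {'district_A': 20.0, 'district_B': 10.0}
```

The design point is that the model never observes crime itself, only recorded enforcement activity, which is why biased inputs yield biased "predictions" regardless of how the scoring arithmetic is done.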

Predictive Policing is Tainted by "Dirty Data," Study Finds
"A new study from New York University School of Law and NYU's AI Now Institute concludes that predictive policing systems run the risk of exacerbating discrimination in the criminal justice system if they rely on 'dirty data.'

Law enforcement has come under scrutiny in recent years for practices resulting in disproportionate aggression toward minority suspects, causing some to ask whether technology – specifically, predictive policing software – might diminish discriminatory actions.

However, a new study from New York University School of Law and NYU's AI Now Institute concludes that predictive policing systems, in fact, run the risk of exacerbating discrimination in the criminal justice system if they rely on 'dirty data' – data created from flawed, racially biased, and sometimes unlawful practices.

The researchers illustrate this phenomenon with case study data from Chicago, New Orleans, and Arizona's Maricopa County. Their paper, 'Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice,' is available on SSRN."