Busted by Big Data: Algorithms Could Make Cities Safer - But They Can't Protect Us From Policing's Worst Instincts 
"By combining huge troves of data with highly sophisticated algorithms, predictive policing appears to hold out the science-fiction promise that technology could, one day, spit out 100 percent accurate prophecies concerning the location of future crimes. The latest iteration of these analytics can't ID a killer-to-be, but it can offer insight into which areas are likely sites of future crime by drawing on information in everything from historical records to live social-media posts.

The technology, however, has raised tough questions about whether hidden biases in these systems will lead to even more over-policing of racialized and lower-income communities. The result can be a feedback loop: the algorithm recommends a heightened police presence in response to elevated arrest rates that are themselves a product of heightened police presence.
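The feedback loop can be made concrete with a toy simulation. Everything below is invented for illustration, not drawn from any deployed system: two hypothetical neighborhoods, an assumed identical underlying crime rate, and a deliberately naive "hotspot" rule that sends all patrols wherever recorded arrests are highest. The point is only that allocating patrols by arrest data can amplify a small initial disparity even when underlying crime is the same everywhere.

```python
# A deliberately simple sketch of the feedback loop described above.
# Two hypothetical neighborhoods have the SAME underlying crime rate;
# the only difference is a small gap in the historical arrest record.
# A naive "hotspot" allocator that sends patrols wherever arrests are
# highest then manufactures the very disparity it claims to detect.

TRUE_CRIME_RATE = 0.10        # identical in both neighborhoods (assumed)
arrests = [12.0, 10.0]        # slightly uneven historical record

for step in range(10):
    # Send all 100 patrol-hours to the current "hotspot".
    hotspot = 0 if arrests[0] >= arrests[1] else 1
    # Recorded arrests scale with patrol presence times true crime.
    arrests[hotspot] += 100 * TRUE_CRIME_RATE

share = arrests[0] / sum(arrests)
print(f"Neighborhood A's share of recorded arrests: {share:.2f}")
# Despite identical crime rates, A's share climbs from 0.55 to 0.92.
```

The winner-take-all allocation rule is a caricature, but it mirrors the logic of hotspot policing: resources concentrate where the data say crime is, and the data say crime is where the resources were.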

Andrew Ferguson, who teaches law at the University of the District of Columbia and is the author of The Rise of Big Data Policing, goes further. He says that current predictive systems draw on social media and other deep wells of personal information to predict whether certain offenders will commit future crimes, an Orwellian scenario. Canadian governments and civilian oversight bodies, however, have done little to establish clear policies differentiating appropriate from inappropriate uses of these technologies. Little wonder, then, that critics are increasingly concerned that police departments equipped with big-data systems could use them to pre-emptively target members of the public. Can we really trust crime fighting to an algorithm?"
