Political Consequences

The interactions between the relevant social groups and predictive policing technology give rise to certain political consequences. Human bias can still find its way into the predictive algorithms, confidence in law enforcement is put to the test, and opportunities arise for officers to abuse their power. The relevant social groups must address these problems before closure can occur.

Six Provocations for Big Data

In their report “Six Provocations for Big Data,” Professor Kate Crawford and Dr. Danah Boyd discuss the problems that accompany Big Data, one of which is how data can be misleading or subject to bias. For example, in some police departments, heat lists are composed not only of people who have prior arrests, but also of those who are connected to them in some way. The heat list then becomes one of the many sources of data fed into the predictive algorithms, a method with the potential to perpetuate racial biases in the real world. Innocent people could be red-flagged by law enforcement, their pictures and information posted across police stations, while they remain oblivious to the attention now being brought upon their everyday lives. Their privacy could be infringed upon without them even knowing.
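To make the concern concrete, here is a minimal sketch of how a heat list might spread across recorded social connections. Everything in it is an illustrative assumption: the made-up names, the `arrest_records` and `known_associates` structures, the 0.5 weighting, and the threshold are stand-ins, not a description of any real department's system.

```python
# Hypothetical illustration of heat-list expansion through recorded connections.
# All names, numbers, and weights are made up for the example.

arrest_records = {
    "alice": 2,   # prior arrests
    "bob": 1,
    "carol": 0,   # no arrests
    "dave": 0,    # no arrests
}

# Who the records say is "connected" to whom (associates, co-arrests, etc.).
known_associates = {
    "alice": ["carol", "bob"],
    "bob": ["dave"],
}

def build_heat_list(threshold=1.0):
    """Score each person by their own arrests plus half the arrests of
    anyone they are recorded as connected to, then flag everyone at or
    above the threshold."""
    scores = {person: float(arrests) for person, arrests in arrest_records.items()}
    for person, associates in known_associates.items():
        for associate in associates:
            # A connection transfers part of a person's arrest count to
            # their associates, so bias in the arrest data spreads outward.
            scores[associate] = scores.get(associate, 0.0) + 0.5 * arrest_records[person]
    return {name: score for name, score in scores.items() if score >= threshold}

print(build_heat_list())
# {'alice': 2.0, 'bob': 2.0, 'carol': 1.0}
```

In this toy example, carol is flagged despite having no arrests of her own, simply because the records connect her to alice, and whatever bias shaped the underlying arrest data is passed along to her.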

Potential for Harassment

Source: https://www.youtube.com/watch?v=zHJu2PP-1Ck

Is all this surveillance really warranted, or is it just intrusive? For example, the Chicago PD proactively visits people on its heat list, advises them to stay out of trouble, and informs them that their activity is being monitored. But because the heat lists are generated by predictive algorithms that draw on potentially biased data, the lists themselves could encode racial bias, and this practice of going door to door would then be highly problematic. Police must walk the fine line between efficiently reducing crime rates and harassing the citizens they serve.

Testing Relationships With Law Enforcement

John Eck, a professor of criminology at the University of Cincinnati, believes that heat lists are not the best way to use predictive policing. He says the practice “erodes [people’s] limited confidence in the police, and undermines something fundamental to our ideals of democracy” (The Guardian). Constant monitoring of citizen activity, under the pretense of stopping crimes before they happen, can cause unnecessary paranoia and lead to mistrust of the police. Predictive policing aside, the relationship between law enforcement officers and citizens in this country is already shaky given recent events in Baltimore and Ferguson. Tensions are high, and door-to-door harassment can only heighten them. To use the technology effectively, law enforcement will need to integrate predictive policing into its work in a way that respects people’s civil rights.

Given all of the political consequences of predictive policing, how can we think about the future of the technology? Click here to jump to the last section of the website.
