
Machine Learning in Crime Prevention and Criminal Justice


The 1987 film RoboCop depicts the dangers of a militaristic police force utilizing artificial intelligence to rid the Detroit streets of crime. The film Minority Report envisions a dystopian future where suspected murderers are arrested before they even commit a crime. The film Captain America: The Winter Soldier revolves around a machine learning algorithm designed to identify all potential threats to Hydra and eliminate those targets. Notice any similarities?

Over the past three decades, fears about the use of machine learning algorithms to predict crime and judge criminals have steadily grown. Unbeknownst to most, algorithms that predict crime and score people’s perceived threat to society are already widely used across America. While we are not living in an Orwellian nightmare yet, major changes must be made to ensure the fairness of the criminal justice system and to prevent any further widespread paranoia about machine learning.

Predicting Crime. A survey of two hundred police agencies across the United States found that seventy percent “planned to implement or increase [the] use of predictive policing technology in the next two to five years¹.” Currently, the New York, Los Angeles, and Chicago Police Departments use predictive policing to allocate their officers more efficiently and maximize their effectiveness. The problem? These predictive crime algorithms almost always direct officers to minority neighborhoods, leading to over-policing. The developers of the most widely used software, PredPol, have rejected the numerous accusations of encoding bias into their algorithm², and have publicly disclosed the three, and only three, factors it takes into consideration: location, time, and crime type³.
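For intuition, here is a minimal sketch of what scoring on those three factors alone could look like. This is not PredPol’s production model, which is proprietary; the grid cells, decay rate, and sample data below are illustrative assumptions only.

```python
from collections import defaultdict
from datetime import datetime
import math

# Illustrative sketch only: score grid cells by exponentially decayed
# counts of past incidents. The point is that location, time, and crime
# type alone suffice to rank neighborhoods for patrol allocation.

DECAY_PER_DAY = 0.05  # assumed decay rate (hypothetical)

def hotspot_scores(incidents, now, crime_type):
    """incidents: list of (lat_cell, lon_cell, timestamp, crime_type)."""
    scores = defaultdict(float)
    for lat_cell, lon_cell, ts, ctype in incidents:
        if ctype != crime_type:
            continue
        age_days = (now - ts).days
        # Recent incidents in a cell raise that cell's score the most.
        scores[(lat_cell, lon_cell)] += math.exp(-DECAY_PER_DAY * age_days)
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Usage: the top-ranked cells are where patrols would be sent.
incidents = [
    (101, 47, datetime(2019, 3, 1), "burglary"),
    (101, 47, datetime(2019, 3, 5), "burglary"),
    (203, 12, datetime(2018, 11, 2), "burglary"),
]
print(hotspot_scores(incidents, datetime(2019, 3, 10), "burglary"))
```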

Despite this seemingly unbiased approach to predictive policing, many believe such technologies only deepen the inequality in our society. Algorithms that disadvantage minorities are quickly met with backlash, regardless of the data’s objectivity. Yet bashing PredPol for its algorithm’s tendency to over-predict crime in impoverished and historically crime-ridden communities is futile; attention should instead be focused on changing society as a whole and taking measures to reduce criminality. After all, PredPol simply uses past crime to predict future crime; reducing crime today will produce the desired decrease in the over-policing of minority neighborhoods in the near future.
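To make that last point concrete: in any decayed-count scheme like the sketch above, a neighborhood that stops generating incident reports sees its score, and therefore its predicted patrol presence, fall off geometrically. The decay rate here is again an assumed illustration, not PredPol’s.

```python
import math

# Assumed decay rate from the earlier sketch; purely illustrative.
DECAY_PER_DAY = 0.05
score_today = 10.0

# A cell with no new incidents decays toward zero: less past crime
# mechanically means less predicted policing.
for day in range(0, 61, 15):
    print(day, round(score_today * math.exp(-DECAY_PER_DAY * day), 2))
# day 0: 10.0 | day 15: 4.72 | day 30: 2.23 | day 45: 1.05 | day 60: 0.5
```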

Predicting One’s Threat to Society. When determining the sentence length for a convicted criminal, an increasing number of states offer judges a computer-generated “risk assessment score” that predicts the criminal’s likelihood of reoffending.

Nearly three years ago, ProPublica released a bombshell exposé on Northpointe’s Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) software, the leading algorithm used to evaluate defendants’ risk scores⁴. Despite Northpointe’s stated intention of providing judges a helpful sentencing aid, ProPublica found COMPAS had a strong racial bias against black individuals:

“We ran a statistical test that isolated the effect of race from criminal history and recidivism, as well as from defendants’ age and gender. Black defendants were still 77 percent more likely to be pegged as at higher risk of committing a future violent crime and 45 percent more likely to be predicted to commit a future crime of any kind.”
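The kind of test ProPublica describes can be approximated with a logistic regression: predict the high-risk label from race plus the control variables, and check whether the race term still carries weight after the controls. The following is a hedged sketch; the file name and column names are assumptions, not ProPublica’s actual schema.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hedged sketch of an "isolate race from the controls" test:
# regress the high-risk label on race while controlling for criminal
# history, recidivism, age, and gender.
df = pd.read_csv("compas_scores.csv")  # hypothetical file

model = smf.logit(
    "high_risk ~ C(race) + priors_count + two_year_recid + age + C(sex)",
    data=df,
).fit()
print(model.summary())

# Exponentiated coefficients are odds ratios; an odds ratio of ~1.77 on
# the race term would correspond to the "77 percent more likely" figure
# quoted above.
print(np.exp(model.params))
```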

Unlike PredPol, Northpointe takes a much wider variety of factors into consideration in its algorithm: income, education, family criminality, substance abuse, and personality traits, for a total of 137 features⁵ used to determine one’s risk score. While the larger number of variables may allow for a more personalized result, it carries an unintended consequence: a greater ability to reinforce racial, gender, and socioeconomic stereotypes. More data allows more differences to be found along these lines, and those differences are then perpetuated into the future. So, should we reinforce these statistical correlations? Should a person be labeled high risk simply because they fit a certain stereotype found within the data? This is where the controversy arises; the promotion of stereotypes generates inequality.
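ProPublica’s headline finding was about error rates: among defendants who did not go on to reoffend, black defendants were far more likely to have been labeled high risk. That disparate-impact check is simple to express; as before, the file and column names below are assumptions.

```python
import pandas as pd

# Hedged sketch of a disparate-impact check: among defendants who did
# NOT reoffend within two years, what fraction were nonetheless labeled
# high risk? That fraction is the false positive rate per group.
df = pd.read_csv("compas_scores.csv")  # hypothetical file

non_reoffenders = df[df["two_year_recid"] == 0]
fpr_by_race = non_reoffenders.groupby("race")["high_risk"].mean()
print(fpr_by_race)
# ProPublica reported this false positive rate was roughly twice as
# high for black defendants as for white defendants.
```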

Despite all the evidence suggesting COMPAS is a perfect breeding ground for prejudice, and despite the disparate racial impact highlighted by ProPublica, the algorithm remains proprietary software, shielded from outside scrutiny as a trade secret. In effect, this eliminates a defendant’s ability to dispute the scientific accuracy of the risk assessment score assigned to them without substantial proof that the results are inaccurate.

Ethics advocates, such as the American Civil Liberties Union, have demanded that risk assessment scores not be used in the court system, but they are losing the battle⁶. Most notably, the Supreme Court refused to hear an appeal regarding the legality of risk assessment scores in the 2017 petition Loomis v. Wisconsin⁷. Such unwillingness to recognize the problem at hand will inevitably lead to more disastrous consequences; algorithmic bias is like a sleeping giant: once it wakes, preventing damage will be exponentially more challenging. The most pertinent question we, and regulators, need to ask ourselves today is: should machine learning algorithms even be used in applications where present-day biases already exist?

To answer this question, we must bear in mind that no societal improvement comes through automation: a model trained on today’s decisions can only reproduce today’s patterns, so stagnation is the only possible outcome.

PredPol and Northpointe have garnered widespread negative media attention (at least in the small world of AI ethics), further painting machine learning as a dangerous, potentially totalitarian technology with an overall negative effect on society. Meanwhile, Hollywood continues to churn out fear-mongering, profit-driven storylines about AI turned evil. All of this distracts from the promise of machine learning: its ability to drastically improve our lives, increase economic output, detect cancer, and eliminate repetitive tasks. Still, there is an important insight to be learned here: we must not quantify every aspect of our lives. PredPol and Northpointe illustrate our compulsion to reduce reality to statistical analysis, and in the case of crime prevention and criminal justice, that compulsion is misguided and strips away the much-needed sympathy in our world.

________________________________________________

1 Huet, Ellen. “Server And Protect: Predictive Policing Firm PredPol Promises To Map Crime Before It Happens.” Forbes, Forbes Magazine, 11 Feb. 2015, www.forbes.com/sites/ellenhuet/2015/02/11/predpol-predictive-policing/#1687fbb44f9b.

2 Chang, Cindy. “LAPD Officials Defend Predictive Policing as Activists Call for Its End.” Los Angeles Times, Los Angeles Times, 24 July 2018, www.latimes.com/local/lanow/la-me-lapd-data-policing-20180724-story.html.

3 “How PredPol Works | Predictive Policing.” PredPol, 2018, www.predpol.com/how-predictive-policing-works/.

4 Angwin, Julia, et al. “Machine Bias.” ProPublica, ProPublica, 23 May 2016, www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.

5 Angwin, Julia. “Sample COMPAS Risk Assessment COMPAS ‘CORE.’” DocumentCloud, Northpointe, 2013, www.documentcloud.org/documents/2702103-Sample-Risk-Assessment-COMPAS-CORE.html.

6 Stanley, Jay. “Eight Problems With Police ‘Threat Scores.’” ACLU, American Civil Liberties Union, 13 Jan. 2016, www.aclu.org/blog/privacy-technology/surveillance-technologies/eight-problems-police-threat-scores.

7 “Loomis v. Wisconsin.” SCOTUSblog, SCOTUS, 2017, www.scotusblog.com/case-files/cases/loomis-v-wisconsin/.

