Technology and Culture: How Predictive Policing Harmfully Profiles Marginalized People Groups

Authors

  • Taryn Bates

Abstract

American law enforcement departments use predictive policing technology to predetermine crime zones within assigned jurisdictions. Predictive policing technology is susceptible to human error, owing both to the directives given by algorithm programmers and to the lack of vetting for bias in data collection processes. A study by Selbst (2017) on potential applications of predictive policing found that police data disproportionately links crime to neighborhoods made up primarily of non-white residents living below poverty income thresholds. Data mining systems work by gathering information from records of human decisions without considering the intent behind those decisions; this creates the possibility of generating or worsening discriminatory outcomes. For programs built on flawed data (Selbst, 2017), erroneous predictions reinforce existing prejudices held by law enforcement. Because past crime data is integrated into predictive policing systems, the results inevitably carry the historical biases embedded in recorded enforcement. This correlation maintains discriminatory policing instead of creating a future of equality for all humankind. If predictive policing technology disproportionately profiles marginalized people groups, then its algorithms are inherently discriminatory and reinforce the implicit bias held by the operators or enforcers of the results the program generates. Therefore, predictive policing algorithms must be corrected and updated to avoid perpetuating harm through the biased association of crime with minority cultures.
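
As a minimal illustration of the feedback loop described in the abstract (not drawn from Selbst's study; all numbers and names below are hypothetical), the following Python sketch assumes two neighborhoods with identical underlying crime rates but a historically skewed arrest record. Because patrols are allocated in proportion to past recorded arrests, the initial skew never washes out, and the record keeps "confirming" the original bias.

```python
# Toy feedback-loop simulation (illustrative only; not the article's method).
# Two neighborhoods share the same true crime rate, but neighborhood 0 starts
# with more recorded arrests because it was historically over-patrolled.
# Each round, the predictive system allocates patrols in proportion to the
# recorded arrests, and new arrests occur only where patrols are present,
# so the historical skew persists rather than correcting itself.

import random

random.seed(0)

TRUE_CRIME_RATE = [0.10, 0.10]      # identical underlying crime rates
recorded_arrests = [60, 40]         # biased historical record (over-policing of area 0)
PATROLS_PER_ROUND = 100
ROUNDS = 20

for _ in range(ROUNDS):
    total = sum(recorded_arrests)
    # Allocate patrols proportionally to past arrests -- the "prediction".
    patrols = [round(PATROLS_PER_ROUND * a / total) for a in recorded_arrests]
    # Arrests happen only where officers patrol, at the true crime rate.
    new_arrests = [
        sum(random.random() < TRUE_CRIME_RATE[i] for _ in range(patrols[i]))
        for i in range(2)
    ]
    recorded_arrests = [recorded_arrests[i] + new_arrests[i] for i in range(2)]

share = recorded_arrests[0] / sum(recorded_arrests)
print(f"Final arrest share attributed to neighborhood 0: {share:.0%}")
# Despite identical true crime rates, neighborhood 0's share stays near its
# initial 60% because the biased record keeps directing patrols there.
```

The point of the sketch is that nothing in the loop ever revisits the intent or context behind the historical records; the system simply reproduces the pattern it was given, which is the mechanism of harm the abstract describes.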

Published

2024-06-01