Over the past several years, increased awareness of racial inequity in policing, combined with increased scrutiny of police technologies, has sparked concerns that new technologies may aggravate inequity in policing. To help address these concerns, some advocates and scholars have proposed requiring police agencies to seek and obtain legislative approval before adopting a new technology, or requiring the completion of “algorithmic impact assessments” to evaluate new tools.
Before policymakers, police agencies, or scholars can evaluate whether and how particular technologies may aggravate existing inequities, however, the problem must be more clearly defined. Some scholars have explored inequity in depth as it relates to specific police technologies. But to date, none has provided an explanation of how police technology aggravates inequity that can be applied across all technologies—including future technologies we have not yet encountered.
This Article fills that gap. It proposes a new taxonomy that parses the ways in which police technology may aggravate inequity into five distinct problems: police technology may (1) replicate inequity in policing, (2) mask inequity in policing, (3) transfer inequity from elsewhere to policing, (4) exacerbate inequitable policing harms, and/or (5) compromise oversight of inequity in policing.
Naming and defining these problems will help police agencies, policymakers, and scholars alike analyze proposed new police technologies through an equity lens and craft policies that respond appropriately. This framework should be built into evaluations of police tools performed in accordance with Community Control Over Police Surveillance (“CCOPS”) ordinances being passed in a growing number of cities. To assist with these practical applications of the taxonomy, this Article also offers a model equity impact assessment for proposed police technologies, and explains why the time is ripe for introduction of such an assessment. Finally, this Article explains how the proposed taxonomy and impact assessment tool can be used to evaluate new technologies through an equity lens in contexts beyond the criminal legal system. As policymakers consider requiring algorithmic impact assessments in other domains, they can draw on the framework provided in this Article for one possible model.
* Associate Professor of Law, Georgetown University Law Center, and Director of the Communications & Technology Law Clinic. The author is grateful to Kevin Bankston, Julie Cohen, Andrew Guthrie Ferguson, Barry Friedman, Shon Hopwood, Shankar Narayan, Paul Ohm, Rashida Richardson, Andrew Selbst, Peter Swire, Harlan Yu, and David Vladeck for their feedback and guidance on this Article, as well as to all those who attended a faculty workshop at Georgetown and provided feedback on a draft. Special thanks also to Alvaro Bedoya and Angela Campbell for invaluable support, and to the staff of the Center on Privacy & Technology for their engagement on many of the topics discussed in this Article, as well as their patience as I produced this draft.