The Epistemic Injustice of Algorithmic Family Policing

UC Irvine Law Review

Abstract

The child welfare system is the system through which U.S. state authorities identify and intervene in families seen as posing a risk of abuse or neglect to their children. In recent years, impacted families, advocates, and scholars have joined in a growing chorus demonstrating how this system, which many now refer to as the “family policing” system, destroys families and communities rather than supporting them. Many now call for the system’s abolition, arguing that, while it masquerades as one of care and benevolence, it is in fact an integral part of the carceral web constituted by criminal policing, prisons, jails, and other punitive and oppressive institutions. Far from being designed to support families, it is instead a system of subordination and control.

While this movement has been growing, the family policing system, like its criminal counterpart, has been turning to risk-prediction algorithms to help it with its work. In prior scholarship, I documented the development of these predictive tools and highlighted a number of preliminary risks they pose. This piece brings a new lens to the issue, arguing that a key mechanism by which the family policing system accomplishes its subordinating design is the regulation of knowledge production and sharing: the system selectively and systematically discredits the knowledge of the parents it targets. Borrowing a concept from political philosophy, this piece identifies this harm as “epistemic injustice,” the distinct form of injustice that occurs when a person or group is harmed in its capacity as a holder of knowledge. By perpetrating epistemic injustice, the system acts to maintain the social order. As the system turns to algorithms to rank and categorize its targets, it reinforces old ways of doing business and creates new mechanisms by which to assign and police epistemic worth.

This piece explores the ways that family policing’s turn to “big data” risk-prediction algorithms scales up and expands the system’s already pervasive epistemic injustice.
