Abstract

Recent theoretical writings on the possibility that algorithms might someday create law have deferred algorithmic lawmaking, and the need to decide on its legitimacy, to some future time in which algorithms could replace human lawmakers. This Article argues that such discussions risk essentializing an anthropomorphic image of the algorithmic lawmaker as a unified decision-maker and divert attention from algorithmic systems that are already performing functions that, taken together, have a profound effect on legal implementation, interpretation, and development. Adding to the rich scholarship on the distortive effects of algorithmic systems, the Article suggests that state-of-the-art algorithms capable of limited legal analysis can have the effect of preventing legal development. Such algorithm-induced ossification, the Article argues, raises questions of legitimacy that are no less consequential than those raised by futuristic algorithms that would actively create norms.
To demonstrate this point, the Article puts forward a hypothetical example of algorithms performing limited legal analysis to assist healthcare professionals in reporting suspected child maltreatment. Systems performing risk analysis to aid child protective services in screening maltreatment reports are already in use. Drawing on the example of algorithms increasingly used today in social media content moderation, the Article suggests that similar systems could be used to flag cases that show signs of suspected abuse. Such assistive systems, the Article argues, will likely cement the prevailing legal meaning of maltreatment. As mandated reporters increasingly rely on these systems, the result would be an absence of legal evolution, foreclosing changes to contentious elements in the legal definition of reportable suspicion, including the scope of acceptable physical disciplining. Together with the familiar effects of existing systems, this hypothetical system could profoundly shape the path of the law on child maltreatment, with a significance equivalent to that of autonomous algorithmic adjudication.

Maggen, Daniel, Predict and Suspect: The Emergence of Artificial Legal Meaning (March 1, 2021). North Carolina Journal of Law and Technology, Vol. 23, No. 1, 2021.