2019-2020 Emerging Fellow Anthony Potts outlines general guidelines for minimizing bias and increasing the fairness of algorithmic risk assessment tools in justice systems that are considering implementing them.


Algorithmic risk assessment tools that help judges estimate the risk of recidivism and set bail have increasingly been adopted as a solution to a bloated, inefficient, and unfair pretrial system. Despite their potential advantages, these tools can perpetuate the existing biases that have historically denied many marginalized people a fair opportunity to receive justice. To minimize the bias that will inevitably result and to increase the fairness of these tools, justice systems considering implementing them should follow three general guidelines. First, the structure of risk assessment tools should be made public. Second, the tools should be used only as an aid to judicial decision making. And third, they should be applied only to high-level offenses.