Commentary

Algorithms and sentencing: What does due process require?

There are significant potential benefits to using data-driven risk assessments in criminal sentencing. For example, risk assessments have rightly been endorsed as a mechanism to enable courts to reduce or waive prison sentences for offenders who are very unlikely to reoffend. Multiple states have recently enacted laws requiring the use of risk assessment instruments. And in 2017, the American Law Institute, a highly respected organization that has worked for many decades to “clarify, modernize, and otherwise improve the law,” approved a proposed final draft of the “Model Penal Code: Sentencing.” The document specifically recognizes the value of evidence-based sentencing with input from actuarial instruments that “estimate the relative risks that individual offenders pose to public safety through their future criminal conduct.”

However, along with benefits, the growing use of algorithm-based risk assessment tools is raising important due process concerns. Due process is a core constitutional right guaranteed by the Fifth and Fourteenth Amendments, both of which protect people from being deprived of “life, liberty, or property, without due process of law.” A key subcategory of due process is procedural due process, which aims to ensure fairness in legal proceedings that place life, liberty, or property at risk.

When algorithm-based risk assessment tools are used in criminal proceedings, due process issues can arise with respect to offenders’ rights to challenge the accuracy and relevance of information used at sentencing. We highlight two such issues. The first relates to an offender’s right to information regarding the algorithm used to compute risk scores, and the second relates to an offender’s right to know what those scores are.
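To make the stakes concrete, consider a deliberately simplified sketch of how an actuarial instrument might turn offender attributes into a risk score. Everything here is invented for illustration: the features, weights, and logistic form are hypothetical, and COMPAS’s actual model is proprietary and has not been disclosed.

```python
import math

# Hypothetical feature weights -- NOT real parameters from any actual
# risk assessment instrument, which keep such values confidential.
WEIGHTS = {
    "prior_arrests": 0.15,
    "age_at_first_offense": -0.04,
    "current_charge_severity": 0.30,
}
INTERCEPT = -1.2  # hypothetical baseline log-odds


def risk_score(features: dict) -> float:
    """Map offender attributes to an estimated probability of reoffending."""
    log_odds = INTERCEPT + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-log_odds))  # logistic function


# A defendant who cannot inspect WEIGHTS or INTERCEPT has no way to
# contest how any single input influenced the final score.
print(risk_score({"prior_arrests": 3,
                  "age_at_first_offense": 19,
                  "current_charge_severity": 2}))
```

Even in this toy version, the point is visible: without access to the weights, a defendant can see the output but cannot evaluate how it was produced.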

Concerns With Proprietary Risk Assessments

In May 2013, Eric Loomis pled guilty in the Circuit Court for La Crosse County, Wisconsin to the charges of attempting to flee a traffic officer and operating a motor vehicle without the owner’s consent. In advance of Loomis’ August 2013 sentencing hearing, data about him was entered into a risk assessment tool known as COMPAS (Correctional Offender Management Profiling for Alternative Sanctions). While the COMPAS algorithm used to produce risk scores is proprietary, the output is not. Loomis’ COMPAS report indicated a high risk of recidivism.

In the August 2013 hearing, Loomis was sentenced to a multiyear prison term. Referencing COMPAS in his decision, the judge said:

“You’re identified, through the COMPAS assessment, as an individual who is at high risk to the community. In terms of weighing the various factors, I’m ruling out probation because of the seriousness of the crime and because your history, your history on supervision, and the risk assessment tools that have been utilized, suggest that you’re extremely high risk to re-offend.”

After unsuccessfully seeking post-conviction relief at the county court, Loomis appealed to the Wisconsin Supreme Court, arguing that “reliance on COMPAS is a violation of” his “due process rights because the proprietary nature of COMPAS prevents a defendant from challenging the scientific validity of the assessment.”

In 2016, the Wisconsin Supreme Court ruled against Loomis, finding that “if used properly with an awareness of the limitations and cautions . . . consideration of a COMPAS risk assessment at sentencing does not violate a defendant’s right to due process.” And, the court wrote, while the judge at the sentencing hearing had “mentioned the COMPAS risk assessment, it was not determinative in deciding whether Loomis should be incarcerated, the severity of the sentence, or whether he could be supervised safely and effectively in the community.” Loomis then appealed to the U.S. Supreme Court, which in June 2017 declined to hear his case.

When a proprietary algorithm is used to produce a risk assessment, the due process question should turn not on whether the risk assessment was—to use the term in the Wisconsin Supreme Court ruling—“determinative,” but on whether and how it was used. For Loomis, the COMPAS output was purportedly used only to “reinforce” the “assessment of the other factors” considered. The Wisconsin Supreme Court stated that the sentencing court “would have imposed the exact same sentence without it. Accordingly, we determine that the circuit court’s consideration of COMPAS in this case did not violate Loomis’s due process rights.”

This logic leads to a troubling paradox. On the one hand, if the use of a proprietary risk assessment tool at sentencing is only appropriate when the same sentencing decision would be reached without it, this suggests that the risk assessment plays absolutely no role in probation or sentencing decisions. If that is the case, then why use it at all? If, on the other hand, it may have a potential impact—despite the Wisconsin court’s assertion to the contrary—then the due process question can’t be pushed aside.

The Right Of Offenders To Know Their Risk Scores

Another important due process issue concerns what information offenders are given—or not—about their risk scores. In a recent case in Kansas, John Walls pled no contest to a criminal threat charge. He was then evaluated using the LSI-R (Level of Service Inventory-Revised) risk assessment tool. When he asked to see the results, he was given access only to a cover page summarizing his general scores; his request to see the specific questions and answers, and the scores associated with those questions, was denied.
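For readers unfamiliar with scored questionnaires, the hypothetical sketch below shows how item-level answers might sum to a total that is then bucketed into a risk band. The items, point values, and thresholds are invented for illustration; they are not the actual LSI-R content, which is proprietary.

```python
# Invented items and point values for illustration only.
ITEM_POINTS = {
    "any_prior_convictions": 1,
    "currently_unemployed": 1,
    "unstable_housing": 1,
}

# Minimum total score required for each risk band, in ascending order.
RISK_BANDS = [(0, "low"), (2, "moderate"), (3, "high")]


def assess(responses: dict) -> tuple:
    """Return (total score, risk band) from yes/no item responses."""
    total = sum(ITEM_POINTS[item] for item, answer in responses.items() if answer)
    band = "low"
    for threshold, label in RISK_BANDS:
        if total >= threshold:
            band = label
    return total, band


# Disclosing only the total (the "cover page") hides which item answers,
# possibly recorded in error, drove the result.
print(assess({"any_prior_convictions": True,
              "currently_unemployed": False,
              "unstable_housing": True}))
```

The structure explains why item-level access matters: a single mis-recorded answer can push a total across a band threshold, yet an offender shown only the summary score has nothing specific to dispute.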

After he was sentenced by a district court to a highly supervised form of probation generally used for moderate- or higher-risk offenders, he challenged his sentence before the Kansas Court of Appeals, arguing that the refusal to disclose the details of his LSI-R assessment violated his right to due process. The appeals court ruled in favor of Walls, noting that denying Walls access to his complete LSI-R assessment made it impossible for him to “challenge the accuracy of the information” used in “determining the conditions of his probation.” The original sentence was vacated, and the case was remanded for resentencing.

Policy Issues

The U.S. Department of Justice has acknowledged that “a sentencing court’s use of actuarial risk assessments raises novel constitutional questions.” And the questions aren’t only constitutional—there are substantial policy and technology issues as well. As the two examples above illustrate, lack of transparency can arise in relation both to how risk scores are computed and to whether an offender is able to access them. An additional complicating factor is trade secret rights, which companies that make proprietary risk assessment tools will invoke in arguing that the details of their algorithms can’t be disclosed.

Questions that will arise with increasing frequency in the coming years include: What level of detail about a risk assessment algorithm and its output does an offender have the right to access? Are new laws needed to facilitate that access? How should trade secret rights of companies that make risk assessment tools be addressed? And in the future, as artificial intelligence-based risk assessments become common, how will the dynamic nature of AI algorithms further complicate these questions?

As with so many issues at the intersection of law, policy, and technology, there are no easy answers. But a foundational assumption in the dialogue will need to be that the right to due process can’t become collateral damage in the adoption of increasingly sophisticated algorithmic risk assessment technologies.