In court sentencing, beware data-driven 'risk' tools

A trend in US courts and elsewhere to use analytical 'risk assessment' tools in determining a person's future dangerousness undercuts the notion of individual agency in choosing a moral and lawful life.

The front of the US Supreme Court building in Washington DC, with the words 'equal justice under law' carved in the stone.

Internet users know the strange experience of ads that pop up seemingly tailored just for them. The ads are selected using data on prior Web activity collected by digital giants such as Google. Yet the ads are sometimes just plain wrong about what a user actually wants. There’s a lesson in that for a new trend in the court system, which also relies on big data to mete out justice.

Courts in more than 20 states in the United States as well as in many other countries have begun to use analytic tools to determine the probability of a criminal being dangerous in the future. Congress is also weighing various bills to use computer-driven “risk assessment” in federal court sentencing, beyond the normal assessments by judges and clinical experts.

These actuarial tools promise to reduce the number of prisoners – more than 2 million in the US – by ensuring that “high-risk” criminals receive lengthy sentences while “low-risk” ones are given either short sentences or rehabilitation and treatment.

While these goals are worthy, the tools themselves raise troubling issues. To be sure, data about a person’s past criminal behavior is useful in setting a sentence. But the new tools also try to calculate a person’s risk of recidivism based on a host of factors outside his or her control, such as family upbringing, gender, neighborhood, or even race or ethnicity.

In other words, an individual is no longer an individual but is also viewed as a subset of a group.

The result of such data crunching, as US Attorney General Eric Holder warned last year, can be disparities in sentencing, such as putting more blacks or Hispanics behind bars for longer periods. The accuracy of such data-driven risk assessment also remains in doubt. And the tools violate the constitutional idea that justice is applied to an individual for his or her own criminal conduct, not for membership in a particular class of people.

As the Supreme Court has said, guilt is personal while guilt by association is unjust.

But the worst aspect of such computer-generated analysis of a convict’s social background is that it denies the idea that an individual is capable of making the right choices. A 1952 Supreme Court decision described that capability this way: “It is as universal and persistent in mature systems of law as belief in freedom of the human will and a consequent ability and duty of the normal individual to choose between good and evil.”

Judges have long used the length of a sentence as an incentive for an offender to learn to live within the law. But as law professor Dawinder Sidhu of the University of New Mexico wrote in a 2014 law article, “The incentive is meaningless if it is applied to a factor that the individual has no actual or meaningful ability to change.”

Individuals must retain the ability to reject predictions about their future and embrace moral and lawful behavior.

“The individual may adopt a sense of self that is materially different than that which he or she had at the commission of the crime or at sentencing,” wrote Mr. Sidhu, who has also worked with the US Sentencing Commission. “We should want individuals to beat the odds. It should be our preference for individuals to be productive and our goal to help facilitate the development of the individual such that he or she can be part of mainstream society upon release.”

These tools claim to be “evidence based.” But given their potential to produce unequal justice and their violation of the principle of individual moral agency in the justice system, they should remain suspect. Just as a Google pop-up ad can miss its algorithmic mark, so can big data about an offender’s ability to choose a better life.

[Editor's note: An earlier version of this editorial gave an incorrect number for those incarcerated in the US.]
