Artificial intelligence plays budding role in courtroom bail decisions

Computer algorithms are now helping decide the near-term future for defendants in city and state courtrooms around the country. Cash bail has come under fire for exacerbating racial disparities, and some see computer algorithms as a solution.

Dake Kang/AP/File
Judge Jimmy Jackson Jr. speaks on the first day that risk-assessment software is used at the Cleveland Municipal Court on Aug. 30, 2017. The technology seeks to correct for implicit bias in courtroom bail decisions.

The centuries-old process of releasing defendants on bail, long the province of judicial discretion, is getting a major assist ... courtesy of artificial intelligence.

In late August, Hercules Shepherd Jr. walked up to the stand in a Cleveland courtroom, dressed in an orange jumpsuit. Two nights earlier, an officer had arrested him at a traffic stop with a small bag of cocaine, and he was about to be arraigned.

Not long ago, the presiding judge would have decided Mr. Shepherd's near-term future based on a reading of court files and his own intuition. But in Cleveland and a growing number of other local and state courts, judges are now guided by computer algorithms before ruling whether criminal defendants can return to everyday life, or have to stay locked up awaiting trial.

Cash bail, which is designed to ensure that people charged with crimes turn up for trial, has been part of the United States court system for centuries. But it has drawn fire in recent years for keeping poorer defendants in jail while letting the wealthier go free. Studies have also shown it widens racial disparities in pretrial incarceration.

A bipartisan bail reform movement has found an alternative to cash bail: AI algorithms that can scour through large sets of courthouse data to search for associations and predict which people are most likely to flee or commit another crime.

Experts say the use of these risk assessments may be the biggest shift in courtroom decision-making since American judges began accepting social science and other expert evidence more than a century ago. Christopher Griffin, a research director at Harvard Law School's Access to Justice Lab, calls the new digital tools "the next step in that revolution."

Critics, however, worry that such algorithms could end up supplanting judges' own judgment, and might even perpetuate biases in ostensibly neutral form.

States such as New Jersey, Arizona, Kentucky, and Alaska have adopted these tools. Defendants who receive low scores are recommended for release under court supervision.

Among other things, such algorithms aim to reduce biased rulings that could be influenced by a defendant's race, gender, or clothing – or maybe just how cranky a judge might be feeling after missing breakfast.

The AI system used in New Jersey, developed by the Houston-based Laura and John Arnold Foundation, uses nine risk factors to evaluate a defendant, including age and past criminal convictions. But it excludes race, gender, employment history, and where a person lives.

It also excludes a history of arrests, which can stack up against people more likely to encounter police – even if they're not found to have done anything wrong.
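As a rough illustration of how such a tool works mechanically, a points-based risk score can be sketched in a few lines of code. Everything below is a hypothetical assumption made for illustration — the factor names, weights, and the mapping onto a small scale are invented, not the Arnold Foundation's actual instrument:

```python
# Hypothetical sketch of a points-based pretrial risk score. Factor names
# and weights here are illustrative assumptions only. The mechanism shown:
# a handful of history-based factors each add points, and the raw total is
# mapped onto a small scale (1 to 6 here) that a judge sees at arraignment.

def raw_points(defendant):
    """Sum weighted risk factors (names and weights are made up)."""
    pts = 0
    if defendant["age_at_arrest"] < 23:
        pts += 2
    if defendant["prior_felony_conviction"]:
        pts += 2
    if defendant["prior_misdemeanor_conviction"]:
        pts += 1
    if defendant["prior_failure_to_appear"]:
        pts += 2
    return pts

def scale_score(pts, max_pts=7):
    """Map raw points onto a 1-to-6 scale for presentation in court."""
    return 1 + round(5 * pts / max_pts)

# A young, first-time defendant scores near the bottom of the scale.
first_timer = {
    "age_at_arrest": 18,
    "prior_felony_conviction": False,
    "prior_misdemeanor_conviction": False,
    "prior_failure_to_appear": False,
}
print(scale_score(raw_points(first_timer)))  # prints 2 -- a low score
```

Note that race, gender, and arrest history simply never appear as inputs, which is how a design like the one described above excludes them; the debate among critics is whether the factors that do appear act as proxies for them.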

Other efforts to automate judicial decisions have come under fire – in particular, a proprietary commercial system called Compas that's been used to help determine prison sentences for convicted criminals. An investigative report by ProPublica found that Compas was falsely flagging black defendants as likely future criminals almost twice as frequently as white defendants.

Other experts have questioned those findings, and the US Supreme Court last year declined to take up the case of a Wisconsin man who argued that the use of gender as a factor in the Compas assessment violated his rights.

The Arnold Foundation notes that its algorithm is straightforward and open to inspection by anyone – although the underlying data it relies on is not.

Advocates of the AI approach argue that the people in robes are still in charge. "This is not something where you put in a ticket, push a button, and it tells you what bail to give somebody," said Judge Ronald Adrine, who presides over the Cleveland Municipal Court. The algorithmic score is just one of several factors for judges to consider, he said.

But others worry the algorithms will make judging more rote over time. Research has shown that people tend to follow specific advisory guidelines in lieu of their own judgment, said Bernard Harcourt, a law and political science professor at Columbia.

"It's naive to think people are simply going to not rely on them," he said.

Those issues played out before Judge Jimmy Jackson Jr. in that Cleveland courtroom last summer. Before his arrest on Aug. 29, Hercules Shepherd had no criminal record.

College coaches were pursuing the star high school basketball player; recruitment would mean a big scholarship that could help Shepherd realize his dreams of becoming an engineer. But by sitting in jail, Shepherd was missing two days of classes. Missing two more could get him kicked out of school.

The judge looked down at a computer-generated score on the 18-year-old's case file. Two out of six for likelihood of committing another crime. One out of six for likelihood of skipping court. The scores marked Shepherd as a prime candidate for pretrial release with low bail.

"Mr. Shepherd? I'm giving you personal bond," Mr. Jackson said. "Your opportunity to turn that around starts right now." (Jackson subsequently lost an election in November and is no longer a judge; his winning opponent, however, also supports use of the pretrial algorithm.)

Smiling, Shepherd walked out of the courtroom. That night, he was led out of the Cuyahoga County Jail; the next day, he was in class. Shepherd says he wouldn't have been able to afford bail. If he isn't arrested again within a year, his record will be wiped clean.

This story was reported by The Associated Press.

Original article: https://www.csmonitor.com/USA/Justice/2018/0131/Artificial-intelligence-plays-budding-role-in-courtroom-bail-decisions