Modern field guide to security and privacy

Report finds racial bias in facial recognition technology

More than 40 rights groups asked the Department of Justice to launch a probe examining whether systems used by police to investigate crimes disproportionately identify blacks as criminal suspects.

Chip East/Reuters
Officials use a facial recognition program to scan visitors at the Statue of Liberty in New York City.

US law enforcement agencies store images of 117 million adults as part of facial recognition programs that have become critical tools in modern police work. 

But a report released on Tuesday raises serious questions about racial bias built into these systems designed to identify suspects, saying the technology disproportionately singles out blacks in criminal investigations. 

The year-long study from Georgetown Law's Center on Privacy and Technology charted the rapid increase in facial recognition programs at 52 police agencies nationwide. The programs contain mug shots, images from driver's licenses, and other pictures cataloged in systems created without legislative approval and operated without legal oversight, according to the study.

"These algorithms don't see race in the same way you and I do, but that doesn't mean they're not racist," said Jonathan Frankle, a PhD student at the Massachusetts Institute of Technology who worked on the study as a technologist with the Center on Privacy and Technology.

"When you have darker skin, there's less information in the photo because your skin reflects light differently. So having darker skin means it's harder to differentiate faces, which is bad because if you're going to use this as a policing tool you want this to be as accurate as possible."

According to the study, facial recognition systems are 5 to 10 percent less accurate at identifying the facial images of black adults than those of white adults.

In conjunction with the Georgetown report, more than 40 civil liberties organizations, including the American Civil Liberties Union and the Leadership Conference on Civil and Human Rights, asked the Justice Department to examine whether facial recognition systems contribute to racial bias in policing. 

"A growing body of evidence suggests that law enforcement use of face recognition technology is having a disparate impact on communities of color, potentially exacerbating and entrenching existing policing disparities," the groups said in a letter to the Justice Department.

As public surveillance becomes commonplace in cities across the country, police departments can use databases of facial images to identify potential suspects caught on video cameras in stores, on city streets, or on public transportation, for instance.

"This is a fundamental change in policing where everything you do in public is trackable not through your technology, but through your body," said Alvaro Bedoya, executive director of the Center on Privacy and Technology, during a press conference Tuesday. "And this technology is not limited to serious criminals. It's not limited at all, really."

The report's authors recommend that police departments take several steps to limit the potential for racial bias in facial recognition programs affecting investigations. Police should obtain a court order to conduct mass searches of facial images, they said. And the researchers recommended that internal audits review the use of facial recognition programs.

Society has decided to adopt "this technology first, and decided to ask questions later," said Clare Garvey, associate researcher at Georgetown's Center on Privacy and Technology. "That approach is fundamentally flawed."

The Georgetown report comes one week after the ACLU revealed that police in Baltimore and Ferguson, Mo., monitored protests in real time through Facebook, Twitter, and Instagram posts. Baltimore police also used the state of Maryland's so-called Image Repository System, which accesses police mug shots and driver's license photos, to surveil protests following the death of Freddie Gray in police custody last year.

"We have to ask ourselves, why was facial recognition used at this protest, and will facial recognition chill free speech," Ms. Garvey said. "If so, the damage from this technology extends to the community writ large if the people don't have a safe space to express themselves."



https://www.csmonitor.com/World/Passcode/2016/1018/Report-finds-racial-bias-in-facial-recognition-technology