
Opinion: The ugliest side of facial recognition technology

The emergence of technologies that falsely promise to predict someone's behavior based on their facial features and expressions is a deeply troubling development.

A surveillance camera in front of a poster in London. (Toby Melville/Reuters/File)

It's no mystery that big data presents a challenge to privacy. But perhaps more alarming is the emergence of technology that combines facial recognition and data analytics to create a powerful surveillance tool.

It's a disturbing development that combines the most worrisome aspects of algorithmic and big data technology with the chilling and dangerous threats inherent in facial recognition.

A Chicago tech company is advertising its "predictive video" to anticipate behavior "based on the emotional state and personality style of any person in a video." In Russia, the app FindFace gives users "the power to identify total strangers on the street," according to The Atlantic.

It's not just the tech fringe, either. Google's new chat app Allo has a "smart reply" feature that apparently analyzes photos from contacts and offers suggested responses to them.

But most troubling is the Israeli startup Faception. It offers a product that combines machine learning with facial recognition to "identify everything from great poker players to extroverts, pedophiles, geniuses, and white collar criminals." A Department of Homeland Security contractor has hired the firm to "help identify terrorists."

That's a problem. The government should not use people's faces as a way of tagging them with life-altering labels. The technology isn't even accurate: Faception's own estimate for certain traits is a 20 percent error rate. Even if those optimistic numbers hold, that means that for every 100 people the system scans, the best-case scenario is that 20 are wrongly branded as terrorists.

And yet, according to the company, powerful profiling is possible due to two alleged facts: personalities are "affected by genes" and our faces are a "reflection of our DNA."

The first premise doesn’t inspire confidence. It presumes nature affects our personalities more than nurture – a conclusion that experts constantly debate. For the sake of argument, though, let’s say this is true. Even then, we’re not dealing with a robust causal claim. Saying that personalities are "affected" by genes is a much weaker assertion than maintaining our genes determine them.

As to the face being a reflection of DNA, the folks at Faception admit that the evidence supporting that claim comes from animal studies, not psychological inquiry into human beings. Conveniently, they dismiss the difference by accepting an unnamed researcher's conjectural claim that "the human face was likely to develop in the same way."

These assumptions completely ignore established psychological theories such as "situationism" – the idea that environmental factors can dispose any of us to behave in new ways – ways that can lead a person who habitually does good things to commit evil and atrocious acts.

Physical images certainly have some revelatory power. A snapshot of body language, for instance, can reveal confidence or nervousness. But it would be a serious mistake to view the face alone as a portal into deep character traits and future behavior.

Advocates of this kind of data analysis might argue that the algorithms will improve as data science advances and computers can make more decisions more quickly. But false correlations already plague big data. As economist Ronald Coase famously observed years ago, if you torture the data long enough, it will confess to anything.

But unfortunately, people tend to place far too much confidence in anything a computer spits out. This phenomenon, known as "automation bias," looms large with the implementation of predictive facial recognition technologies. A prime example was documented in a recent ProPublica investigation that exposed racial bias embedded in predictive criminal risk assessment software.

Even if the technology can guess correctly, do we really want to live in a society in which machines try to suss out deep truths based on our facial features? If that were the case, our "faceprints" would serve as beacons for unwanted attention, threatening our obscurity – the idea that we are safer when information about us is hard to find. And you can't ditch the beacon unless you want to wear a mask in public for the rest of your life.

Our faces are indeed exceptional. But predictive facial recognition technology and companies like Faception exacerbate the most dangerous aspects of both big data and facial recognition.

Evan Selinger is a professor of philosophy at Rochester Institute of Technology. Follow him on Twitter @EvanSelinger. 

Woodrow Hartzog is the W. Stancil Starnes Professor of Law at Samford University's Cumberland School of Law. Follow him on Twitter @Hartzog.

 
