Facebook uproar: Should personal data drive political ads?

Cambridge Analytica used Facebook data on 50 million Americans for the Trump campaign. The revelation offers a glimpse into how 'big data' is eroding privacy and reshaping politics.

Dado Ruvic/Photo illustration/Reuters/File
A photo illustration depicts a Facebook logo mirrored in a human eye. The company's data on consumers is highly valuable to corporations and has also become sought after by political campaigns seeking insights into how to sway voters.

In 2014, when Thom Tillis was a North Carolina legislator seeking election to the US Senate, some of his campaign ads were customized using what was then an unconventional technique.

Drawing on information volunteered by Facebook users who took online psychological quizzes, campaign consultants formed personality profiles of individual voters, and designed ads accordingly.

For people who scored high on being “agreeable,” an ad featured Mr. Tillis himself, smiling, with the message “restore common sense in Washington.” For personalities deemed “conscientious,” ads sent online and by mail featured people at a job site, including one in a hard hat, and said the Republican candidate had “the experience to get the economy working.” A third showed what appeared to be a soldier’s camouflage-smeared face, and said of Tillis: “Your safety is his top priority.”

That last one was aimed at those who score high on “neuroticism,” or negative emotions.

Tillis’s successful effort to sway voters, reported by the MIT Technology Review in April 2016, helped set the stage for more ambitious work by the data-crunching firm in the 2016 presidential race.

Now that firm, Cambridge Analytica, is the focus of controversy for its role in the 2016 Trump campaign, notably the way it relied on unauthorized access to personal Facebook data on some 50 million Americans. Cambridge Analytica has suspended its CEO, while Facebook, which has cut ties with the firm, now faces heightened scrutiny.

But beyond the legal questions surrounding the Facebook data is a deeper story: These efforts to mobilize some voters and dissuade others offer a glimpse of how the marketing of political candidates is growing ever more psychologically targeted. And where Cambridge Analytica’s CEO has touted the change as an inevitable computer-driven enhancement of communication, some experts say it carries risks for the health of democracy.

“We've never had a system of mass micro persuasion” until fairly recently, let alone deployed it to influence the political process, says Jeff Chester, executive director of the Center for Digital Democracy in Washington. “Politics and life of our democracy is not the same as selling soap and fast foods.” Companies like Facebook, he says, “have clearly lost their moral compass in pursuit of astronomical revenues.”

It’s a debate that appears sure to grow, fueled in part by political parties’ long history of attempting to use data on individual voters to sway elections.

What’s new is the rising sophistication of the efforts – as the volume of data on individuals soars alongside the ability of software to sift and make use of that data. That doesn’t mean these data-crunching techniques were necessarily pivotal in either the Tillis race or the 2016 presidential election.

Calls for transparency 

But to some analysts, including Mr. Chester, the increasing use of personalized data is troubling as it grows more advanced in seeking to influence voters – and does so in ways that voters may not understand. He suggests that America needs a federal law to protect online privacy, in effect seeking to level a playing field that’s now stacked toward corporate data-gathering. (Europe has such a law, which will go into effect in May.) And he suggests that companies need to do more to be transparent about their data handling and ramp up their own policies regarding its ethical use.

In a December report, co-written with Kathryn Montgomery of American University, Chester writes that some of the techniques “raise serious concerns – over privacy, discrimination, manipulation, and lack of transparency.”

In the case of Cambridge Analytica, the firm has claimed that its trove of data – such as Facebook users’ “likes” on particular websites or online posts – allows it to help campaigns target messages to individual voters.

The firm got hold of the Facebook data in a roundabout way, acquiring it from an academic researcher who, in turn, had gotten access to the data alongside responses to a personality-modeling questionnaire. In effect, when 270,000 people took the test, the process also opened up access to data on their millions of Facebook friends.
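
That fan-out from a few hundred thousand quiz-takers to tens of millions of profiles is straightforward to sketch. The snippet below is purely illustrative – fetch_profile and fetch_friend_list are hypothetical stand-ins, not real Facebook functions, and friend-level access of this kind was removed from Facebook's developer platform in its 2014-15 policy changes – but it shows how, under the older rules, one user's consent could open a window onto that user's entire friend network.

    # Schematic sketch only: fetch_profile() and fetch_friend_list() are
    # hypothetical stand-ins for the friend-level data access that Facebook's
    # developer platform allowed before its 2014-15 policy changes.

    def harvest(quiz_takers, fetch_profile, fetch_friend_list):
        """Collect profile data for consenting quiz-takers and their friends."""
        profiles = {}
        for user in quiz_takers:                    # e.g., ~270,000 quiz-takers
            profiles[user] = fetch_profile(user)    # data the user agreed to share
            for friend in fetch_friend_list(user):  # friends never saw the quiz
                if friend not in profiles:
                    profiles[friend] = fetch_profile(friend)
        return profiles                             # can reach tens of millions

Rough arithmetic makes the reported scale plausible: 270,000 quiz-takers with a couple of hundred unique friends apiece already puts the total in the tens of millions.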

Facebook learned in 2015 of the unauthorized access by Cambridge Analytica, a spinoff of Strategic Communication Laboratories, or SCL. Facebook asked the firm to promise to delete the data, but it did not verify that the deletion occurred. Nor did Facebook notify the public of the breach of its policies. Although Facebook took steps in 2014 to restrict the data that software apps can access, its business model continues to rely on revenue from companies – and from political campaigns – that target ads using consumer data.

The exception or the rule?

In short, although Cambridge Analytica is catching fire for how it got Facebook data, many other firms get the same or similar data in ways that are perfectly legal.

Consider LiveRamp, a subsidiary of the data broker Acxiom, which promises clients: “Tie all of your marketing data back to real people, resolving identity across first-, second-, or third-party digital and offline data silos.”

In the marketing industry, many say this is simply the new normal of the digital era. Consumers know they're trading away much of their privacy in return for access to websites like Facebook and other largely free digital tools, says Michael Priem, founder and CEO of ModernImpact, an advertising firm in Minneapolis. And they largely accept that consumer ads are being targeted at them based on that data, just as they accept that when they walk into a retail store, there's a surveillance camera watching them.

“That's not scary,” he says. “What's scary is when consumers don't know: ‘Am I being watched?’ ”

Companies have self-interested reasons to develop standards that are acceptable to consumers, Mr. Priem says. But some consumer advocates say the industry needs more regulation alongside such self-developed standards.

Privacy groups have asked the Federal Trade Commission to investigate whether, in light of Cambridge Analytica's use of the data, Facebook violated an FTC consent order regarding data privacy on the social network. There are calls for Facebook CEO Mark Zuckerberg to testify before Congress. The company is denying reports that its chief of data security is resigning. On Wednesday Mr. Zuckerberg said the company is taking steps to audit apps that access its data for suspicious activity. And he said software developers (like the maker of the personality-quiz app) would see their data access restricted further.

Possible sea change

The flurry of concerns suggests this has the potential to become a turning point in public thought on a long-simmering issue.

It’s not new, after all, for political campaigns to blend social-media savvy into their strategies.

“In 2012, both the Romney and Obama campaigns used Facebook apps to pull data about people,” notes Shannon McGregor, a communications professor at the University of Utah in Salt Lake City who studies social media and politics. In 2016, her research showed that virtually every presidential campaign used social media to gain insights into voters.

Also, the latest revelations come alongside a year’s worth of evidence that Russia has used digital tactics to meddle in US and other recent elections.

Nor is it clear that the new data methods are actually very effective at swaying voters.

While Cambridge Analytica has touted its successes – such as in helping Sen. Ted Cruz (R) of Texas gain traction in the Iowa caucuses as a challenger to Trump before the firm was engaged by the Trump campaign – “we have really good reason to be skeptical that anything that Cambridge Analytica engaged in was necessarily more effective than other standard forms of voter targeting and strategic appeals,” says Daniel Kreiss, a communications expert at the University of North Carolina in Chapel Hill.

Still, he says there’s a “stunning” lack of transparency and accountability at companies including Facebook and Google over how their data are being used in politics.

Few guidelines

Among the ethical gray areas: As political communication becomes more customized, does that amplify the polarization of the electorate?

Professor Kreiss says Facebook’s business model “makes it really easy to speak to people who are aligned with your camp” on any particular issue. And the prevalent clickbait model of online communication encourages “content that engages people in emotional ways.”

Regulators haven’t kept pace with the innovations around political use of social media, whether regarding access to the data or the kind of messaging carried by it. “There is very little regulation around this type of advertising, which accounts for billions and billions of dollars of advertising,” Professor McGregor says.

What’s needed is a change in mindset, says McGregor. “For these companies, it’s clear that they can make a lot of money from accepting political ads and being able to target in this case voters in specific ways.” But, she says, “the implications are much more far-reaching in terms of the integrity of our elections.”
