Google and advertisers may be discriminating against users, says a new study by researchers at Carnegie Mellon University and the International Computer Science Institute. The group found that male profiles receive ads for high-paying jobs at a rate about six times higher than female profiles do, even when all other aspects of the profiles – search history, online behavior – were kept equal.
The scientists who led the study set out to investigate concerns over Web tracking and ad-privacy settings by creating a tool they call AdFisher. The program creates batches of accounts to simulate user behavior, records the ads each account receives, and then calculates whether differences between groups of accounts are statistically significant. By using this tool in coordination with Google’s Ads Settings, the team was able to test for transparency and discrimination in online ads.
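The statistical test at the heart of this approach can be illustrated with a small sketch. The example below is not AdFisher’s actual code; it is a minimal permutation test in Python, using made-up ad counts, showing how one can judge whether a difference in ads served to two groups of simulated profiles is unlikely to be chance.

```python
import random

def permutation_test(group_a, group_b, trials=10000, seed=0):
    """Two-sample permutation test on the difference of means.

    Returns the fraction of random relabelings whose test statistic
    is at least as extreme as the observed one (an estimated p-value).
    """
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        a, b = pooled[:n_a], pooled[n_a:]
        diff = abs(sum(a) / n_a - sum(b) / len(b))
        if diff >= observed:
            hits += 1
    return hits / trials

# Hypothetical counts of high-paying-job ads shown to each simulated profile
male_counts = [18, 21, 17, 22, 19, 20]
female_counts = [3, 4, 2, 5, 3, 4]

p_value = permutation_test(male_counts, female_counts)
print(p_value)  # a very small p-value suggests the gap is not chance
```

If the observed gap rarely arises when profile labels are shuffled at random, the p-value is small, and the difference is deemed statistically significant, the kind of conclusion the study reports at far stricter thresholds than the conventional 0.05.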
“This is concerning from a societal standpoint,” says Anupam Datta, associate professor at Carnegie Mellon and a co-author of the study, in an interview with MIT Technology Review. Google and its targeted advertising are so pervasive that their effects on content can have a serious impact on people’s lives and decision making, he says.
This isn’t the first time Google’s search algorithms have come under fire for discriminatory results. A study published in April found gender bias in Google image searches, displaying significantly fewer images of women in the results for “CEO” or other executive positions. But that study looked at the results given, not at the gender of the user account submitting the query. Google has also admitted to a lack of workplace diversity within the company.
In addition to the gender discrepancy, the team’s experiment found that simulated users who visited websites about substance abuse were later shown ads for rehab centers and clinics, even after the team changed the simulated users’ ad settings. The study notes that this may simply be due to “remarketing” – displaying ads for recently visited sites – rather than to Google’s user-targeted advertising algorithms, but the team could not be sure. This highlights an opacity in Google’s ad system that the report warns could be problematic. The lack of transparency would affect those on shared or public computers, and the behavior may also violate Google’s advertising terms on collecting sensitive information “such as health information.”
Since the study was published in March, Google has added a disclaimer to its user ads settings stating that users can only control “some” of the ads that they see.
The team says it’s unlikely that Google intentionally discriminated against users or obfuscated its policy. “We consider it more likely that Google has lost control over its massive, automated advertising system,” they say in the study.
The question of attribution here is a difficult one. With so many variables, the researchers are unsure whether the fault lies with Google, the advertisers, both, or neither. Indeed, machine-learning algorithms can behave in unexpected ways when trained on vast amounts of data, and systems programmed by people may absorb the biases of those people.
“Given the pervasive structural nature of gender discrimination in society at large, blaming one party may ignore context and correlations that make avoiding such discrimination difficult,” writes Dr. Datta. “Furthermore, we cannot determine whether Google, the advertiser, or complex interactions among them and others caused the discrimination.”
The complex factors behind the study’s findings still need further research, Roxana Geambasu, an assistant professor at Columbia University, tells Technology Review.
“You can’t draw big conclusions, because we haven’t studied this very much and these examples could be rare exceptions,” she says. “What we need now is infrastructure and tools to study these systems at much larger scale.”
Though their research found discrimination and a lack of transparency in Google’s advertising operations, Datta and his team cannot say whether this breaks any of the company’s rules. “Indeed, Google’s policies allow it to serve different ads based on gender,” the study notes.
But AdFisher was not investigating how the ads were discriminating, just whether they were. And the team found that result with a statistical significance “1000 times more significant than the standard 0.05 significance.”
The researchers hope the tools they’ve created can be used in the future to figure out how this discrimination arose so that it can be corrected. This information, the report notes, could readily be used by Google, the advertisers, or a regulatory body to spur deeper investigations.
“We encourage research developing tools that ad networks and advertisers can use to prevent such unacceptable outcomes,” says the report.
As of this writing, Google was investigating the study and had yet to release a statement.