Google, Microsoft child pornography omissions only 'a part of the fight'

Tech giants Google and Microsoft have made headlines for cleaning up search results to deter searchers from child pornography sites. However, the move highlights the often contentious practice of censoring online content, and it is likely only a small part of the fight against child exploitation.


Google and Microsoft took unprecedented steps this week to combat child exploitation, announcing both companies would be omitting more than 100,000 terms from their search algorithms that could potentially lead to child pornography. The two companies will be including a warning message and helpline number on an additional 13,000 terms in hopes of deterring further searching of illicit content.

"While no algorithm is perfect – and Google cannot prevent pedophiles adding new images to the web – these changes have cleaned up the results for over 100,000 queries that might be related to the sexual abuse of kids," writes Google executive chairman Eric Schmidt in a Daily Mail op-ed.

Despite widespread public disdain for child pornography, omitting search terms is a marked step away from the free Internet Mr. Schmidt has preached many times before. However, this isn’t the first time that Google or Microsoft has edited its algorithms to filter offensive content, and the move follows a pattern of whack-a-mole solutions to savvy "dark Web" content creators.

In 2010, Google released Google Instant, the now standard search function that predicted what users were likely to search and offered a results page as they typed their inquiry. However, since words such as ‘assimilate’ will bring up wholly different results depending on whether you type just the first three letters or all ten, Google set aside a certain set of contentious words for which you would have to hit ‘search’ before seeing any results.

Other websites have pushed their "not safe for work" (NSFW) content away from search engine results altogether. In July, the notoriously adult-content friendly blogging website Tumblr began editing the searchability of its NSFW and adult-content blogs. The site said that adult blogs, which “contain substantial nudity or mature/adult-oriented content,” would not show up on Google, Yahoo, or Bing searches. This started soon after Yahoo, whose chief executive said not all Tumblr’s content was “brand safe,” bought the site.

However, after backlash from Tumblr users, later in July the site re-merged adult and NSFW content and restored the sites to search engine indexes.

Aside from pornography, tech companies have had to edit search results and available material depending on geography and policy. China has long had a contentious relationship with Google, requiring terms such as “Tiananmen Square 1989” to be omitted from search results and eventually pushing Google to move its search engine domain to Hong Kong (which isn’t under Chinese censorship jurisdiction). In Germany and France, where there are strong laws against religious persecution and hate speech, Google omits pro-Nazi and Holocaust-denial websites (but links to the website Chilling Effects at the bottom of the search page, which lists censored Internet websites). In May, eBay decided not to allow the sale of any Nazi or hate material, even if it has historical significance.

The most recent changes at Google and Microsoft, however, put the power more in the hands of search engines. Schmidt was clear that changing search terms is far from the only step needed to combat child pornography. In his op-ed, he pointed out that many photos can easily circumvent search terms, and must be looked at individually by people who can judge content.

Consider, for example, the National Center for Missing and Exploited Children (NCMEC). John Shehan, executive director of the exploited child division at NCMEC, says staff look at each individual image sent to its database to determine whether it is pornography or not. They then record the photo's hash value (essentially a picture’s digital fingerprint), which NCMEC can send to search engines such as Google and Microsoft to omit from searches. So far, it has been able to provide Internet companies with more than 20,000 hash values to avoid.
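To illustrate how hash-based matching works in principle, here is a minimal Python sketch. Note the assumptions: the real system described here (Microsoft's PhotoDNA) uses a *perceptual* hash that can still match an image after resizing or re-encoding, whereas this sketch uses an ordinary cryptographic hash (SHA-256), which only matches byte-identical files; the hash list and function names are hypothetical, purely for illustration.

```python
import hashlib

# Hypothetical list of flagged fingerprints, standing in for the hash
# values an organization like NCMEC distributes to Internet companies.
# (This example value is simply the SHA-256 digest of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest acting as the file's 'digital fingerprint'.

    A cryptographic hash like SHA-256 changes completely if even one
    byte of the file changes, so unlike PhotoDNA it cannot catch
    re-encoded or resized copies of the same picture.
    """
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Check a file's fingerprint against the distributed hash list."""
    return fingerprint(data) in KNOWN_HASHES
```

Comparing a short digest against a set is cheap, which is why a shared hash list scales to the volume of images large search engines process: the expensive human review happens once per image, and every later copy is caught by a lookup.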

This technology was developed by Microsoft, which last week opened the doors to its 16,800-square-foot Cybercrime Center at its Redmond, Wash., headquarters, focused on using tech to combat digital crime.

“Microsoft has a zero tolerance approach to child sexual abuse content. If society is to stamp it out, then together we need to tackle the core problems of creation, distribution and consumption," says a Microsoft spokesperson in an e-mail.

However, Mr. Shehan points out that more sophisticated child pornography viewers tend to use peer-to-peer file sharing, so keyword changes will only catch a certain type of Internet browser.

“Some users who are in the beginning stages may use keywords,” he says. “But it certainly will not be the silver bullet to solve this problem.”

Despite tech companies’ efforts to police controversial content, he says access is only one part of the battle against any social ill, whether it be anti-Semitism or child pornography.

“The World Wide Web is just a small piece of the fight, this will more have an effect on how search engines do their business,” Shehan adds. “This is a step in the right direction, but [the fight] will be more than just Google and Microsoft.”
