Tools to reshape digital etiquette

A survey of technology experts suggests that, with the right incentives and artificial intelligence, the future of online behavior could hold less hate and misinformation. Individuals must be empowered to conduct civil public discourse.

AP Photo: Meighan Stone places a support banner with flowers outside the door of Comet Ping Pong pizza shop, in Washington, Dec. 5, 2016. A fake news story prompted a man to fire a rifle inside the popular pizza place as he attempted to "self-investigate" a conspiracy theory that Hillary Clinton was running a child sex ring from there, police said.

Someday, before you read a sentence like this, a new type of intelligent software will advise you whether the content is civil and safe. It will warn you of malicious language, fake news, or other types of toxic behavior now commonly found in the digital universe. And it will do so based on the social norms of a community that values the widest common good.

That, at least, is what a majority of some 1,500 technology experts seem to predict, based on a survey by Pew Research Center and Elon University, released in a report last week. The survey looks at the future of online etiquette and whether public discourse will “become more or less shaped by bad actors, harassment, trolls, and an overall tone of griping, distrust, and disgust.” (The report might have added Russian influence in election campaigns.)

The survey also asks these experts to forecast whether it is possible to filter and label the current rage, hate, and misinformation of cyberspace – while also keeping a proper balance of security, privacy, and freedom of speech. The result was only slightly hopeful.

While 39 percent of respondents expect the online future will be “more shaped” by negative activities, 42 percent expect “no major change,” and 19 percent said the internet will be “less shaped” by harassment, trolling, and distrust.

Several solutions will be necessary, many of them requiring the help of artificial intelligence and new social norms. One is to define limits on the online anonymity that now allows abusive behavior. Another is to reduce the incentive for companies that profit from clicks to encourage incendiary activity online. Most of all, people must be given more choices, not fewer, to control their online experience.

The report provides one compelling prediction from Amy Webb of the Future Today Institute: “Right now, many technology-focused companies are working on ‘conversational computing,’ and the goal is to create a seamless interface between humans and machines.... In the coming decade, you will have more and more conversations with operating systems, and especially with chatbots, which are programmed to listen to, learn from and react to us. You will encounter bots first throughout social media, and during the next decade, they will become pervasive digital assistants helping you on many of the systems you use.”

Yes, the next generation of Siris, Alexas, and OK Googles will become tools to refine civil discourse and sift the chaff of hate and lies from the wheat of civic engagement and honest information. Traditional institutions, such as news media, are already doing less of that sifting and sorting. And government can do only so much without crossing a line on basic freedom.

The internet has given individuals a platform to make their voices heard, sometimes for ill. The coming digital tools must give them ways to find and create a civil and safe space. That sort of intelligence would be anything but artificial.


