Facebook introduces controls for kids. Is it enough?

After scathing testimony about the harm Facebook's platforms can cause younger users, the tech conglomerate is trying to do better by introducing new controls for teens and parents. Critics question the efficacy of the changes and are calling for greater transparency.

Paul Sakuma/AP
A child logs onto Facebook in Palo Alto, California, June 4, 2012. One of the controls Facebook plans to introduce would prompt teenage users to refrain from repeatedly looking at content that's not beneficial to their well-being.

Facebook, in the aftermath of damning testimony that its platforms harm children, will be introducing several features, including prompting teens to take a break from using its photo-sharing app Instagram and “nudging” teens if they are repeatedly looking at the same content that’s not conducive to their well-being.

The Menlo Park, California-based company is also planning to introduce new optional controls that allow parents or guardians to supervise what their teens are doing online. These initiatives come after Facebook announced late last month that it was pausing work on its Instagram for Kids project. But critics say the plan lacks detail, and they are skeptical that the new features will be effective.

The new controls were outlined on Sunday by Nick Clegg, Facebook’s vice president for global affairs, who made the rounds on various Sunday news shows including CNN’s “State of the Union” and ABC’s “This Week with George Stephanopoulos” where he was grilled about Facebook’s use of algorithms as well as its role in spreading harmful misinformation ahead of the Jan. 6 Capitol riots.

“We are constantly iterating in order to improve our products,” Mr. Clegg told Dana Bash on “State of the Union” Sunday. “We cannot, with a wave of the wand, make everyone’s life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use.”

Mr. Clegg said that Facebook has invested $13 billion over the past few years to keep the platform safe and that the company has 40,000 people working on these issues. And while he said Facebook has done its best to keep harmful content off its platforms, he said he was open to more regulation and oversight.

“We need greater transparency,” he told CNN’s Ms. Bash. He noted that the systems that Facebook has in place should be held to account, if necessary, by regulation so that “people can match what our systems say they’re supposed to do from what actually happens.”

The flurry of interviews came after whistleblower Frances Haugen, a former data scientist with Facebook, went before Congress last week to accuse the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teens and of being dishonest in its public fight against hate and misinformation. Ms. Haugen’s accusations were supported by tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.

Josh Golin, executive director of Fairplay, a watchdog for the children and media marketing industry, said he doesn’t think introducing controls to help parents supervise teens will be effective, since many teens set up secret accounts anyway. He was also dubious about how effective it would be to nudge teens to take a break or move away from harmful content. He noted that Facebook needs to show exactly how it would implement these tools and offer research showing they are effective.

“There is tremendous reason to be skeptical,” he said. He added that regulators need to restrict what Facebook does with its algorithms.

He said he also believes that Facebook should cancel its Instagram project for kids.

When Mr. Clegg was grilled by both Ms. Bash and Mr. Stephanopoulos in separate interviews about the use of algorithms in amplifying misinformation ahead of the Jan. 6 riots, he responded that if Facebook removed the algorithms, people would see more, not less, hate speech and more, not less, misinformation.

Mr. Clegg told both hosts that the algorithms serve as “giant spam filters.”

Democratic Sen. Amy Klobuchar of Minnesota, who chairs the Senate Commerce Subcommittee on Competition Policy, Antitrust, and Consumer Rights, told Ms. Bash in a separate interview Sunday that it’s time to update children’s privacy laws and offer more transparency in the use of algorithms.

“I appreciate that he is willing to talk about things, but I believe the time for conversation is done,” said Ms. Klobuchar, referring to Mr. Clegg’s plan. “The time for action is now.”

This story was reported by The Associated Press.
