Can Facebook's new approach help prevent suicides?
Following a number of widely publicized suicides on Facebook's new Live feature, the tech giant has expanded its suicide prevention tools. The goal: empower its vast online community to reach out and prevent suicides.
One reason people may commit suicide, behavioral scientists say, is that they feel isolated from – or not supported by – their communities. With its latest rollout of suicide prevention tools, Facebook aims to empower its vast online community to take action and prevent these deaths.
On Wednesday, the tech giant announced the latest elements of its suicide-prevention strategy. Under the new changes, a Facebook user who is worried that a friend streaming on Facebook Live may be contemplating suicide can report the video to Facebook, which will then provide resources, including suicide prevention tips and ways to get help via live chat, in real time to both users. And if friends report that someone they know might be considering suicide, Facebook will direct them to tools to help them reach out.
Facebook's initiative may help tackle one of the most pressing social challenges today. The fact that this intervention comes from Facebook may be particularly valuable, experts suggest.
“Social media is the preferred means of communication for the younger generations dealing with the precarious transition from childhood to adulthood,” writes Scott Ridgway, executive director of the Tennessee Suicide Prevention Network, in an email to The Christian Science Monitor, saying the new tools “provide an opportunity for people to get help for these issues.”
Since the 1950s, suicide rates have approximately tripled. Globally, there is one suicide every 40 seconds, Facebook said in the company's news release.
The rise of the Internet has presented a particular challenge. Widely publicized suicides can raise the likelihood of “copycat” suicide attempts, as Stacy Teicher Khadaroo reported for the Monitor in 2013. Such deaths have been broadcast on Facebook itself, using the new Live feature, USA Today reported. (It’s unclear precisely how many suicides have been recorded on the platform.)
Suicide prevention groups are working to inculcate resilience and encourage healthy coping strategies, and crisis hotlines like the National Suicide Prevention Lifeline provide 24-hour support. Facebook has been partnering with some of these groups for a decade.
And Facebook’s latest innovations make a positive contribution to prevention efforts, observers suggest. Compared with previous tools, they connect people more directly, making them potentially more effective, says John Ackerman with the Center for Suicide Prevention and Research at Nationwide Children’s Hospital in Columbus, Ohio.
The new approach provides “very active steps for people who care about the person who is at risk for suicide,” he tells the Monitor.
Integrating suicide prevention into Facebook’s platform may help people who care reach a key constituency: young people. Suicide is the second leading cause of death among Americans ages 10 to 34, according to the Centers for Disease Control and Prevention.
Social media is “incredibly integrated with the way most young people live their lives, which makes it even more important that we have resources there for them,” Dr. Ackerman says.
It’s particularly positive that users are actively directed to help, writes David Luxton, affiliate associate professor of psychiatry at the University of Washington in Seattle, in an email to the Monitor: “This approach is far better than simply detecting risk content and flashing the link or phone number to a crisis line that requires the user to go somewhere else or take additional steps to get immediate help.”
And by creating a sense of community even when friends are physically distant, Facebook’s new suicide prevention tools may fill an important niche, Mr. Ridgway indicates.
“They connect people who are isolated – not just physically, but emotionally as well – not only with potential helpers, but also people who care about them even though they’re not in the same city or state,” he explains. “Anything that breaks through the wall, regardless of whether that barrier is self-enforced or [the] result of genuine distance, is crucial.”
Facebook’s previous approach, which directed people contemplating suicide to a crisis hotline number and other services, was helpful, Ackerman says. But a personal touch can really make a difference, he emphasizes. At the hospital where he works, forging individual connections with people who reach out over the Internet has become social media policy.
Of course, there’s always room for improvement, observers agree. Ackerman would like to see Facebook tweak the way users flag suicide-related content to make it more supportive. At present, when users report posts, they have to choose the option, “I think it shouldn’t be on Facebook,” limiting free expression and potentially creating “additional stigma” around suicide, he says.
There’s also a question of whether Facebook users will abuse the suicide-reporting tools. But, says Ackerman, “the likelihood of that is far outweighed by the benefits.” The real challenge, he says, is to make sure people know about – and feel comfortable using – the resources.
Getting involved well before an individual contemplates suicide is the ultimate goal, suggests Mark LoMurray, director of Sources of Strength, a universal suicide prevention program focused on “upstream” prevention. The program teaches people to identify strengths and internalize their “strength stories,” he tells the Monitor in an email, which can encourage people to get help early, stay connected with others, and “[use] clusters of support in day to day life.”
For social media sites, one way to achieve that kind of impact may be to monitor cyberbullying and online harassment, Ridgway explains.
And technology will likely become increasingly important to those efforts, suggests Professor Luxton, who recently completed a book on the role of artificial intelligence in mental health care.
“I think that we will see more advances in the technologies that are used to help in suicide and other risk behavior (e.g., cyberbullying, illicit drug use, etc.) prevention,” he writes. “Much of the technology is behind the scenes, such as the use of AI and machine learning techniques that monitor and identify risk related content.”