Facebook's Suicide Prevention Tools Connect Friends, Test Privacy

People who struggle with suicidal thoughts will often reach out to friends and family first. But now that our social circles increasingly live online, the biggest social media networks are grappling with how to intervene and how to get users the right kind of help.

Facebook is the latest social media network to roll out support resources for suicide prevention. The company is now trying to combat suicide by doing what it does best — connecting friends.

It announced the new measures in a blog post late last month:

"Besides encouraging them to connect with a mental health expert at the National Suicide Prevention Lifeline, we now also give them the option of reaching out to a friend, and provide tips and advice on how they can work through these feelings."

These expanded resources come as a result of Facebook's collaboration with suicide prevention groups, among them the National Suicide Prevention Lifeline.

The company first introduced suicide support resources in 2011. Before this latest update, users could report a troubling post for review by Facebook employees. And before that, the site simply directed people to contact a hotline.

Scottye Cash, a professor of social work at Ohio State University, has researched the roles mobile tech and social network sites play in adolescent suicide.

She has found that youth increasingly reach out through social networking sites rather than through traditional forms of help-seeking, such as calling a suicide prevention hotline. And the more depressed people become, the less they turn to face-to-face help.

She wants to know whether Facebook's new method works well on both ends: for the person who files a report and for the person who receives one.

"People that made the initial report and those who received the report — did they find this was something that is acceptable to them?" Cash asks. "That's the part that we need to evaluate: Did they feel it was an overstep or violation."

She says risk assessments should be implemented to help people determine whether a post is an immediate concern.

Once that assessment language is in place, Cash adds, "you can have a fairly streamlined process to help people make the best decisions, and handle the legality of it."

Suicide intervention is just one of the privacy issues Facebook has taken on under public scrutiny.

Last year, Facebook users were uneasy to learn that hundreds of thousands of accounts had been made the subject of a psychological experiment in which the company manipulated what appeared in users' news feeds.

At the same time, the study's researchers concluded that Facebook was the best online platform for reliably assessing personalities. After all, it's the biggest.

And one initiative is deploying big data to predict suicide risk among military veterans by monitoring social media posts, among other behavioral signals.

Facebook isn't the only social media giant to launch resources for responding to suicide threats.

Last year, Twitter teamed up with the suicide prevention charity Samaritans to launch the Samaritans Radar app. The app monitored the tweets of everyone a subscriber followed; when its algorithm detected keywords and phrases suggesting suicidal thoughts, it emailed the subscriber an alert with guidance and a link to the tweet. Because the people being monitored never consented, the app drew a heavy backlash, and Samaritans pulled it within a week of its launch.
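Samaritans never published Radar's internals, but the basic mechanism described above, scanning tweets against a list of worrying phrases and emailing a subscriber on a match, can be sketched in a few lines. Everything below is hypothetical illustration: the phrase list, function names, and alert wording are invented for this sketch, not drawn from the actual app.

import re

# Hypothetical, illustrative phrase list. Radar's real lexicon was
# never made public, and a production system would need far more
# care: context, negation, sarcasm, and clinical review.
WATCH_PHRASES = [
    "want to die",
    "hate myself",
    "no reason to live",
]

# Precompile one case-insensitive pattern per phrase, with word
# boundaries so partial matches inside longer words are avoided.
PATTERNS = [
    (phrase, re.compile(r"\b" + re.escape(phrase) + r"\b", re.IGNORECASE))
    for phrase in WATCH_PHRASES
]

def flag_tweet(text):
    """Return the watch phrases found in a tweet, if any."""
    return [phrase for phrase, rx in PATTERNS if rx.search(text)]

def alert_subscriber(author, text, url):
    """Build an email-style alert for a subscriber, or None if clean."""
    if not flag_tweet(text):
        return None
    return ("A tweet by @" + author + " matched a watch phrase.\n"
            "Tweet: " + url + "\n"
            "Consider reaching out; guidance: <help link>")  # placeholder

# Example: a worrying tweet triggers an alert, an ordinary one doesn't.
print(alert_subscriber("friend", "Some days I feel there's no reason to live.",
                       "https://twitter.com/friend/status/1"))
print(alert_subscriber("friend", "Traffic this morning, ugh.",
                       "https://twitter.com/friend/status/2"))

Even this toy version makes the privacy problem visible: the person whose tweets are scanned never opted in, which is exactly what fueled the backlash against the real app.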

In South Korea, where the suicide rate is the highest among developed nations, the government's education ministry announced last week that it was backing a smartphone app that performs a similar function, screening students' social media posts and Web searches for words related to suicide. But instead of sending resources to the teens, as Facebook does, the South Korean app alerts parents when their child appears to be at risk.

Reddit has a SuicideWatch forum. Last year, members of a Minecraft forum on Reddit talked a teen out of suicide.

At the other extreme, another Reddit user says he was hospitalized against his will after sharing a post expressing suicidal thoughts last year.

And getting help by sharing carries its own risks. Suicide risk spikes following hospitalization for mental health disorders, the American Foundation for Suicide Prevention reports.

Reddit's forum has a sidebar listing do's and don'ts for users responding to vulnerable posts.

But after seeing Reddit's peer-to-peer-style forum, psychologist Sally Spencer-Thomas says, "It's very triggering and all we see and feel is their pain without additional conversation or knowledge of how it turned out. It's like having a room full of suicidal people and I think that can make people feel even worse."

Spencer-Thomas has been following Facebook's new implementations since last year. She's the co-founder and CEO of the Carson J Spencer Foundation, which recognized Facebook for its efforts in expanding suicide resources.

She's excited about the new initiative mostly because it focuses on connection rather than responding from a place of fear.

"Before this, the approach was hot potato. Anytime some kind of suicide came through on social media, they would call the authority, and pass it on," Spencer-Thomas says. "That pathway was not helpful for anyone, including the authority."

People who are sending a distress call want their friends to help; they don't want Facebook to respond, she adds. "This tool gives friends something to do and communicate in constructive ways."

And she doesn't see any privacy issues.

"The more we can have conversations about it that aren't doused in fear, that are human to human, the better it is," Spencer-Thomas says.

Right now, Facebook's new suicide resources are available to about half of U.S. users. The company plans to reach the entire U.S. in the next couple of months before extending these updates abroad.

Emma Bowman is an intern with NPR Digital News. NPR's Elise Hu contributed to this post from Seoul.
