Professional News
Published Online: 13 April 2017

Experts React to Facebook’s Updated Suicide Prevention Tools

Psychiatrists spoke favorably about Facebook’s latest efforts to educate users about resources and build communities of support around people who may be at risk of suicide, but also cautioned that much remains unknown about how safe and effective such technology might be.
Facebook made headlines last month after the company announced it was updating tools to make it easier for users to connect friends they suspect may be experiencing thoughts of suicide with resources in real time.
For years, the social media giant has encouraged users to report suicidal content to the company’s Help Center and offered tips on how to offer support to a friend in need. The tools unveiled in March expand upon this effort by creating a mechanism for the social networking site’s more than 1.8 billion users to flag concerning videos as they are broadcast over Facebook Live and live chat with professionals at crisis centers such as the National Suicide Prevention Lifeline through Facebook Messenger.
“When someone is thinking of suicide or hurting themselves, we’ve built infrastructure to give their friends and community tools that could save their life,” Facebook co-founder and CEO Mark Zuckerberg wrote in a post to the Facebook community in February.
Still, he acknowledged, there is room for improvement. “There are billions of posts, comments, and messages across our services each day, and since it’s impossible to review all of them, we review content once it is reported to us. There have been terribly tragic events—like suicides, some live streamed—that perhaps could have been prevented if someone had realized what was happening and reported them sooner.” He added, “Artificial intelligence can help provide a better approach.”
John Torous, M.D., chair of APA’s Smartphone App Evaluation Work Group, applauded Facebook’s efforts. “Any time you are able to identify and connect a person experiencing mental health issues with resources and help is definitely a good thing,” he told Psychiatric News. “But we still do not have data about how reliable or valid these technology-based suicide assessments really are for social media, as actual research has been limited.”
Torous is the co-director of the digital psychiatry program at Beth Israel Deaconess Medical Center.

Facebook Looking Ahead to AI

In the announcement outlining the updates to Facebook’s suicide prevention tools, the company said it is testing pattern recognition to identify and streamline the reporting of suicidal posts. Currently, users must see a post by a friend suggesting thoughts of suicide or self-injury and report the content to Facebook. Artificial intelligence (AI) and pattern recognition based on posts previously reported for suicide could one day make the option to report a post about “suicide or self-injury” more prominent on concerning posts or even flag such posts to automatically be called to the attention of Facebook representatives.
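Facebook has not published technical details of its classifier, but the general approach it describes—scoring new posts against language patterns drawn from posts previously reported for suicide or self-injury—can be illustrated with a minimal sketch. The phrase list, threshold, and function names below are illustrative assumptions, not Facebook’s implementation; a production system would rely on a trained model rather than a fixed keyword list.

```python
# Minimal illustrative sketch of pattern-based flagging; NOT Facebook's actual system.
# Assumption: a small set of phrases stands in for patterns learned from
# previously reported posts. A real system would use a trained classifier.

CONCERNING_PHRASES = [
    "want to die",
    "kill myself",
    "end it all",
    "no reason to live",
]

def concern_score(post_text: str) -> float:
    """Return the fraction of known concerning phrases found in the post."""
    text = post_text.lower()
    hits = sum(1 for phrase in CONCERNING_PHRASES if phrase in text)
    return hits / len(CONCERNING_PHRASES)

def should_surface_report_option(post_text: str, threshold: float = 0.25) -> bool:
    """Decide whether to make the 'suicide or self-injury' report option more prominent."""
    return concern_score(post_text) >= threshold

# Example: a post containing one concerning phrase crosses the illustrative threshold.
print(should_surface_report_option("I feel like there's no reason to live anymore"))  # True
```

In practice, the trade-off Luo describes below—false positives versus missed cases—comes down to where such a threshold is set and what action a flag triggers.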
“We are starting this limited test in the United States and will continue working closely with suicide prevention experts to understand other ways we can use technology to help provide support,” the announcement stated.
Psychiatric News made multiple attempts to reach Facebook for more information on how tests of AI are being introduced on the site and how psychiatrists might obtain more information about the effort. At press time, the company had not responded.

Service Raises Privacy Issue

“It’s great to hear Facebook is doing something, … but we need to tread carefully,” John Luo, M.D., health science clinical professor in psychiatry and director of the psychiatry residency program at the UC Riverside School of Medicine, told Psychiatric News.
“As a clinician, I know that predicting a patient who is suicidal is murky enough. … No matter how good [Facebook’s] algorithm is, there will be false positives,” he continued. “There is also the challenge of balancing privacy versus safety.”
Torous agreed, pointing to the public outcry in 2014 over Samaritans Radar, an app that tracked words and key phrases on Twitter and alerted users when someone they followed was using language suggesting distress. Within days of the app’s launch, people took to social media to voice concerns that the information could be used to target vulnerable people on Twitter. In less than two weeks, the app was suspended. This episode illustrated the challenges that experts face as they try to develop online tools that offer appropriate support to people with mental health problems without compromising the user’s safety and privacy.
“Psychiatrists should certainly be excited by efforts to use technology to support people with mental illness,” Torous said. “But, as a field, we have high standards and expect a certain level of evidence. We expect to know how things work. We only know of the Samaritan app issues because it gained public attention.”
The mental health community could benefit greatly from knowing more about Facebook’s latest efforts, he added.

Social Media Could Fill a Gap

Previous studies suggest that some people disclose more personal information on a computer than to a person, which could create new opportunities for social media to connect people with mental illness with care that might otherwise be missed, APA President Maria A. Oquendo, M.D., Ph.D., told Psychiatric News. Social media tools could also potentially decrease stigma, she added.
Efforts to expand use of social media to reach people with mental illness “should raise awareness among psychiatrists that many of their patients are likely using social networks,” Torous added. “If a patient is expressing suicidal thoughts on a platform such as Facebook, this would be important information for the treating psychiatrist to know.”
Torous encouraged psychiatrists to have a conversation with patients about their experiences on social media. “There is likely a value in asking patients about their experiences using social networks: Do they find them helpful? Are they promoting fear? Anxiety? Are they experiencing cyberbullying? The only way to obtain this information directly is by having an open dialogue with the patient.”
Luo added that while he supports using social media tools to connect patients with mental illness to care, he cautioned against becoming overly reliant on them.
“Unless society changes to [the] point that we prefer engagement that is not personal, … I don’t think psychiatrists are going to be replaced by robots or bot engines,” he said. “We’re all looking for human touch and support—that’s the social aspect of human beings. Psychiatry offers a patient the sense of connection and not being alone.” ■
Facebook’s March announcement on suicide prevention tools can be accessed here. Zuckerberg’s letter to users is available here. More on reporting suicidal content to the Facebook Help Center is located here.

Information & Authors

Information

Published In

History

Published online: 13 April 2017
Published in print: April 8, 2017 – April 21, 2017

Keywords

  1. Facebook
  2. Suicide
  3. Maria Oquendo
  4. John Luo
  5. John Torous
  6. Artificial intelligence

Authors

Details

Metrics & Citations

Metrics

Citations

Export Citations

If you have the appropriate software installed, you can download article citation data to the citation manager of your choice. Simply select your manager software from the list below and click Download.

For more information or tips please see 'Downloading to a citation manager' in the Help menu.

Format
Citation style
Style
Copy to clipboard

View Options

View options

Login options

Already a subscriber? Access your subscription through your login credentials or your institution for full access to this article.

Personal login Institutional Login Open Athens login

Not a subscriber?

Subscribe Now / Learn More

PsychiatryOnline subscription options offer access to the DSM-5-TR® library, books, journals, CME, and patient resources. This all-in-one virtual library provides psychiatrists and mental health professionals with key resources for diagnosis, treatment, research, and professional development.

Need more help? PsychiatryOnline Customer Service may be reached by emailing [email protected] or by calling 800-368-5777 (in the U.S.) or 703-907-7322 (outside the U.S.).

Media

Figures

Other

Tables

Share

Share

Share article link

Share