Suicide is a leading cause of death among youths in the United States, and survey data indicate that approximately one in five youths had serious thoughts about suicide in the past year. Research on effective strategies to identify and assist social media users experiencing a mental health crisis is limited. Moreover, the legal landscape regarding social media platforms' responsibilities for the safety and mental health monitoring of their young users is unclear. Safe Social Spaces (SSS) is a community-based online crisis intervention program for youths who post content raising concerns about a mental health crisis.
SSS was created by YouthLine, a service specializing in youth crisis intervention provided by Lines for Life, an Oregon-based nonprofit organization dedicated to suicide prevention. Since 2019, SSS has interacted with English-speaking young users on social media platforms. Currently, SSS engages with users on Discord, TalkLife, and Vent, which host millions of accounts and contain forums for users to share experiences of emotional health problems.
SSS staff must have at least an associate degree and complete >55 hours of crisis intervention and gatekeeper training, including Mental Health First Aid for youths and safeTALK. General training is coordinated by Lines for Life's assistant director of youth development, training, and quality assurance, and position-specific SSS training is facilitated by the assistant director of clinical operations; together, the two have >20 years of experience. SSS staff log onto the social media platforms and scroll through recent posts to identify content suggesting a mental health crisis, such as suicidal ideation, self-harm, or symptoms of poor mental health. After identifying such posts, staff review content, tone, and previous posts to gauge posters' ages and contact those aged <24 years. SSS staff, who do not identify themselves as such unless queried by the poster, then send a private message responding to what the poster shared, establishing rapport, and expressing concern about the poster's well-being to initiate a dialogue. If the poster responds, the conversation continues until a safety plan is created. If the poster does not respond, or stops responding before a safety plan has been developed, SSS staff send mental health resources to the poster. During each shift, SSS staff aim to establish at least six new contacts.
For every contact initiated, SSS creates a record with demographic information gleaned from the poster's profile or the conversation. SSS also tracks daily metrics, including the number of message responses, conversations developed from responses, and outcomes in which self-harm was avoided or a suicide attempt was prevented. To determine whether a harmful outcome was avoided, SSS staff apply a rubric based on Applied Suicide Intervention Skills Training to conversation content. Staff consider whether there was imminent concern for self-harm or suicidal ideation and categorize the severity of the contact based on whether the poster expressed suicidal ideation or a suicide plan or intent and whether this content was shared publicly or privately. For example, one SSS staff member chatted with a youth who reported interpersonal conflict with family members, a suicide plan, and access to lethal means. After the conversation, the youth said that they were no longer planning suicide and felt more hopeful and committed to living, a result classified as an avoided outcome. As of early 2024, SSS has made >3,000 contacts, received 1,834 responses (61% response rate), and recorded 163 avoided outcomes.
SSS employs a safety scoring plan based on a combined rating of a poster's receptiveness to safety planning and the specificity of the resulting safety plan. Finally, a reporting flowchart outlines the steps to take when concerns about a poster necessitate contacting emergency medical services or child protective services. The flowchart also contains the emergency contact for each social media platform, to which SSS may reach out; however, subsequent actions taken by the platform are not tracked.
SSS has challenges and limitations. Scalability is a key implementation challenge. Social media platforms host a large volume of posts, and considerable staff time is spent manually screening them to identify posts suggesting acute mental health problems. Methods to increase the efficiency of this process, such as the use of artificial intelligence, merit consideration. Other needs include identifying additional platforms frequented by youths at risk for suicide and expanding the SSS intervention to them. Verifying posters' identities or tracking outcomes is not feasible because all available information is based on posters' self-reports. Finally, although we have attempted to collaborate with the platforms, no memoranda of understanding or other formal agreements are in place.
In the United States, a gap exists between the demand for and delivery of youth mental health services. SSS serves as an example of a service that can help support youths at risk for suicide who might not otherwise receive help.