Frontiers in Child and Adolescent Psychiatry (Dec 2023)

The steps that young people and suicide prevention professionals think the social media industry and policymakers should take to improve online safety. A nested cross-sectional study within a Delphi consensus approach

  • Jo Robinson,
  • Pinar Thorn,
  • Samuel McKay,
  • Hannah Richards,
  • Rikki Battersby-Coulter,
  • Michelle Lamblin,
  • Laura Hemming,
  • Louise La Sala

DOI
https://doi.org/10.3389/frcha.2023.1274263

Journal volume & issue
Vol. 2

Abstract

Introduction: Concerns exist about the relationship between social media and youth self-harm and suicide. The study aims were to examine the extent to which young people and suicide prevention professionals agreed on: (1) the utility of actions that social media companies currently take in response to self-harm and suicide-related content; and (2) further steps that the social media industry and policymakers could take to improve online safety.

Methods: This was a cross-sectional survey study nested within a larger Delphi expert consensus study. A systematic search of peer-reviewed and grey literature, together with roundtables involving social media companies, policymakers, and young people, informed the questionnaire development. Two expert panels were convened for the overarching Delphi study, one of young people and one of suicide prevention experts; of these, 43 young people and 23 professionals participated in the current study. The proportion of participants "strongly agreeing", "somewhat agreeing", "neither agreeing nor disagreeing", and "somewhat disagreeing" or "strongly disagreeing" with each item was calculated; items that achieved ≥80% agreement from both panels were considered strongly endorsed.

Results: There was limited consensus across the two groups regarding the utility of the safety strategies currently employed by companies. However, both groups largely agreed that self-harm and suicide-related content should be restricted. Both groups also agreed that companies should have clear policies covering content that promotes self-harm or suicide, graphic depictions of self-harm or suicide, and games, pacts, and hoaxes. There was moderate agreement that companies should use artificial intelligence to send resources to users at risk. Just over half of the professionals and just under half of the young people agreed that social media companies should be regulated by government. There was strong support for governments to require schools to educate students on safe online communication, and for international collaboration to better coordinate efforts.

Discussion: The findings reflect the complexity of trying to minimise the risks of communicating online about self-harm or suicide while capitalising on the benefits. A clear message, however, was the need for better collaboration between policymakers and the social media industry, and between governments and their international counterparts.

Keywords