What was in the news and academic press about self-injury in September?
Suicide Prevention Month
September marks Suicide Prevention Month, an occasion to remember that while Non-Suicidal Self-Injury is, by definition, not engaged in to take one’s life, it remains a risk factor for suicide. According to Joiner’s interpersonal model of suicide, someone engages in suicidal behavior if they have both a desire for suicide and an acquired capability for suicide. Repetitive self-injury can lead to habituation, reducing sensitivity to pain and fear of death, and thereby increasing the capability for suicide. While non-suicidal, self-injury is a serious concern that should not be ignored [1].
Presentations to emergency departments for self-harm during the Covid-19 pandemic
September 2023 saw the publication of research on the impact of the Covid-19 pandemic on emergency department presentations for self-harm (self-injury and self-poisoning) and for suicidal ideation. Three studies, one analyzing data from Northern Ireland [2] and two from Canada [3, 4], revealed an initial decrease in presentations in the first months of the pandemic, followed by an overall return to pre-pandemic levels. Yet emergency department visits for self-harm among children and adolescents increased in both countries. In Canada, this increase was larger among females.
The increase in child and adolescent emergency department visits for self-harm might be explained by the absence of in-person schooling, social isolation, loss of routines and family stress during the pandemic. Nevertheless, such an increase was already under way before the pandemic, potentially due to the negative impact of social media, inappropriate content found online, and cyberbullying.
The UK passes the Online Safety Bill
In September, the Online Safety Bill passed all its parliamentary stages and will now become law in the United Kingdom. It aims to make the Internet safer, particularly for children and teenagers under 18, by preventing them from accessing illegal, harmful or age-inappropriate content such as pornography or the promotion of suicide and self-harm. Tech companies such as social media platforms and search engine providers will be held responsible for the content they host. They could face fines if they do not remove illegal content rapidly or fail to protect young users from harmful content [5, 6].
The new law comes after a year of intense debate about the responsibility of social media companies, following 14-year-old Molly Russell’s death by suicide in 2017 and a coroner’s ruling that she died from “an act of self-harm while suffering from depression and the negative effects of online content”. It is possibly one of the first times that social media platforms’ direct responsibility in a death has been officially recognized [7].
Yet the new law raises human rights concerns, with privacy and freedom of speech potentially under threat. It is also difficult for websites to verify users’ ages reliably. Rather than banning content for their safety, some young people recommend better education about online risks and more understanding adults to help them [6].
The use of Machine Learning in self-harm prediction and prevention
The past five years have seen a growing focus on Machine Learning in mental health research. Machine Learning is a branch of Artificial Intelligence (AI) in which models process vast amounts of data, recognize complex patterns and interactions within them, and gradually improve their accuracy. The data can come from various sources, such as medical records or social media [8, 9, 10].
In the future, its use in medical settings could help predict and identify high-risk patients, as traditional screening tools and questionnaires lack accuracy. People identified as being at high risk of engaging in suicidal behavior or persistent self-injury could then benefit from better-targeted prevention and treatment [8, 9, 10].
Yet, for now, Machine Learning remains experimental and is not sufficiently accurate. For example, in one study of suicide attempts, the models identified only a small proportion of high-risk individuals, with many wrongly classified as lower risk, which could deprive them of resources and preventive measures [9]. Despite the need for improvement, Machine Learning remains a promising tool.
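To make this accuracy problem concrete, here is a minimal, purely illustrative sketch in Python with scikit-learn. It uses synthetic data and a generic classifier, not the features, models or results of the cited studies, to show how sensitivity (the share of truly high-risk people a model catches) is measured and why it tends to be low when the outcome is rare.

```python
# Illustrative sketch only: synthetic data and a generic classifier,
# not the features, models, or results of the studies cited above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical screening features (e.g. questionnaire scores, prior visits);
# the outcome is a rare "high-risk" label, mimicking real class imbalance.
n = 5000
X = rng.normal(size=(n, 8))
risk_score = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=2.0, size=n)
y = (risk_score > np.quantile(risk_score, 0.95)).astype(int)  # ~5% positives

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
sensitivity = tp / (tp + fn)  # share of truly high-risk people the model catches
specificity = tn / (tn + fp)  # share of lower-risk people correctly not flagged

print(f"Sensitivity: {sensitivity:.2f}  Specificity: {specificity:.2f}")
print(f"High-risk individuals missed (false negatives): {fn}")
```

On rare outcomes like this, even a model with high overall accuracy will typically miss a large share of the high-risk group, which is exactly the misclassification problem described above.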
Discussion
September’s news and research advances invite us to reflect on the role of technology, the cost of freedom, and the need for human contact. The use of technology is heavily debated, with the impact of social media on mental health increasingly singled out and AI generating fear in some. Yet, as Machine Learning shows, technology can be used productively to improve lives and even prevent suicide. At the same time, it raises issues concerning the collection and use of data and the potential for misuse. Similar concerns stem from the Online Safety Bill, since the definition of what is deemed “harmful”, beyond what is illegal, is subjective.
These three stories also stress the importance of human contact and relationships. Social isolation had devastating effects during the Covid-19 pandemic. Furthermore, better education, understanding, and communication might keep young people safer than strictly banning harmful content online, particularly if the underlying issues that lead them to seek out and be receptive to such content are not addressed. Finally, while Machine Learning could become a wonderful tool in the future, it should remain complementary and never replace human assessment and presence, which are desperately needed at a time when many services are becoming automated and limited.
References
[1] Lewis, S. P., & Hasking, P. A. (2023). Self-Injury and Suicidal Thoughts and Behaviors. In Understanding Self-Injury: A Person-Centered Approach. Oxford University Press, Incorporated.
[2] Paterson, E. N., Kent, L., O’Reilly, D., O’Hagan, D., O’Neill, S. M., & Mcguire, A. (2023). Impact of the COVID-19 pandemic on self-harm and self-harm/suicide ideation: population-wide data linkage study and time series analysis. The British Journal of Psychiatry, 1-9. https://doi.org/10.1192/bjp.2023.76
[3] Mitchell, R. H. B., Toulany, A., Chung, H., Cohen, E., Fu, L., Strauss, R., Vigod, S. N., Stukel, T. A., Moran, K., Guttmann, A., Kurdyak, P., Artani, A., Kopec, M., & Saunders, N. R. (2023). Self-harm among youth during the first 28 months of the COVID-19 pandemic in Ontario, Canada: a population-based study. Canadian Medical Association Journal, 195(36), 1210-1220. https://doi.org/10.1503/cmaj.230127
[4] Poonai, N., Freedman, S. B., Newton, A. S., Sawyer, S., Gaucher, N., Ali, S., Wright, B., Miller, M. E., Meter, A., Fitzpatrick, E., Jabbour, M., Zemek, R., Eltorki, M., & Doan, Q. (2023). Emergency department visits and hospital admissions for suicidal ideation, self-poisoning and self-harm among adolescents in Canada during the COVID-19 pandemic. Canadian Medical Association Journal, 195(36), 1221-1230. https://doi.org/10.1503/cmaj.220507
[5] Department for Science, Innovation and Technology & Donelan, M. (2023, September 19). Britain makes internet safer, as Online Safety Bill finished and ready to become law. GOV.UK. Retrieved September 30, 2023, from https://www.gov.uk/government/news/britain-makes-internet-safer-as-online-safety-bill-finished-and-ready-to-become-law
[6] Phippen, A. (2023, September 26). Online safety bill: why making the UK the ‘safest place to go online’ is not as easy as the government claims. The Conversation. Retrieved September 30, 2023, from https://theconversation.com/online-safety-bill-why-making-the-uk-the-safest-place-to-go-online-is-not-as-easy-as-the-government-claims-214290
[7] Walker, A., & North London Coroner’s Service. (2022, October 13). Prevention of future deaths report. Retrieved October 2023, from https://www.judiciary.uk/wp-content/uploads/2022/10/Molly-Russell-Prevention-of-future-deaths-report-2022-0315_Published.pdf
[8] Mason, G., Auerbach, R. P., & Stewart, J. G. (2023). Predicting the Trajectory of Non-suicidal Self-injury Among Adolescents. https://doi.org/10.31234/osf.io/nz2ms
[9] Su, R., John, J. R., & Lin, P.-I. (2023). Machine learning-based prediction for self-harm and suicide attempts in adolescents. Psychiatry Research, 328. https://doi.org/10.1016/j.psychres.2023.115446
[10] Tiffin, P. A., Leelamanthep, S., Paton, L. W., & Perry, A. E. (2023). Predicting self-harm at one year in female prisoners: a retrospective cohort study using machine learning. https://doi.org/10.1101/2023.09.20.23295770