AI-enabled suicide prediction tools: ethical considerations for medical leaders

Bibliographic Details
Published in: BMJ Leader, Vol. 5, No. 2, pp. 102–107
Main Authors: D’Hotman, Daniel; Loh, Erwin; Savulescu, Julian
Format: Journal Article
Language: English
Published: London: BMJ Publishing Group Ltd, 01.06.2021
Subjects
Summary: Medical suicide prediction tools: researchers and doctors can use AI techniques such as machine learning to identify patterns of information and behaviour that indicate suicide risk, leveraging data from EMRs, hospital records and potentially other government data sources. Facebook, for instance, has not released data on the effectiveness of its tool or its intervention methods, which range from ‘soft touch’ interventions, such as providing information on counselling services, to more intrusive interventions for high-risk cases, such as in the USA, where Facebook staff can call emergency services to an individual’s home if there is an immediate risk to life. A study published in Biomedical Informatics Insights by Coppersmith et al applied machine learning and natural language processing to social media data from a variety of sources (eg, Facebook, Twitter, Instagram, Reddit, Tumblr, Strava and Fitbit, among others) to determine the risk of attempted suicide.6 By linking these data to medical records, with the permission of test subjects (the records were used to establish whether users had actually attempted suicide, rather than to identify risk), Coppersmith et al demonstrated that their model was up to 10 times more accurate at correctly identifying those ‘at risk’ of attempting suicide than clinician averages (40%–60% for the model vs 4%–6% for clinicians). [...] both medical and social suicide prediction tools must be reliable and safe, in line with community and ethical expectations.
Bibliography: SourceType: Scholarly Journals; ObjectType: Commentary
ISSN: 2398-631X
DOI: 10.1136/leader-2020-000275