Bridging technical and clinical needs: A case study on developing a suicide risk prediction algorithm using EHR-data in real-world mental health care

Research output: Working paper › Scientific › Peer-reviewed

Abstract

Background: Artificial intelligence (AI) offers potential solutions to the challenges facing a strained mental healthcare system, such as increasing demand for care, staff shortages, and pressured accessibility. While developing AI-based tools for clinical practice is technically feasible and has the potential to produce real-world impact, only a few are actually implemented in clinical practice. Implementation starts at the algorithm development phase, as this phase bridges theoretical innovation and practical application. The design of the AI tool and the way it is developed may either facilitate or hinder later implementation and use.

Objective: To examine the development process of a suicide risk prediction algorithm in real-world mental health care through an exploratory, qualitative case study, identifying development challenges, the team's responses to them, and key considerations for future implementations.

Methods: An exploratory, multimethod case study was conducted, employing a hybrid approach with both inductive and deductive analysis. Data were collected through desk research, reflective team meetings, and iterative feedback sessions with the development team. Thematic analysis was used to identify development challenges and the team’s responses, as well as to derive key considerations for future implementations.

Results: Key challenges included defining, operationalizing, and measuring suicide incidents within EHRs due to issues such as missing data, underreporting, and differences between data sources. Predictive factors were identified by consulting clinical experts; however, psychosocial variables had to be constructed, as they could not be extracted directly from EHR data. A risk of bias arose when traditional suicide prevention questionnaires, which were unequally distributed across patients, were used as input. Analyzing unstructured data with Natural Language Processing (NLP) was challenging due to data noise but ultimately enabled successful sentiment analysis, which provided dynamic, clinically relevant information for the algorithm. A complex model (XGBoost) enhanced predictive accuracy but posed challenges regarding understandability, which was highly valued by clinicians.

Conclusions: To advance mental healthcare as a data-driven field, several critical considerations must be addressed: ensuring robust data governance and quality, fostering cultural shifts in data documentation practices, establishing mechanisms for continuous monitoring of AI tool usage, mitigating risks of bias, balancing predictive performance with explainability, and maintaining a clinician "in-the-loop" approach. Future research should prioritize sociotechnical aspects related to the implementation and daily use of AI in mental healthcare practice. The use of the algorithm is still exploratory, and it has not yet been deployed in practice. Prerequisites for actual use will be studied in a later pilot before the algorithm is deployed.
Original language: English
Publisher: OSF Preprints
DOIs
Publication status: Published - 2025

Publication series

Name: JMIR Medical Informatics
Publisher: JMIR Publications, Inc.
ISSN (Print): 2291-9694

Keywords

  • Implementation science
  • artificial intelligence
  • prediction algorithms
  • electronic health records
  • mental health services
  • suicide prevention

