A new study from Vanderbilt University Medical Center (VUMC) highlights the potential of artificial intelligence (AI) to support suicide prevention within standard medical practice.1 Published in JAMA Network Open, the research shows that AI-driven clinical alerts can help physicians identify patients at high risk for suicide and prompt them to conduct risk assessments during routine clinic visits.
The team conducted a study testing the Vanderbilt Suicide Attempt and Ideation Likelihood (VSAIL) model, an AI system designed to analyze electronic health records for estimating a patient's 30-day suicide risk. This research took place across 3 neurology clinics at VUMC. The study compared 2 types of alerts: interruptive alerts, which involved immediate pop-ups that disrupted doctors' workflows, and passive alerts, where risk information was embedded in patient charts without interruptions.
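The article does not detail VSAIL's internals, but the trial design it describes (a model scores each visit from routine record data, and flagged visits are routed to either an interruptive or a passive alert) can be summarized in a brief sketch. This is a minimal illustration, not the actual system; the threshold, field names, and routing logic below are all hypothetical.

```python
# Illustrative sketch only: VSAIL's actual features, model, and cutoff are not
# described in this article. All names and values here are hypothetical.
from dataclasses import dataclass

@dataclass
class Visit:
    patient_id: str
    risk_score: float  # hypothetical model output: estimated 30-day suicide risk

RISK_CUTOFF = 0.92  # hypothetical threshold for flagging a visit as high risk

def route_alert(visit: Visit, study_arm: str) -> str:
    """Mimic the trial's two alert styles for a flagged visit."""
    if visit.risk_score < RISK_CUTOFF:
        return "no alert"
    if study_arm == "interruptive":
        # Pop-up that interrupts the workflow and asks the clinician to screen
        return "interruptive pop-up: prompt a suicide risk assessment now"
    # Passive arm: risk information sits in the chart without interrupting
    return "passive note: risk flag embedded in the patient chart"

print(route_alert(Visit("pt-001", 0.95), "interruptive"))
print(route_alert(Visit("pt-002", 0.95), "passive"))
```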
The results showed that interruptive alerts were significantly more effective. Doctors conducted suicide risk assessments for 42% of patients flagged by interruptive alerts, compared with only 4% with passive alerts. Many individuals who die by suicide have consulted a health care provider in the year leading up to their death, frequently for reasons not related to mental health, according to the lead investigator, Colin Walsh, MD, MA, associate professor of Biomedical Informatics, Medicine and Psychiatry.2
"Universal screening isn’t practical in every setting,” Walsh said in a statement. “We developed VSAIL to identify high-risk patients and prompt focused screening conversations."
The study pointed to rising US suicide rates over the past generation, with an estimated 14.2 deaths per 100,000 people annually, making suicide the nation's 11th leading cause of death.1 Research shows that 77% of individuals who die by suicide had contact with primary care providers, which supports the need for targeted screening.
The VSAIL system was designed to address this gap by flagging patients most in need of assessment using routine health record data. During earlier testing without active alerts, the system effectively identified high-risk patients, with 1 in 23 flagged individuals later reporting suicidal thoughts.
The recent study involved 7732 patient visits over 6 months, generating 596 screening alerts for high-risk individuals. Neurology clinics were chosen for the trial due to the association between certain neurological conditions and increased suicide risk.
While the interruptive alerts prompted significantly more screenings, no patients in either alert group experienced suicidal ideation or attempts during the 30-day follow-up. This suggests the system can promote preventive screening without immediate adverse events, though the absence of events in both groups also limits conclusions about downstream benefit and underscores the need for further evaluation.
The researchers emphasized the importance of balancing the benefits of interruptive alerts with their potential drawbacks, including "alert fatigue," a phenomenon where frequent notifications overwhelm clinicians.
"The automated system flagged only about 8% of all patient visits for screening," Walsh stated.2 "This selective approach makes it more feasible for busy clinics to implement suicide prevention efforts."
The researchers suggested that similar systems could be tested in other medical settings to extend their reach. With suicide prevention remaining a critical public health priority, AI systems like VSAIL offer a potential avenue for improving care and saving lives.
"Health care systems need to balance the effectiveness of interruptive alerts against their potential downsides," Walsh stated. "But these results suggest that automated risk detection combined with well-designed alerts could help us identify more patients who need suicide prevention services."
References
1. Walsh CG, Ripperger MA, Novak L, et al. Risk model–guided clinical decision support for suicide screening: a randomized clinical trial. JAMA Netw Open. 2025;8(1):e2452371. doi:10.1001/jamanetworkopen.2024.52371
2. AI system helps doctors identify patients at risk for suicide. Vanderbilt University Medical Center. News release. January 3, 2025. Accessed January 6, 2025. https://www.eurekalert.org/news-releases/1069306