
Large Language Models Could Change the Future of Behavioral Healthcare: a Proposal for Responsible Development and Evaluation

Overview
Publisher: Springer Nature
Specialty: Psychiatry
Date: 2024 Apr 12
PMID: 38609507
Abstract

Large language models (LLMs) such as OpenAI's GPT-4 (which powers ChatGPT) and Google's Gemini, built on artificial intelligence, hold immense potential to support, augment, or even eventually automate psychotherapy. Enthusiasm about such applications is mounting in the field as well as in industry. These developments promise to address insufficient mental healthcare system capacity and scale individual access to personalized treatments. However, clinical psychology is an uncommonly high-stakes application domain for AI systems, as responsible and evidence-based therapy requires nuanced expertise. This paper provides a roadmap for the ambitious yet responsible application of clinical LLMs in psychotherapy. First, a technical overview of clinical LLMs is presented. Second, the stages of integration of LLMs into psychotherapy are discussed while highlighting parallels to the development of autonomous vehicle technology. Third, potential applications of LLMs in clinical care, training, and research are discussed, highlighting areas of risk given the complex nature of psychotherapy. Fourth, recommendations for the responsible development and evaluation of clinical LLMs are provided, which include centering clinical science, involving robust interdisciplinary collaboration, and attending to issues like assessment, risk detection, transparency, and bias. Lastly, a vision is outlined for how LLMs might enable a new generation of studies of evidence-based interventions at scale, and how these studies may challenge assumptions about psychotherapy.

Citing Articles

Assessing and alleviating state anxiety in large language models.

Ben-Zion Z, Witte K, Jagadish A, Duek O, Harpaz-Rotem I, Khorsandian M. NPJ Digit Med. 2025; 8(1):132.

PMID: 40033130 PMC: 11876565. DOI: 10.1038/s41746-025-01512-6.


Automated Digital Safety Planning Interventions for Young Adults: Qualitative Study Using Online Co-design Methods.

Meyerhoff J, Popowski S, Lakhtakia T, Tack E, Kornfield R, Kruzan K. JMIR Form Res. 2025; 9:e69602.

PMID: 40009840 PMC: 11904377. DOI: 10.2196/69602.


Probing Public Perceptions of Antidepressants on Social Media: Mixed Methods Study.

Zhu J, Zhang X, Jin R, Jiang H, Kenne D. JMIR Form Res. 2025; 9:e62680.

PMID: 40009806 PMC: 11882120. DOI: 10.2196/62680.


Exploring the Ethical Challenges of Conversational AI in Mental Health Care: Scoping Review.

Rahsepar Meadi M, Sillekens T, Metselaar S, van Balkom A, Bernstein J, Batelaan N. JMIR Ment Health. 2025; 12:e60432.

PMID: 39983102 PMC: 11890142. DOI: 10.2196/60432.


The externalization of internal experiences in psychotherapy through generative artificial intelligence: a theoretical, clinical, and ethical analysis.

Haber Y, Hadar Shoval D, Levkovich I, Yinon D, Gigi K, Pen O. Front Digit Health. 2025; 7:1512273.

PMID: 39968063 PMC: 11832678. DOI: 10.3389/fdgth.2025.1512273.

