
Artificial Intelligence in Healthcare: a Scoping Review of Perceived Threats to Patient Rights and Safety

Overview
Publisher Biomed Central
Date 2024 Oct 24
PMID 39444019
Abstract

Background: The global health system remains determined to leverage every workable opportunity, including artificial intelligence (AI), to provide care that is consistent with patients' needs. Unfortunately, while AI models generally return high accuracy within the trials in which they are trained, their ability to predict and recommend the best course of care for prospective patients is left to chance.

Purpose: This review maps evidence published between January 1, 2010 and December 31, 2023 on the perceived threats that the use of AI tools in healthcare poses to patients' rights and safety.

Methods: We followed the guidelines of Tricco et al. to conduct a comprehensive search of the current literature in Nature, PubMed, Scopus, ScienceDirect, Dimensions AI, Web of Science, EBSCOhost, ProQuest, JSTOR, Semantic Scholar, Taylor & Francis, Emerald, the World Health Organization, and Google Scholar. In all, 80 peer-reviewed articles qualified and were included in this study; the screening step is illustrated below.
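As a minimal sketch of the kind of screening step described above, the following Python snippet deduplicates records exported from several databases and keeps only those published within the review window. The Record fields, the screen function, and the example hits are hypothetical illustrations, not the authors' actual workflow or data.

from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Record:
    """One search hit exported from a bibliographic database (hypothetical fields)."""
    title: str
    doi: str          # empty string when the source did not supply a DOI
    published: date
    source: str       # e.g. "PubMed", "Scopus"

def screen(records: list[Record],
           start: date = date(2010, 1, 1),
           end: date = date(2023, 12, 31)) -> list[Record]:
    """Drop duplicates found across databases and keep records inside the review window."""
    seen: set[str] = set()
    kept: list[Record] = []
    for rec in records:
        # Prefer the DOI as the deduplication key; fall back to a normalised title.
        key = rec.doi.lower() or rec.title.strip().lower()
        if key in seen:
            continue
        seen.add(key)
        if start <= rec.published <= end:
            kept.append(rec)
    return kept

# Example: two databases returning the same article, plus one hit outside the window.
hits = [
    Record("AI triage accuracy", "10.1000/xyz1", date(2021, 5, 3), "PubMed"),
    Record("AI triage accuracy", "10.1000/xyz1", date(2021, 5, 3), "Scopus"),
    Record("Early clinical decision support", "", date(2008, 2, 1), "Web of Science"),
]
print(len(screen(hits)))  # -> 1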

Results: We report that the use of AI technologies in healthcare carries a real chance of unpredictable errors and currently operates under inadequate policy and regulatory regimes. Moreover, medical paternalism, increased healthcare costs and disparities in insurance coverage, data security and privacy concerns, and biased and discriminatory services are imminent in the use of AI tools in healthcare.

Conclusions: Our findings have critical implications for achieving the Sustainable Development Goals (SDGs) 3.8, 11.7, and 16. We recommend that national governments lead the roll-out of AI tools in their healthcare systems and that other key actors in the healthcare industry contribute to developing policies on the use of AI in healthcare systems.
