
Concordance Between Electronic Clinical Documentation and Physicians' Observed Behavior

Overview
Journal: JAMA Netw Open
Specialty: General Medicine
Date: 2019 Sep 19
PMID: 31532513
Citations: 14
Abstract

Importance: Following the adoption of electronic health records into a regulatory environment designed for paper records, there has been little investigation into the accuracy of physician documentation.

Objective: To quantify the percentage of emergency physician documentation of the review of systems (ROS) and physical examination (PE) that observers can confirm.

Design, Setting, And Participants: This case series took place at emergency departments in 2 academic medical centers between 2016 and 2018. Participants' patient encounters were observed to compare real-time performance with clinical documentation.

Exposures: Resident physicians were shadowed by trained observers for 20 encounters (10 encounters per physician per site) to obtain real-time observational data; associated electronic health record data were subsequently reviewed.

Main Outcomes And Measures: Number of confirmed ROS systems (range, 0-14) divided by the number of documented ROS systems (range, 0-14), and number of confirmed PE systems (range, 0-14) divided by the number of documented PE systems (range, 0-14).

Results: The final study cohort included 9 licensed emergency medicine residents who evaluated a total of 180 patients (mean [SD] age, 48.7 [20.0] years; 91 [50.5%] women). For ROS, physicians documented a median (interquartile range [IQR]) of 14 (8-14) systems, while audio recordings confirmed a median (IQR) of 5 (3-6) systems. Overall, 755 of 1961 documented ROS systems (38.5%) were confirmed by audio recording data. For PE, resident physicians documented a median (IQR) of 8 (7-9) verifiable systems, while observers confirmed a median (IQR) of 5.5 (3-6) systems. Overall, 760 of 1429 verifiable documented PE systems (53.2%) were confirmed by concurrent observation. Interrater reliability for rating of ROS and PE was more than 90% for all measures.
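The concordance measure is a simple ratio of confirmed to documented systems. As a quick arithmetic check (a minimal sketch using only the aggregate counts reported in the Results; the function name is illustrative, not from the study), the reported percentages can be reproduced:

```python
# Aggregate concordance: confirmed systems divided by documented systems,
# expressed as a percentage, using the totals from the Results section.

def concordance(confirmed: int, documented: int) -> float:
    """Return the percentage of documented systems that were confirmed."""
    return 100.0 * confirmed / documented

ros = concordance(755, 1961)  # review of systems
pe = concordance(760, 1429)   # physical examination

print(f"ROS concordance: {ros:.1f}%")  # 38.5%
print(f"PE concordance:  {pe:.1f}%")   # 53.2%
```

Both values match the percentages stated in the abstract.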

Conclusions And Relevance: In this study of 9 licensed emergency medicine residents, there were inconsistencies between the documentation of ROS and PE findings in the electronic health record and observational reports. These findings raise the possibility that some documentation may not accurately represent physician actions. Further studies should be undertaken to determine whether this occurrence is widespread. However, because such studies are unlikely to be performed owing to institution-level barriers that exist nationwide, payers should consider removing financial incentives to generate lengthy documentation.

Citing Articles

Shortcomings of ethnicity-based carrier screening for conditions associated with Ashkenazi Jewish ancestry.

Llorin H, Tennen R, Laskey S, Zhan J, Detweiler S, Abul-Husn N. Genet Med Open. 2024; 2:101869.

PMID: 39669632 PMC: 11613755. DOI: 10.1016/j.gimo.2024.101869.


The impact of automatic history-taking software on data quality in the cardiology outpatient clinic: Retrospective observational study.

Erden I, Sen A, Erden I. Digit Health. 2024; 10:20552076241260155.

PMID: 38832101 PMC: 11146001. DOI: 10.1177/20552076241260155.


Dialysis decision-making process by Chinese American patients at an urban, academic medical center: a retrospective chart review.

Lebovitz A, Schwab S, Richardson M, Meyer K, Sweigart B, Vesel T. BMC Palliat Care. 2024; 23(1):25.

PMID: 38273297 PMC: 10809624. DOI: 10.1186/s12904-024-01357-y.


A novel method for evaluating physician communication: A pilot study testing the feasibility of parent-assisted audio recordings via Zoom.

Staras S, Bylund C, Desai S, Harle C, Richardson E, Khalil G. PEC Innov. 2022; 1.

PMID: 36212508 PMC: 9534382. DOI: 10.1016/j.pecinn.2022.100020.


Patient-centered quality measurement for opioid use disorder: Development of a taxonomy to address gaps in research and practice.

Kelley A, Incze M, Baylis J, Calder S, Weiner S, Zickmund S. Subst Abus. 2022; 43(1):1286-1299.

PMID: 35849749 PMC: 9703846. DOI: 10.1080/08897077.2022.2095082.

