UMMS Affiliation

Meyers Primary Care Institute; Department of Medicine

Publication Date

2021-05-13

Document Type

Article

Disciplines

Computer Sciences | Health Information Technology | Health Services Administration | Health Services Research | Information Literacy

Abstract

BACKGROUND: Interventions to define medical jargon have been shown to improve electronic health record (EHR) note comprehension among crowdsourced participants on Amazon Mechanical Turk (AMT). However, AMT participants may not be representative of the general population or of the patients who are most at risk for low health literacy.

OBJECTIVE: In this work, we assessed the efficacy of an intervention (NoteAid) for EHR note comprehension among participants in a community hospital setting.

METHODS: Participants were recruited from Lowell General Hospital (LGH), a community hospital in Massachusetts, to take the ComprehENotes test, a web-based test of EHR note comprehension. Participants were randomly assigned to control (n=85) or intervention (n=89) groups to take the test without or with NoteAid, respectively. For comparison, we used a sample of 200 participants recruited from AMT to take the ComprehENotes test (100 in the control group and 100 in the intervention group).

RESULTS: A total of 174 participants were recruited from LGH, and 200 participants were recruited from AMT. Participants in both intervention groups (community hospital and AMT) scored significantly higher than participants in the control groups (P < .001). The average score for the community hospital participants was significantly lower than the average score for the AMT participants (P < .001), consistent with the lower education levels in the community hospital sample. Education level had a significant effect on scores for the community hospital participants (P < .001).

CONCLUSIONS: Use of NoteAid was associated with significantly improved EHR note comprehension in both community hospital and AMT samples. Our results demonstrate the generalizability of ComprehENotes as a test of EHR note comprehension and the effectiveness of NoteAid for improving EHR note comprehension.

Keywords

comprehension, crowdsourcing, efficacy, electronic health record, health literacy, information storage and retrieval, intervention, literacy, natural language processing, psychometrics

Rights and Permissions

Copyright ©John P Lalor, Wen Hu, Matthew Tran, Hao Wu, Kathleen M Mazor, Hong Yu. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 13.05.2021. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.

DOI of Published Version

10.2196/26354

Source

Lalor JP, Hu W, Tran M, Wu H, Mazor KM, Yu H. Evaluating the Effectiveness of NoteAid in a Community Hospital Setting: Randomized Trial of Electronic Health Record Note Comprehension Interventions With Patients. J Med Internet Res. 2021 May 13;23(5):e26354. doi: 10.2196/26354. PMID: 33983124; PMCID: PMC8160802.

Journal/Book/Conference Title

Journal of Medical Internet Research

PubMed ID

33983124

Creative Commons License

Creative Commons Attribution 4.0 License
This work is licensed under a Creative Commons Attribution 4.0 License.
