Department of Medicine
Artificial Intelligence and Robotics | Databases and Information Systems | Health and Medical Administration | Health Information Technology | Health Services Administration | Information Literacy | Medical Education
BACKGROUND: Patient portals are becoming more common, and with them comes the ability for patients to access their personal electronic health records (EHRs). EHRs, in particular free-text EHR notes, often contain medical jargon and terms that are difficult for laypersons to understand. There are many Web-based resources for learning more about particular diseases or conditions, including systems that directly link medical concepts to lay definitions or educational materials.
OBJECTIVE: Our goal is to determine whether use of one such tool, NoteAid, improves EHR note comprehension. We measure comprehension with a new EHR note comprehension assessment tool rather than patient self-reported scores.
METHODS: In this work, we compare a passive, self-service educational resource (MedlinePlus) with an active resource (NoteAid), which provides users with definitions for the medical concepts that the system identifies. We use Amazon Mechanical Turk (AMT) to recruit individuals to complete ComprehENotes, a new test of EHR note comprehension.
RESULTS: Mean scores for individuals with access to NoteAid are significantly higher than the mean baseline scores, both for raw scores (P=.008) and estimated ability (P=.02).
CONCLUSIONS: In our experiments, we show that the active intervention leads to significantly higher scores on the comprehension test compared with a baseline group given no resources. In contrast, there is no significant difference between the group provided with the passive intervention and the baseline group. Finally, we analyze the demographics of the individuals who participated in our AMT task and show differences between groups that align with the current understanding of health literacy across populations. This is the first work to show improvements in comprehension using tools such as NoteAid as measured by an EHR note comprehension assessment tool rather than patient self-reported scores.
MedlinePlus, crowdsourcing, health literacy, information storage and retrieval, natural language processing, psychometrics
Rights and Permissions
© John P Lalor, Beverly Woolf, Hong Yu. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 11.01.2019. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.
DOI of Published Version
J Med Internet Res. 2019 Jan 16;21(1):e10793. doi: 10.2196/10793. Link to article on publisher's site
Journal of Medical Internet Research
Lalor, John P.; Woolf, Beverly; and Yu, Hong, "Improving Electronic Health Record Note Comprehension With NoteAid: Randomized Trial of Electronic Health Record Note Comprehension Interventions With Crowdsourced Workers" (2019). Open Access Articles. 3735.