
Dissertation/Thesis Abstract

Information Retrieval in Clinical Chart Reviews
by Ye, Cheng, Ph.D., Vanderbilt University, 2019, 167; 13812003
Abstract (Summary)

Medical researchers rely on chart reviews, in which a user manually goes through a large number of electronic medical records (EMRs) to search for evidence to answer a specific medical question. Unfortunately, scrolling through vast amounts of clinical text to produce labels is time-consuming and expensive. For example, at Vanderbilt University Medical Center, a service in which a nurse reviews patient charts and produces labels currently costs $109 per hour. Therefore, methods are needed to i) reduce the cost of chart reviews and ii) help medical researchers identify relevant text within medical notes more efficiently.

First, to reduce the cost of chart reviews, we developed the VBOSSA crowdsourcing platform, which protects patient privacy and maintains a professional clinical crowd of medical students, nursing students, and faculty from Vanderbilt University Medical Center. With the support of VBOSSA, medical researchers have saved over 700 hours of manual chart review while obtaining relatively accurate results (average accuracy of 86%) at an average cost of about $20 per hour.

Second, to boost the efficiency of crowd workers in retrieving information from unstructured medical notes, we developed a Google-style EMR search engine that provides high-quality query recommendations and automatically refines queries while the user searches and reviews documents. Underpinning the EMR search engine are three novel approaches:

(1) Extract clinically similar terms from multiple EMR-based word embeddings;

(2) Represent the medical contexts of clinical terms in a usage vector space and then leverage the usage vector space to better learn the users’ preferred similar terms;

(3) Propose two novel ranking metrics, negative guarantee ratio (NGR) and critical document, based on an analysis of user experience in chart reviews.
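To illustrate the general idea behind approach (1), the following is a minimal, hypothetical sketch of retrieving clinically similar terms by cosine similarity over an EMR-derived word-embedding table. The terms and vectors are toy values for illustration only; the dissertation's actual method combines multiple EMR-based embeddings and learns user-preferred terms from usage contexts, which this sketch does not attempt to reproduce.

import numpy as np

# Toy embedding table: term -> dense vector.
# In practice these vectors would be learned from clinical notes.
embeddings = {
    "myocardial_infarction": np.array([0.90, 0.10, 0.30]),
    "heart_attack":          np.array([0.85, 0.15, 0.25]),
    "stroke":                np.array([0.20, 0.80, 0.40]),
    "aspirin":               np.array([0.10, 0.20, 0.90]),
}

def similar_terms(query: str, k: int = 3):
    """Return the k terms most similar to `query` by cosine similarity."""
    q = embeddings[query]
    scores = []
    for term, vec in embeddings.items():
        if term == query:
            continue
        cos = float(np.dot(q, vec) / (np.linalg.norm(q) * np.linalg.norm(vec)))
        scores.append((term, cos))
    return sorted(scores, key=lambda pair: pair[1], reverse=True)[:k]

print(similar_terms("myocardial_infarction", k=2))
# e.g. [('heart_attack', 0.99...), ('stroke', 0.59...)]

Candidates ranked this way could then feed a query-recommendation step, with the user's accepted or rejected suggestions used to refine subsequent queries.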

The EMR search engine was systematically evaluated and achieved high performance across information retrieval tasks, user studies, timing studies, and query recommendation tasks. We also evaluated different ranking and learning-to-rank methods using the NGR and critical document metrics, and we discuss future directions for developing high-quality ranking methods to support chart reviews.

Indexing (document details)
Advisor: Fabbri, Daniel
Committee: Malin, Bradley, Vorobeychik, Yevgeniy, Kunda, Maithilee, Chen, You
School: Vanderbilt University
Department: Computer Science
School Location: United States -- Tennessee
Source: DAI-B 82/9(E), Dissertation Abstracts International
Source Type: DISSERTATION
Subjects: Computer science, Information Technology, Bioinformatics
Keywords: Electronic Medical Records, Medical usage contexts, Query expansion, Search engines, Vector space model, Clinical chart
Publication Number: 13812003
ISBN: 9798582576440