Combining deep learning methods and human knowledge to identify abnormalities in computed tomography (CT) reports
Many researchers in machine learning have addressed the problem of detecting anomalies within computed tomography (CT) scans. Training these algorithms requires a dataset of CT scans with identified anomalies (labels), usually in specific organs. This is a bottleneck, since it requires experts to review thousands of images to create the labels. We aim to reduce the human burden of labeling CT scans by developing a model that identifies anomalies within plain-text reports, which could then be used to create labels for models based on CT scans. This study uses more than 4,800 CT reports from the Duke Health System, in which we aim to identify organ-specific abnormalities. We propose an iterative active learning approach: we first build a machine learning model to classify CT reports by abnormalities in different organs, and then improve it by actively adding reports sequentially. At each iteration, clinical experts review the report that provides the model with the highest expected information gain; this is done in real time through a web interface. The new label is then used to improve the model's performance. We evaluated our method on abnormalities in the kidneys and lungs. Starting from a model trained on 99 reports, the model achieves an Area Under the Curve (AUC) of 0.93 on the test set after adding 130 actively labeled reports from an unlabeled pool of 4,000. This suggests that a labeled set of CT scans can be obtained with significantly reduced human effort by combining machine learning techniques with clinical experts' knowledge.
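The query-and-retrain loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses synthetic feature vectors in place of report text, a plain logistic-regression classifier, and uncertainty sampling (prediction closest to 0.5) as a common stand-in for the expected-information-gain criterion; the seed size (99) and query budget are taken from the abstract, while all function names and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, d=5):
    # Synthetic stand-in for report feature vectors (e.g., bag-of-words) with
    # binary abnormality labels; purely illustrative, not real CT-report data.
    X = rng.normal(size=(n, d))
    w_true = rng.normal(size=d)
    y = (X @ w_true + 0.5 * rng.normal(size=n) > 0).astype(float)
    return X, y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg(X, y, lr=0.1, steps=500):
    # Plain gradient-descent logistic regression (stand-in for the classifier).
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Seed set (analogous to the 99 initially labeled reports) and an unlabeled pool
# whose labels are revealed only when an "expert" is queried.
X_seed, y_seed = make_data(99)
X_pool, y_pool = make_data(400)

X_lab, y_lab = X_seed.copy(), y_seed.copy()
unlabeled = list(range(len(X_pool)))

for _ in range(30):  # query budget (the study adds 130 reports)
    w = fit_logreg(X_lab, y_lab)
    p = sigmoid(X_pool[unlabeled] @ w)
    # Uncertainty sampling: pick the report whose predicted probability is
    # closest to 0.5, a common proxy for highest expected information gain.
    i = unlabeled[int(np.argmin(np.abs(p - 0.5)))]
    X_lab = np.vstack([X_lab, X_pool[i]])
    y_lab = np.append(y_lab, y_pool[i])  # simulated expert label
    unlabeled.remove(i)

print(len(y_lab))  # 99 seed labels + 30 actively queried
```

In the study, the oracle step is a clinical expert reviewing the selected report through a web interface rather than a lookup into held-back labels.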
Benitez, M; Tian, J; Kelly, M; Selvakumaran, V; Phelan, M; Mazurowski, M; Lo, JY; Rubin, GD; Henao, R