Generative adversarial network-based image completion to identify abnormal locations in digital breast tomosynthesis images

Conference Paper

Deep learning has achieved great success in image analysis and decision making in radiology. However, a large amount of annotated imaging data is needed to construct well-performing deep learning models. A particular challenge in the context of breast cancer is the number of available cases that contain cancer, given the very low prevalence of the disease in the screening population. The question arises whether normal cases, which in the context of breast cancer screening are available in abundance, can be used to train a deep learning model that identifies abnormal locations. In this study, we propose to achieve this goal through generative adversarial network (GAN)-based image completion. Our hypothesis is that if a generative network has difficulty correctly completing a part of an image at a certain location, then that location is likely to represent an abnormality. We test this hypothesis using a dataset of 4348 patients with digital breast tomosynthesis (DBT) imaging from our institution. We trained our model on normal-only images to fill in parts of images that were artificially removed. Then, using an independent test set, we measured how difficult it was for the network to reconstruct an artificially removed patch at different locations in the images. The difficulty was measured by the mean squared error (MSE) between the original removed patch and the reconstructed patch. On average, the MSE was 2.11 times higher (standard deviation 1.01) at locations containing expert-annotated cancerous lesions than at locations outside those abnormalities. Our generative approach demonstrates great potential to aid breast cancer detection.
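The scoring step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `reconstruct` is a hypothetical stand-in for the trained GAN completion network, and the patch size is illustrative.

```python
import numpy as np


def patch_mse(image, reconstruct, y, x, size=64):
    """Score one location by completion difficulty.

    Mask out a square patch, ask the completion model to fill it in,
    and return the mean squared error between the original patch and
    the reconstruction. `reconstruct` is a hypothetical callable that
    maps a masked image to a completed image (stand-in for the GAN).
    """
    patch = image[y:y + size, x:x + size].astype(np.float64)
    masked = image.copy()
    masked[y:y + size, x:x + size] = 0       # artificially remove the patch
    completed = reconstruct(masked)          # model fills in the hole
    filled = completed[y:y + size, x:x + size].astype(np.float64)
    return float(np.mean((patch - filled) ** 2))
```

Under the paper's hypothesis, sliding this score over an image should yield markedly higher values (on average, about 2x in the reported results) at lesion locations than over normal tissue, since the network was trained only on normal appearance.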

Cited Authors

  • Swiecicki, A; Buda, M; Saha, A; Li, N; Ghate, SV; Walsh, R; Mazurowski, MA

Published Date

  • January 1, 2020

Volume / Issue

  • 11314 /

International Standard Serial Number (ISSN)

  • 1605-7422

International Standard Book Number 13 (ISBN-13)

  • 9781510633957

Digital Object Identifier (DOI)

  • 10.1117/12.2551379

Citation Source

  • Scopus