A case-based interpretable deep learning model for classification of mass lesions in digital mammography

Publication, Journal Article
Barnett, AJ; Schwartz, FR; Tao, C; Chen, C; Ren, Y; Lo, JY; Rudin, C
Published in: Nature Machine Intelligence
December 1, 2021

Interpretability in machine learning models is important in high-stakes decisions such as whether to order a biopsy based on a mammographic exam. Mammography poses important challenges that are not present in other computer vision tasks: datasets are small, confounding information is present and it can be difficult even for a radiologist to decide between watchful waiting and biopsy based on a mammogram alone. In this work we present a framework for interpretable machine learning-based mammography. In addition to predicting whether a lesion is malignant or benign, our work aims to follow the reasoning processes of radiologists in detecting clinically relevant semantic features of each image, such as the characteristics of the mass margins. The framework includes a novel interpretable neural network algorithm that uses case-based reasoning for mammography. Our algorithm can incorporate a combination of data with whole image labelling and data with pixel-wise annotations, leading to better accuracy and interpretability even with a small number of images. Our interpretable models are able to highlight the classification-relevant parts of the image, whereas other methods highlight healthy tissue and confounding information. Our models are decision aids—rather than decision makers—and aim for better overall human–machine collaboration. We do not observe a loss in mass margin classification accuracy over a black box neural network trained on the same data.
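
The case-based reasoning described in the abstract compares parts of a new mammogram against learned prototypical parts of training cases and classifies from the resulting similarities. As a rough illustration only, the sketch below shows how such a prototype layer can be written in PyTorch; the class name, toy backbone, dimensions and similarity function are assumptions for illustration, not the authors' published implementation.

```python
# Illustrative sketch (not the authors' code) of prototype-based, case-based
# reasoning: convolutional feature patches of an input image are compared
# against learned prototype vectors, each meant to represent a prototypical
# patch from a training case (e.g., a characteristic mass-margin appearance).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PrototypeClassifier(nn.Module):
    def __init__(self, backbone, backbone_channels=512,
                 n_prototypes=15, proto_dim=128, n_classes=2):
        super().__init__()
        self.backbone = backbone  # any CNN producing a (B, C, H, W) feature map
        self.add_on = nn.Conv2d(backbone_channels, proto_dim, kernel_size=1)
        # Each prototype is a proto_dim vector, stored as a 1x1 "patch".
        self.prototypes = nn.Parameter(torch.rand(n_prototypes, proto_dim, 1, 1))
        self.classifier = nn.Linear(n_prototypes, n_classes, bias=False)

    def forward(self, x):
        z = torch.sigmoid(self.add_on(self.backbone(x)))   # (B, D, H, W)
        # Squared L2 distance between every spatial patch and every prototype,
        # expanded as ||z||^2 - 2 z.p + ||p||^2 via a 1x1 convolution.
        dists = (
            (z ** 2).sum(1, keepdim=True)
            - 2 * F.conv2d(z, self.prototypes)
            + (self.prototypes ** 2).sum((1, 2, 3)).view(1, -1, 1, 1)
        )                                                   # (B, P, H, W)
        # Smallest distance per prototype: "where does this case look most
        # like that prototype?" Smaller distance -> larger similarity score.
        min_dists = dists.flatten(2).min(dim=2).values      # (B, P)
        similarities = torch.log((min_dists + 1) / (min_dists + 1e-4))
        return self.classifier(similarities), min_dists

# Usage sketch with a toy single-layer backbone (illustrative only):
backbone = nn.Sequential(nn.Conv2d(1, 512, kernel_size=7, stride=4), nn.ReLU())
model = PrototypeClassifier(backbone)
logits, min_dists = model(torch.randn(2, 1, 224, 224))      # (2, 2), (2, 15)
```

Because the class scores are a linear function of per-prototype similarities, the model can point to the image patch that activated each prototype, which is what lets this family of models highlight classification-relevant regions rather than explain a black box after the fact.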

Published In

Nature Machine Intelligence

DOI

10.1038/s42256-021-00423-x

EISSN

2522-5839

Publication Date

December 1, 2021

Volume

3

Issue

12

Start / End Page

1061 / 1070

Related Subject Headings

  • 46 Information and computing sciences
  • 40 Engineering
 

Citation

APA
Barnett, A. J., Schwartz, F. R., Tao, C., Chen, C., Ren, Y., Lo, J. Y., & Rudin, C. (2021). A case-based interpretable deep learning model for classification of mass lesions in digital mammography. Nature Machine Intelligence, 3(12), 1061–1070. https://doi.org/10.1038/s42256-021-00423-x

Chicago
Barnett, A. J., F. R. Schwartz, C. Tao, C. Chen, Y. Ren, J. Y. Lo, and C. Rudin. “A case-based interpretable deep learning model for classification of mass lesions in digital mammography.” Nature Machine Intelligence 3, no. 12 (December 1, 2021): 1061–70. https://doi.org/10.1038/s42256-021-00423-x.

ICMJE
Barnett AJ, Schwartz FR, Tao C, Chen C, Ren Y, Lo JY, et al. A case-based interpretable deep learning model for classification of mass lesions in digital mammography. Nature Machine Intelligence. 2021 Dec 1;3(12):1061–70.

MLA
Barnett, A. J., et al. “A case-based interpretable deep learning model for classification of mass lesions in digital mammography.” Nature Machine Intelligence, vol. 3, no. 12, Dec. 2021, pp. 1061–70. Scopus, doi:10.1038/s42256-021-00423-x.

NLM
Barnett AJ, Schwartz FR, Tao C, Chen C, Ren Y, Lo JY, Rudin C. A case-based interpretable deep learning model for classification of mass lesions in digital mammography. Nature Machine Intelligence. 2021 Dec 1;3(12):1061–1070.
