A user interface to communicate interpretable AI decisions to radiologists
Tools for computer-aided diagnosis based on deep learning have become increasingly important in medicine. Such tools can be useful, but they must communicate their decision-making process effectively in order to safely and meaningfully guide clinical decisions. Inherently interpretable models provide, for each decision, an explanation that matches their internal decision-making process. We present a user interface that incorporates the Interpretable AI Algorithm for Breast Lesions (IAIA-BL), a model that interpretably predicts both mass margin and malignancy for breast lesions. The interface displays the most relevant aspects of the model's explanation, including the predicted margin value, the model's confidence in that prediction, and the two most highly activated prototypes for each case. In addition, the interface presents full-field and cropped images of the region of interest, as well as a questionnaire suitable for a reader study. Our preliminary results indicate that the model increases readers' confidence and accuracy in their decisions on margin and malignancy.