Explainability Metrics of Deep Convolutional Networks for Photoplethysmography Quality Assessment.

Publication, Journal Article
Zhang, O; Ding, C; Pereira, T; Xiao, R; Gadhoumi, K; Meisel, K; Lee, RJ; Chen, Y; Hu, X
Published in: IEEE access : practical innovations, open solutions
January 2021

Photoplethysmography (PPG) is a noninvasive way to monitor various aspects of the circulatory system, and it is becoming increasingly widespread in biomedical signal processing. Recently, deep learning methods for analyzing PPG have also become prevalent, achieving state-of-the-art results on heart rate estimation, atrial fibrillation detection, and motion artifact identification. Consequently, a need for interpretable deep learning has arisen within the field of biomedical signal processing. In this paper, we pioneer novel explanatory metrics that leverage domain-expert knowledge to validate a deep learning model. We visualize model attention over a whole test set using saliency methods and compare it to human expert annotations. Congruence, our first metric, measures the proportion of model attention that falls within expert-annotated regions. Our second metric, Annotation Classification, measures how much of the expert annotations our deep learning model pays attention to. Finally, we apply our metrics to compare a signal-based model and an image-based model for PPG signal quality classification. Both models are deep convolutional networks based on the ResNet architecture. We show that our signal-based, one-dimensional model acts in a more explainable manner than our image-based model: on average, 50.78% of the one-dimensional model's attention is within expert annotations, whereas 36.03% of the two-dimensional model's attention is within expert annotations. Similarly, when thresholding the one-dimensional model's attention, one can more accurately predict whether each pixel of the PPG is annotated as artifactual by an expert. Through this test case, we demonstrate how our metrics can provide a quantitative, dataset-wide analysis of how explainable the model is.
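
To make the two metrics concrete, below is a minimal sketch of how Congruence and Annotation Classification could be computed for a single PPG segment, assuming a one-dimensional saliency map over the signal samples and a binary expert-annotation mask of the same length. The function names, the use of absolute saliency, and the choice of plain accuracy for Annotation Classification are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def congruence(saliency, expert_mask):
    """Proportion of total (absolute) model attention that falls inside
    expert-annotated regions of the signal."""
    attention = np.abs(np.asarray(saliency, dtype=float))
    total = attention.sum()
    if total == 0:
        return 0.0
    return float(attention[np.asarray(expert_mask, dtype=bool)].sum() / total)

def annotation_classification(saliency, expert_mask, threshold=0.5):
    """Threshold the (absolute) saliency into a per-sample 'artifact' prediction
    and score it against the expert annotation mask (plain accuracy here;
    the paper may use a different classification score)."""
    pred = np.abs(np.asarray(saliency, dtype=float)) >= threshold
    truth = np.asarray(expert_mask, dtype=bool)
    return float((pred == truth).mean())

# Toy example: a 1000-sample PPG segment whose artifactual region is annotated
# by an expert; the model's saliency concentrates roughly on the same region.
rng = np.random.default_rng(0)
saliency = rng.random(1000) * 0.1          # weak background attention
saliency[400:600] += 0.9                   # strong attention on the artifact
expert_mask = np.zeros(1000, dtype=int)
expert_mask[380:620] = 1                   # expert-annotated artifact

print(f"Congruence: {congruence(saliency, expert_mask):.2%}")
print(f"Annotation classification (accuracy @ 0.5): "
      f"{annotation_classification(saliency, expert_mask):.2%}")
```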

Published In

IEEE access : practical innovations, open solutions

DOI

10.1109/access.2021.3054613

EISSN

2169-3536

ISSN

2169-3536

Publication Date

January 2021

Volume

9

Start / End Page

29736 / 29745

Related Subject Headings

  • 46 Information and computing sciences
  • 40 Engineering
  • 10 Technology
  • 09 Engineering
  • 08 Information and Computing Sciences
 

Citation

APA: Zhang, O., Ding, C., Pereira, T., Xiao, R., Gadhoumi, K., Meisel, K., … Hu, X. (2021). Explainability Metrics of Deep Convolutional Networks for Photoplethysmography Quality Assessment. IEEE Access : Practical Innovations, Open Solutions, 9, 29736–29745. https://doi.org/10.1109/access.2021.3054613

Chicago: Zhang, Oliver, Cheng Ding, Tania Pereira, Ran Xiao, Kais Gadhoumi, Karl Meisel, Randall J. Lee, Yiran Chen, and Xiao Hu. “Explainability Metrics of Deep Convolutional Networks for Photoplethysmography Quality Assessment.” IEEE Access : Practical Innovations, Open Solutions 9 (January 2021): 29736–45. https://doi.org/10.1109/access.2021.3054613.

ICMJE: Zhang O, Ding C, Pereira T, Xiao R, Gadhoumi K, Meisel K, et al. Explainability Metrics of Deep Convolutional Networks for Photoplethysmography Quality Assessment. IEEE access : practical innovations, open solutions. 2021 Jan;9:29736–45.

MLA: Zhang, Oliver, et al. “Explainability Metrics of Deep Convolutional Networks for Photoplethysmography Quality Assessment.” IEEE Access : Practical Innovations, Open Solutions, vol. 9, Jan. 2021, pp. 29736–45. EPMC, doi:10.1109/access.2021.3054613.

NLM: Zhang O, Ding C, Pereira T, Xiao R, Gadhoumi K, Meisel K, Lee RJ, Chen Y, Hu X. Explainability Metrics of Deep Convolutional Networks for Photoplethysmography Quality Assessment. IEEE access : practical innovations, open solutions. 2021 Jan;9:29736–29745.
