Statistical robustness of Markov chain Monte Carlo accelerators

Conference Paper

Statistical machine learning often uses probabilistic models and algorithms, such as Markov Chain Monte Carlo (MCMC), to solve a wide range of problems. Probabilistic computations, often considered too slow on conventional processors, can be accelerated with specialized hardware by exploiting parallelism and optimizing the design using various approximation techniques. Current methodologies for evaluating correctness of probabilistic accelerators are often incomplete, mostly focusing only on end-point result quality ("accuracy"). It is important for hardware designers and domain experts to look beyond end-point "accuracy" and be aware of how hardware optimizations impact statistical properties. This work takes a first step toward defining metrics and a methodology for quantitatively evaluating correctness of probabilistic accelerators. We propose three pillars of statistical robustness: 1) sampling quality, 2) convergence diagnostic, and 3) goodness of fit. We apply our framework to a representative MCMC accelerator and surface design issues that cannot be exposed using only application end-point result quality. We demonstrate the benefits of this framework to guide design space exploration in a case study showing that statistical robustness comparable to floating-point software can be achieved with limited precision, avoiding floating-point hardware overheads.
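To make the three pillars concrete, the sketch below illustrates the kind of checks they refer to, applied to a simple software MCMC run. This is an illustrative example, not the paper's actual evaluation framework: the target distribution (standard normal), the random-walk Metropolis sampler, and the specific diagnostics (Gelman-Rubin R-hat for convergence, a Kolmogorov-Smirnov test for goodness of fit) are assumptions chosen for clarity.

```python
import numpy as np
from scipy import stats

def metropolis(n_samples, start, proposal_scale=1.0, seed=0):
    # Random-walk Metropolis sampler targeting a standard normal, N(0, 1).
    rng = np.random.default_rng(seed)
    x = start
    samples = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + rng.normal(scale=proposal_scale)
        # log acceptance ratio for the N(0, 1) target density
        if np.log(rng.uniform()) < 0.5 * (x**2 - prop**2):
            x = prop
        samples[i] = x
    return samples

def gelman_rubin(chains):
    # Convergence diagnostic: potential scale reduction factor (R-hat)
    # over m chains of length n; values near 1.0 indicate convergence.
    m, n = chains.shape
    B = n * chains.mean(axis=1).var(ddof=1)   # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()     # within-chain variance
    var_hat = (n - 1) / n * W + B / n
    return np.sqrt(var_hat / W)

# Run several chains from overdispersed starting points.
chains = np.array([metropolis(5000, start=s, seed=i)
                   for i, s in enumerate([-3.0, 0.0, 3.0])])
burned = chains[:, 1000:]                     # discard burn-in

rhat = gelman_rubin(burned)                   # pillar 2: convergence
# Pillar 3: goodness of fit of (thinned) samples against the N(0, 1) target.
ks_stat, ks_p = stats.kstest(burned.ravel()[::10], "norm")
print(f"R-hat = {rhat:.3f}, KS statistic = {ks_stat:.3f}")
```

For a hardware accelerator, the same diagnostics would be computed on samples produced by the reduced-precision design, so that degradation in sampling quality or convergence is visible even when end-point "accuracy" looks unchanged.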

Cited Authors

  • Zhang, X; Bashizade, R; Wang, Y; Mukherjee, S; Lebeck, AR

Published Date

  • April 19, 2021

Published In

  • International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS)

Start / End Page

  • 959 - 974

International Standard Book Number 13 (ISBN-13)

  • 9781450383172

Digital Object Identifier (DOI)

  • 10.1145/3445814.3446697

Citation Source

  • Scopus