Stochastic Spectral Descent for Discrete Graphical Models

Published

Journal Article

© 2015 IEEE. Interest in deep probabilistic graphical models has increased in recent years due to their state-of-the-art performance in many machine learning applications. Such models are typically trained with the stochastic gradient method, which can take a significant number of iterations to converge. Since the computational cost of gradient estimation is prohibitive even for modestly sized models, training becomes slow and practically usable models are kept small. In this paper we propose a new, largely tuning-free algorithm to address this problem. Our approach derives novel majorization bounds based on the Schatten-∞ norm. Intriguingly, the minimizers of these bounds can be interpreted as gradient methods in a non-Euclidean space. We thus propose using a stochastic gradient method in non-Euclidean space. We provide simple conditions under which our algorithm is guaranteed to converge, and we demonstrate empirically that it yields dramatically faster training and improved predictive ability compared to stochastic gradient descent, for both directed and undirected graphical models.
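To make the non-Euclidean step concrete: for a matrix parameter, steepest descent with respect to the Schatten-∞ (spectral) norm replaces the raw stochastic gradient G with its "#-operator," the rescaled outer product of its singular vectors, ||G||_S1 · U Vᵀ, where U Σ Vᵀ is the SVD of G. The sketch below is an illustrative NumPy reconstruction of that update, not the authors' implementation; the toy quadratic objective, the function names, and the smoothness constant are assumptions for demonstration.

    import numpy as np

    def sharp_operator(G):
        # "#-operator" for the Schatten-inf norm: the steepest-descent
        # direction is ||G||_S1 * (U @ Vt), with U, Vt from the SVD of G.
        U, s, Vt = np.linalg.svd(G, full_matrices=False)
        return s.sum() * (U @ Vt)

    def stochastic_spectral_descent(grad_fn, W0, L, n_iters=200, seed=0):
        # Minimal SSD-style loop: W <- W - (1/L) * G^#, where G is a
        # stochastic gradient estimate and L is a smoothness constant
        # with respect to the Schatten-inf norm (assumed known here).
        rng = np.random.default_rng(seed)
        W = W0.copy()
        for _ in range(n_iters):
            G = grad_fn(W, rng)
            W -= sharp_operator(G) / L
        return W

    # Toy usage (an assumption, not from the paper): noisy gradients of
    # f(W) = 0.5 * ||W - A||_F^2, which is min(m, n)-smooth in the
    # Schatten-inf norm since ||X||_F^2 <= min(m, n) * ||X||_{S-inf}^2.
    A = np.arange(12.0).reshape(3, 4)
    noisy_grad = lambda W, rng: (W - A) + 0.01 * rng.standard_normal(W.shape)
    W_hat = stochastic_spectral_descent(noisy_grad, np.zeros_like(A), L=min(A.shape))
    print(np.round(W_hat, 2))  # approximately recovers A

The key design difference from ordinary SGD is that the step direction depends on all singular vectors of the gradient equally, so a single large singular value cannot dominate the update; the cost is one SVD per iteration.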

Cited Authors

  • Carlson, D; Hsieh, YP; Collins, E; Carin, L; Cevher, V

Published Date

  • March 1, 2016

Published In

  • IEEE Journal of Selected Topics in Signal Processing

Volume / Issue

  • 10 / 2

Start / End Page

  • 296 - 311

International Standard Serial Number (ISSN)

  • 1932-4553

Digital Object Identifier (DOI)

  • 10.1109/JSTSP.2015.2505684

Citation Source

  • Scopus