Locally convex kernel mixtures: Bayesian subspace learning

Published

Conference Paper

© 2019 IEEE. Kernel mixture models are routinely used for density estimation. However, in multivariate settings, issues arise in efficiently approximating lower-dimensional structure in the data. For example, it is common to suppose that the density is concentrated near a lower-dimensional non-linear subspace or manifold. Typical kernels used to locally approximate such subspaces are inflexible, so a large number of components is often needed. We propose a novel class of LOcally COnvex (LOCO) kernels that are flexible in adapting to nonlinear local structure. LOCO kernels are induced by introducing random knots within local neighborhoods, and generating data as a random convex combination of these knots with adaptive weights and additive noise. For identifiability, we constrain all observations from a particular component to have the same mean. For Bayesian inference subject to this constraint, we develop a hybrid Gibbs sampler and optimization algorithm that incorporates a Lagrange multiplier within a splitting method. The resulting LOCO algorithm is shown to dramatically outperform typical Gaussian mixture models in challenging examples.
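The generative mechanism described in the abstract — drawing a point as a random convex combination of local knots plus additive noise — can be sketched as follows. This is a minimal illustration, not the paper's exact model; the Dirichlet weights, knot placement, and noise scale are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_loco_component(knots, n, concentration=1.0, noise_sd=0.05):
    """Draw n points as random convex combinations of the given knots
    plus isotropic Gaussian noise (illustrative sketch only)."""
    k, d = knots.shape
    # Dirichlet draws give adaptive weights on the probability simplex,
    # so each row of `weights` defines a convex combination of the knots.
    weights = rng.dirichlet(np.full(k, concentration), size=n)  # (n, k)
    points = weights @ knots                                     # (n, d)
    # Additive noise scatters points around the local convex patch.
    return points + noise_sd * rng.standard_normal((n, d))

# Knots placed along a local piece of a nonlinear subspace
# (here, an arc of the unit circle).
theta = np.linspace(0.0, 0.5 * np.pi, 5)
knots = np.column_stack([np.cos(theta), np.sin(theta)])
X = sample_loco_component(knots, n=200)
print(X.shape)  # (200, 2)
```

Because every sample lies (up to noise) in the convex hull of nearby knots, a single such component can hug a curved stretch of the manifold that a Gaussian kernel would need several components to cover.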


Cited Authors

  • Thai, DH; Wu, HT; Dunson, DB

Published Date

  • December 1, 2019

Published In

  • Proceedings - 18th IEEE International Conference on Machine Learning and Applications, ICMLA 2019

Start / End Page

  • 272 - 275

International Standard Book Number 13 (ISBN-13)

  • 9781728145495

Digital Object Identifier (DOI)

  • 10.1109/ICMLA.2019.00051

Citation Source

  • Scopus