LDMNet: Low Dimensional Manifold Regularized Neural Networks

Conference Paper

Deep neural networks have proved very successful on archetypal tasks for which large training sets are available, but when the training data are scarce, their performance suffers from overfitting. Many existing methods of reducing overfitting are data-independent. Data-dependent regularizations are mostly motivated by the observation that data of interest lie close to a manifold, which is typically hard to parametrize explicitly. These methods usually focus only on the geometry of the input data, and do not necessarily encourage the networks to produce geometrically meaningful features. To address this, we propose the Low-Dimensional-Manifold-regularized neural Network (LDMNet), which incorporates a feature regularization method that focuses on the geometry of both the input data and the output features. In LDMNet, we regularize the network by encouraging the combination of the input data and the output features to sample a collection of low-dimensional manifolds, which are searched efficiently without explicit parametrization. To achieve this, we directly use the manifold dimension as a regularization term in a variational functional. The resulting Euler-Lagrange equation is a Laplace-Beltrami equation over a point cloud, which is solved by the point integral method without increasing the computational complexity. In the experiments, we show that LDMNet significantly outperforms widely used regularizers. Moreover, LDMNet can extract common features of an object imaged via different modalities, which is very useful in real-world applications such as cross-spectral face recognition.
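The quantity the abstract penalizes is the dimension of the manifolds sampled by the point cloud of (input, feature) pairs. The paper computes this through a variational functional solved with the point integral method; the sketch below is only a toy illustration of the underlying idea, estimating local intrinsic dimension of a point cloud via local PCA (the function name, the neighborhood size `k`, and the variance-retention threshold `energy` are all hypothetical choices, not taken from the paper).

```python
import numpy as np

def local_dimension(points, k=10, energy=0.95):
    """Toy local-PCA estimate of intrinsic dimension at each point:
    the number of principal components needed to retain `energy`
    of the variance among the k nearest neighbours."""
    n = len(points)
    dims = np.empty(n)
    # pairwise squared distances (fine for a small illustrative cloud)
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    for i in range(n):
        nbrs = points[np.argsort(d2[i])[:k]]
        centered = nbrs - nbrs.mean(axis=0)
        # squared singular values = variances along principal directions
        s = np.linalg.svd(centered, compute_uv=False) ** 2
        ratio = np.cumsum(s) / s.sum()
        dims[i] = np.searchsorted(ratio, energy) + 1
    return dims

rng = np.random.default_rng(0)
# A 1-D curve (circle) embedded in 3-D: the estimated dimension
# should be close to 1 everywhere, far below the ambient dimension 3.
t = rng.uniform(0, 2 * np.pi, 200)
curve = np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
avg_dim = local_dimension(curve).mean()
```

A regularizer built this way would reward networks whose (input, feature) cloud concentrates near low-dimensional structure; LDMNet itself avoids any such explicit estimate by working with the manifold dimension directly in the variational formulation.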

Cited Authors

  • Zhu, W; Qiu, Q; Huang, J; Calderbank, R; Sapiro, G; Daubechies, I

Published Date

  • December 14, 2018

Published In

  • 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)

Start / End Page

  • 2743 - 2751

International Standard Serial Number (ISSN)

  • 1063-6919

Digital Object Identifier (DOI)

  • 10.1109/CVPR.2018.00290

Citation Source

  • Scopus