Connecting the out-of-sample and pre-image problems in Kernel methods

Conference Paper

Kernel methods have been widely studied in the field of pattern recognition. These methods implicitly map the data, via the so-called "kernel trick," into a space that is more appropriate for analysis. Many manifold learning and dimensionality reduction techniques are simply kernel methods for which the mapping is explicitly computed. In such cases, two problems related to the mapping arise: the out-of-sample extension and the pre-image computation. In this paper we propose a new pre-image method based on the Nyström formulation for the out-of-sample extension, showing the connections between the two problems. We also address the importance of normalization in the feature space, which has been ignored by standard pre-image algorithms. As an example, we apply these ideas to the Gaussian kernel and relate our approach to other popular pre-image methods. Finally, we show the application of these techniques to the study of dynamic shapes. © 2007 IEEE.
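The Nyström out-of-sample extension mentioned in the abstract embeds a new point by projecting its kernel evaluations against the training set onto the leading eigenvectors of the training kernel matrix. The following is a minimal sketch of that idea for the Gaussian kernel, not the paper's full method (which also covers the pre-image computation and feature-space normalization); function names, the bandwidth `sigma`, and the number of components are illustrative choices.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def nystrom_extension(X_train, X_new, sigma=1.0, n_components=2):
    # Eigendecompose the training kernel matrix K = U diag(lam) U^T.
    K = gaussian_kernel(X_train, X_train, sigma)
    lam, U = np.linalg.eigh(K)
    # Keep the leading eigenpairs (eigh returns ascending order).
    lam = lam[::-1][:n_components]
    U = U[:, ::-1][:, :n_components]
    # Nystrom formula: psi_k(x) = (1/lam_k) * sum_i U[i, k] * k(x, x_i).
    k_new = gaussian_kernel(X_new, X_train, sigma)
    return k_new @ U / lam
```

Evaluating the extension at the training points themselves reproduces the eigenvector coordinates (since K u_k = lam_k u_k), which is a quick sanity check that the formula is consistent.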

Cited Authors

  • Arias, P; Randall, G; Sapiro, G

Published Date

  • October 11, 2007

Published In

International Standard Serial Number (ISSN)

  • 1063-6919

Digital Object Identifier (DOI)

  • 10.1109/CVPR.2007.383038

Citation Source

  • Scopus