
Two-layer Mixture of Factor Analyzers with Joint Factor Loading

Publication · Conference
Yang, X; Huang, K; Zhang, R; Goulermas, JY
Published in: Proceedings of the International Joint Conference on Neural Networks
September 28, 2015

Dimensionality Reduction (DR) is a fundamental yet active research topic in pattern recognition and machine learning. When used for classification, previous research usually performs DR separately and then feeds the reduced features to another model, e.g., a Gaussian Mixture Model (GMM). Such independent learning can, however, significantly limit classification performance, since the optimal subspace found by a particular DR approach may not be appropriate for the subsequent classification model. More seriously, for high-dimensional data classification with a limited number of samples (the small sample size, or S3, problem), independent learning of the DR and classification models may even degrade classification accuracy. To address this problem, we propose a joint learning model for classification, called the Two-layer Mixture of Factor Analyzers with Joint Factor Loading (2L-MJFA). Specifically, the proposed model has a two-layer mixture structure, or a mixture of mixtures, in which each component (representing one specific class) is itself a Mixture of Factor Analyzers (MFA). Importantly, all the involved factor analyzers are intentionally designed to share the same loading matrix. On one hand, this joint loading matrix can be regarded as the dimensionality reduction matrix; on the other hand, sharing a common matrix greatly reduces the number of parameters, making the proposed algorithm well suited to S3 problems. We describe the model definition and propose a modified EM algorithm to optimize the model. A series of experiments demonstrates that the proposed model significantly outperforms three competing algorithms on five data sets.
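
To make the model structure concrete, below is a minimal Python sketch (not the authors' implementation) of the 2L-MJFA class-conditional density and the resulting classifier, assuming each class is modelled as an MFA whose components all share one loading matrix A. The parameter names (A, Psi, mus, pis) and the shared diagonal noise term are illustrative assumptions.

import numpy as np
from scipy.stats import multivariate_normal

def class_log_likelihood(x, A, Psi, mus, pis):
    """Log p(x | class) for one class modelled as an MFA with joint loading A.

    x    : (d,)   observation
    A    : (d, q) factor loading shared by all components and classes
    Psi  : (d,)   diagonal noise variances (assumed shared here)
    mus  : (K, d) component means of this class
    pis  : (K,)   component mixing weights of this class
    """
    cov = A @ A.T + np.diag(Psi)            # marginal covariance of every component
    log_terms = [
        np.log(pis[k]) + multivariate_normal.logpdf(x, mean=mus[k], cov=cov)
        for k in range(len(pis))
    ]
    return np.logaddexp.reduce(log_terms)   # log-sum-exp over the K components

def predict(x, A, Psi, class_params, priors):
    """Assign x to the class with the highest posterior score p(c) p(x | c)."""
    scores = [
        np.log(priors[c]) + class_log_likelihood(x, A, Psi, mus, pis)
        for c, (mus, pis) in enumerate(class_params)
    ]
    return int(np.argmax(scores))

# Example with random parameters (d=10 dims, q=3 factors, 2 classes, K=2 components each):
rng = np.random.default_rng(0)
d, q, K, C = 10, 3, 2, 2
A = rng.normal(size=(d, q))
Psi = np.full(d, 0.5)
class_params = [(rng.normal(size=(K, d)), np.full(K, 1.0 / K)) for _ in range(C)]
print(predict(rng.normal(size=d), A, Psi, class_params, priors=[0.5, 0.5]))

Because A is shared across all components and classes, only the component means, mixing weights, and diagonal noise grow with the number of components, which is what keeps the parameter count low in small-sample-size settings.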


Published In

Proceedings of the International Joint Conference on Neural Networks

DOI

10.1109/IJCNN.2015.7280350

Publication Date

September 28, 2015

Volume

2015-September

Citation

Yang, X., Huang, K., Zhang, R., & Goulermas, J. Y. (2015). Two-layer Mixture of Factor Analyzers with Joint Factor Loading. In Proceedings of the International Joint Conference on Neural Networks (Vol. 2015-September). https://doi.org/10.1109/IJCNN.2015.7280350
