Learning Latent Features with Infinite Nonnegative Binary Matrix Trifactorization

Publication: Journal Article
Yang, X; Huang, K; Zhang, R; Hussain, A
Published in: IEEE Transactions on Emerging Topics in Computational Intelligence
December 1, 2018

Nonnegative matrix factorization (NMF) has been widely exploited in many computational intelligence and pattern recognition problems. In particular, it can be used to extract latent features from data. However, previous NMF models often assume a fixed number of features, which are normally tuned and searched using a trial-and-error approach. Learning binary features is also difficult, since the binary matrix poses a more challenging optimization problem. In this paper, we propose a new Bayesian model, termed the infinite nonnegative binary matrix trifactorization (iNBMT) model. This can automatically learn both latent binary features and feature numbers, based on the Indian buffet process (IBP). It exploits a trifactorization process that decomposes the nonnegative matrix into a product of three components: two binary matrices and a nonnegative real matrix. In contrast to traditional bifactorization, trifactorization can better reveal latent structures among samples and features. Specifically, an IBP prior is imposed on the two infinite binary matrices, while a truncated Gaussian distribution is assumed on the weight matrix. To optimize the model, we develop a modified variational-Bayesian algorithm, with iteration complexity one order lower than the recently proposed maximization-expectation-IBP model [1] and the correlated IBP-IBP model [2]. A series of simulation experiments are carried out, both qualitatively and quantitatively, using benchmark feature extraction, reconstruction, and clustering tasks. Comparative results show that our proposed iNBMT model significantly outperforms state-of-the-art algorithms on a range of synthetic and real-world data. The new Bayesian model can thus serve as a benchmark technique for the computational intelligence research community.
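The factorization structure described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: the dimensions and matrix names (Z, W, V) are assumptions chosen for the example, and it shows only the three-factor decomposition (two binary factors and a nonnegative weight matrix), not the IBP priors or the variational-Bayesian inference:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative, not from the paper):
# n samples, d observed features, K latent sample-features, L latent data-features.
n, d, K, L = 8, 6, 3, 2

# Trifactorization form: X ~ Z @ W @ V.T, where Z (n x K) and V (d x L)
# are binary indicator matrices and W (K x L) is nonnegative real.
Z = rng.integers(0, 2, size=(n, K))   # binary: which latent features each sample uses
V = rng.integers(0, 2, size=(d, L))   # binary: which latent features each dimension uses
W = np.abs(rng.normal(size=(K, L)))   # nonnegative weights linking the two feature sets

X = Z @ W @ V.T                       # reconstructed nonnegative data matrix

assert X.shape == (n, d)
assert (X >= 0).all()                 # product of binary and nonnegative factors is nonnegative
```

In the actual iNBMT model, K and L are not fixed in advance: the IBP priors on Z and V let the number of active binary features be inferred from the data rather than set by trial and error.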

Published In

IEEE Transactions on Emerging Topics in Computational Intelligence

DOI

10.1109/TETCI.2018.2806934

EISSN

2471-285X

Publication Date

December 1, 2018

Volume

2

Issue

6

Start / End Page

450 / 463

Related Subject Headings

  • 4611 Machine learning
  • 4603 Computer vision and multimedia computation
 

Citation

APA: Yang, X., Huang, K., Zhang, R., & Hussain, A. (2018). Learning Latent Features with Infinite Nonnegative Binary Matrix Trifactorization. IEEE Transactions on Emerging Topics in Computational Intelligence, 2(6), 450–463. https://doi.org/10.1109/TETCI.2018.2806934

Chicago: Yang, X., K. Huang, R. Zhang, and A. Hussain. “Learning Latent Features with Infinite Nonnegative Binary Matrix Trifactorization.” IEEE Transactions on Emerging Topics in Computational Intelligence 2, no. 6 (December 1, 2018): 450–63. https://doi.org/10.1109/TETCI.2018.2806934.

ICMJE: Yang X, Huang K, Zhang R, Hussain A. Learning Latent Features with Infinite Nonnegative Binary Matrix Trifactorization. IEEE Transactions on Emerging Topics in Computational Intelligence. 2018 Dec 1;2(6):450–63.

MLA: Yang, X., et al. “Learning Latent Features with Infinite Nonnegative Binary Matrix Trifactorization.” IEEE Transactions on Emerging Topics in Computational Intelligence, vol. 2, no. 6, Dec. 2018, pp. 450–63. Scopus, doi:10.1109/TETCI.2018.2806934.

NLM: Yang X, Huang K, Zhang R, Hussain A. Learning Latent Features with Infinite Nonnegative Binary Matrix Trifactorization. IEEE Transactions on Emerging Topics in Computational Intelligence. 2018 Dec 1;2(6):450–463.