
How to Reduce Dimension with PCA and Random Projections?

Publication: Journal Article
Yang, F; Liu, S; Dobriban, E; Woodruff, DP
Published in: IEEE Transactions on Information Theory
December 1, 2021

In our 'big data' age, the size and complexity of data is steadily increasing. Methods for dimension reduction are ever more popular and useful. Two distinct types of dimension reduction are 'data-oblivious' methods such as random projections and sketching, and 'data-aware' methods such as principal component analysis (PCA). Both have their strengths, such as speed for random projections, and data-adaptivity for PCA. In this work, we study how to combine them to get the best of both. We study 'sketch and solve' methods that take a random projection (or sketch) first, and compute PCA after. We compute the performance of several popular sketching methods (random iid projections, random sampling, subsampled Hadamard transform, CountSketch, etc.) in a general 'signal-plus-noise' (or spiked) data model. Compared to well-known prior works, our results: 1) are asymptotically exact; 2) apply when the signal components are only slightly above the noise, but the projection dimension is non-negligible. We also study stronger signals allowing more general covariance structures. We find that: 1) signal strength decreases under projection in a delicate way depending on the structure of the data and the sketching method; 2) orthogonal projections are slightly more accurate; 3) randomization does not hurt too much, due to concentration of measure; 4) the CountSketch can be somewhat improved by a normalization method. Our results have implications for statistical learning and data analysis. We also illustrate that the results are highly accurate in simulations and in analyzing empirical data.
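The 'sketch and solve' pipeline the abstract describes can be sketched in a few lines: project the n × p data matrix down to m rows with a data-oblivious random sketch, then run the data-aware PCA step on the much smaller sketched matrix. The snippet below is a minimal illustration, not the paper's exact experiment: it uses a random iid (Gaussian) sketch in a rank-one spiked model, and all dimensions and the signal strength are assumed values chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, m = 2000, 100, 400          # samples, features, sketch size (illustrative)

# Spiked ('signal-plus-noise') data: rank-one signal plus iid Gaussian noise.
u = rng.standard_normal(p)
u /= np.linalg.norm(u)            # true signal direction
scores = rng.standard_normal(n)
X = 3.0 * np.outer(scores, u) + rng.standard_normal((n, p))

# Data-oblivious step: random iid sketch S (m x n) applied to the rows of X.
S = rng.standard_normal((m, n)) / np.sqrt(m)
X_sketched = S @ X                # m x p -- much smaller than the original X

# Data-aware step: PCA, i.e. the top right singular vector of the sketched data.
_, _, Vt = np.linalg.svd(X_sketched, full_matrices=False)
v_sketch = Vt[0]

# Overlap with the true direction (1.0 would be perfect recovery); the paper's
# point is that this overlap degrades in a way that depends on the sketch.
overlap = abs(v_sketch @ u)
print(f"overlap with true signal direction: {overlap:.3f}")
```

Swapping the Gaussian `S` for a sampling, subsampled Hadamard, or CountSketch matrix changes only the data-oblivious step, which is exactly the comparison the paper carries out.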


Published In

IEEE Transactions on Information Theory

DOI

10.1109/TIT.2021.3112821
EISSN

1557-9654

ISSN

0018-9448

Publication Date

December 1, 2021

Volume

67

Issue

12

Start / End Page

8154 / 8189

Related Subject Headings

  • Networking & Telecommunications
  • 4613 Theory of computation
  • 4006 Communications engineering
  • 1005 Communications Technologies
  • 0906 Electrical and Electronic Engineering
  • 0801 Artificial Intelligence and Image Processing

Citation

APA: Yang, F., Liu, S., Dobriban, E., & Woodruff, D. P. (2021). How to Reduce Dimension with PCA and Random Projections? IEEE Transactions on Information Theory, 67(12), 8154–8189. https://doi.org/10.1109/TIT.2021.3112821

Chicago: Yang, F., S. Liu, E. Dobriban, and D. P. Woodruff. “How to Reduce Dimension with PCA and Random Projections?” IEEE Transactions on Information Theory 67, no. 12 (December 1, 2021): 8154–89. https://doi.org/10.1109/TIT.2021.3112821.

ICMJE: Yang F, Liu S, Dobriban E, Woodruff DP. How to Reduce Dimension with PCA and Random Projections? IEEE Transactions on Information Theory. 2021 Dec 1;67(12):8154–89.

MLA: Yang, F., et al. “How to Reduce Dimension with PCA and Random Projections?” IEEE Transactions on Information Theory, vol. 67, no. 12, Dec. 2021, pp. 8154–89. Scopus, doi:10.1109/TIT.2021.3112821.

NLM: Yang F, Liu S, Dobriban E, Woodruff DP. How to Reduce Dimension with PCA and Random Projections? IEEE Transactions on Information Theory. 2021 Dec 1;67(12):8154–8189.
