Information-theoretic limits on the classification of Gaussian mixtures: Classification on the Grassmann manifold
Published
Conference Paper
Abstract
Motivated by applications in high-dimensional signal processing, we derive fundamental limits on the performance of compressive linear classifiers. By analogy with Shannon theory, we define the classification capacity, which quantifies the maximum number of classes that can be discriminated with low probability of error, and the diversity-discrimination tradeoff, which quantifies the tradeoff between the number of classes and the probability of classification error. For classification of Gaussian mixture models, we identify a duality between classification and communications over non-coherent multiple-antenna channels. This duality allows us to characterize the classification capacity and diversity-discrimination tradeoff using existing results from multiple-antenna communication. We also identify the easiest possible classification problems, which correspond to low-dimensional subspaces drawn from an appropriate Grassmann manifold. © 2013 IEEE.
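For intuition only, here is a minimal sketch of the kind of setting the abstract describes: a compressive linear classifier applied to a Gaussian mixture whose classes concentrate near low-dimensional subspaces, with classification by the MAP rule on the compressed measurements. This is not the paper's construction; all dimensions, names (n, m, k, M, map_classify), and the Monte Carlo evaluation are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: ambient dimension n, compressive dimension m < n, M classes.
n, m, M = 64, 16, 8

# Random compressive measurement matrix (the "linear" part of the classifier).
A = rng.standard_normal((m, n)) / np.sqrt(m)

# Each class is a zero-mean Gaussian supported near a random k-dimensional
# subspace, plus isotropic noise of variance sigma2.
k, sigma2 = 4, 1e-2
bases = [np.linalg.qr(rng.standard_normal((n, k)))[0] for _ in range(M)]
covs = [U @ U.T + sigma2 * np.eye(n) for U in bases]

# Covariances of the compressed observations: A Sigma_i A^T.
ccovs = [A @ S @ A.T for S in covs]

def map_classify(y):
    """MAP rule for equiprobable zero-mean Gaussian classes:
    maximize -log det(Sigma_i) - y^T Sigma_i^{-1} y over classes i."""
    scores = []
    for S in ccovs:
        _, logdet = np.linalg.slogdet(S)
        scores.append(-logdet - y @ np.linalg.solve(S, y))
    return int(np.argmax(scores))

# Monte Carlo estimate of the probability of classification error.
trials, errors = 2000, 0
for _ in range(trials):
    c = rng.integers(M)                      # true class
    x = bases[c] @ rng.standard_normal(k)    # signal on the class subspace
    x += np.sqrt(sigma2) * rng.standard_normal(n)
    if map_classify(A @ x) != c:
        errors += 1
print(f"Estimated error probability: {errors / trials:.3f}")
```

In this toy setup, the error probability falls as the class subspaces become better separated or as the noise variance shrinks, which is the regime in which the paper's diversity-discrimination tradeoff is stated.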
Cited Authors
- Nokleby, M; Calderbank, R; Rodrigues, MRD
Published Date
- December 1, 2013
Published In
- 2013 IEEE Information Theory Workshop (ITW 2013)
Digital Object Identifier (DOI)
- 10.1109/ITW.2013.6691253
Citation Source
- Scopus