Discrimination on the Grassmann manifold: Fundamental limits of subspace classifiers
Repurposing tools and intuitions from Shannon theory, we derive fundamental limits on the reliable classification of high-dimensional signals from low-dimensional features. We focus on the classification of linear and affine subspaces and model the features as noisy linear projections. Leveraging a syntactic equivalence between discrimination among subspaces and communication over vector wireless channels, we derive asymptotic bounds on classifier performance. First, we define the classification capacity, which characterizes necessary and sufficient relationships between the signal dimension, the number of features, and the number of classes to be discriminated, as all three quantities approach infinity. Second, we define the diversity-discrimination tradeoff, which characterizes relationships between the number of classes and the misclassification probability as the signal-to-noise ratio approaches infinity. We derive inner and outer bounds on these measures, revealing precise relationships between signal dimension and classifier performance. © 2014 IEEE.