Mismatch in the classification of linear subspaces: Upper bound on the probability of error
This paper studies the performance of a mismatched classifier applied to linear subspaces corrupted by noise. In particular, we consider a problem where the classifier observes a noisy signal, the signal distribution conditioned on the signal class is zero-mean Gaussian with a low-rank covariance matrix, and the classifier knows only a mismatched version of the parameters in lieu of the true ones. We derive an upper bound on the misclassification probability of the mismatched classifier and characterize its behaviour. Specifically, our characterization leads to sharp sufficient conditions for the absence of an error floor in the low-noise regime; these conditions can be expressed in terms of the principal angles and the overlap between the true and the mismatched signal subspaces.
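For concreteness, the observation model sketched above can be written as follows. This is a minimal sketch using assumed notation ($y$, $x$, $n$, $\Sigma_c$, $\hat{\Sigma}_c$, $\sigma^2$) that is not fixed by the text, and it presumes additive white Gaussian noise:

% Minimal sketch of the assumed observation model (illustrative notation,
% not taken verbatim from the paper).
% y: noisy observation; x: class-conditional signal; n: additive noise.
\begin{align*}
  y &= x + n, \qquad n \sim \mathcal{N}\!\left(0, \sigma^2 I\right),\\
  x \mid \{\text{class} = c\} &\sim \mathcal{N}\!\left(0, \Sigma_c\right), \qquad
  \operatorname{rank}(\Sigma_c) \ll \dim(y).
\end{align*}
% The mismatched classifier evaluates its decision rule with surrogate
% covariances \hat{\Sigma}_c in place of the true \Sigma_c; the low-noise
% regime corresponds to \sigma^2 \to 0.

Under this reading, the range of each $\Sigma_c$ is the low-dimensional signal subspace of class $c$, and the principal angles and overlap mentioned above compare the ranges of $\Sigma_c$ and $\hat{\Sigma}_c$.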