Beyond worst-case reconstruction in deterministic compressed sensing

Conference Paper

The role of random measurements in compressive sensing is analogous to the role of random codes in coding theory. In coding theory, decoders that can correct beyond the minimum distance of a code allow random codes to achieve the Shannon limit. In compressed sensing, the counterpart of minimum distance is the spark of the measurement matrix, i.e., the size of the smallest set of linearly dependent columns. This paper constructs a family of measurement matrices where the columns are formed by exponentiating codewords from a classical binary error-correcting code of block length M. The columns can be partitioned into mutually unbiased bases, and the spark of the corresponding measurement matrix is shown to be O(√M) by identifying a configuration of columns that plays a role similar to that of the Dirac comb in classical Fourier analysis. Further, an explicit basis for the null space of these measurement matrices is given in terms of indicator functions of binary self-dual codes. Reliable reconstruction of k-sparse inputs is shown for k of order M/log(M), which is best possible and far beyond the worst-case lower bound provided by the spark. © 2012 IEEE.
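The spark defined in the abstract can be computed directly for small matrices by checking column subsets in increasing size. The sketch below is purely illustrative and not from the paper; the function name and the example matrix are hypothetical, and the brute-force search is exponential, so it is only feasible for toy cases.

```python
# Illustrative sketch (not from the paper): brute-force spark computation.
# spark(A) = size of the smallest set of linearly dependent columns of A.
from itertools import combinations
import numpy as np

def spark(A, tol=1e-10):
    """Return the spark of A, or infinity if all columns are independent."""
    m, n = A.shape
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            # A subset of k columns is linearly dependent iff the
            # corresponding submatrix has rank strictly less than k.
            if np.linalg.matrix_rank(A[:, cols], tol=tol) < k:
                return k
    return float("inf")

# Example: the third column is the sum of the first two, so the
# smallest dependent set has size 3.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
print(spark(A))  # → 3
```

Worst-case uniqueness guarantees for k-sparse recovery require 2k < spark(A); the paper's point is that reliable (rather than worst-case) reconstruction can succeed for k far larger than this bound suggests.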

Cited Authors

  • Jafarpour, S; Duarte, MF; Calderbank, R

Published Date

  • October 22, 2012

Published In

  • IEEE International Symposium on Information Theory Proceedings

Start / End Page

  • 1852 - 1856

Digital Object Identifier (DOI)

  • 10.1109/ISIT.2012.6283601

Citation Source

  • Scopus