
Galen Reeves

Associate Professor in the Department of Electrical and Computer Engineering
140 Science Dr., 321 Gross Hall, Durham, NC 27708

Selected Publications


Reed-Muller Codes on BMS Channels Achieve Vanishing Bit-Error Probability for all Rates Below Capacity

Journal Article IEEE Transactions on Information Theory · February 1, 2024 This paper considers the performance of Reed-Muller (RM) codes transmitted over binary memoryless symmetric (BMS) channels under bitwise maximum-a-posteriori (bit-MAP) decoding. Its main result is that, for a fixed BMS channel, the family of binary RM code ...
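The entry above concerns the decoding performance of Reed-Muller codes; the codes themselves have a simple construction. As an illustrative sketch (not taken from the paper), the generator matrix of RM(r, m) collects the evaluations of all monomials of degree at most r in m binary variables over the 2^m points of F_2^m:

```python
from itertools import combinations, product

def rm_generator(r, m):
    """Generator matrix of the binary Reed-Muller code RM(r, m):
    one row per monomial of degree <= r, evaluated at all 2^m
    points of F_2^m. Returned as a list of 0/1 rows."""
    points = list(product([0, 1], repeat=m))
    rows = []
    for deg in range(r + 1):
        for S in combinations(range(m), deg):
            # Evaluate the monomial prod_{i in S} x_i at every point.
            rows.append([int(all(p[i] for i in S)) for p in points])
    return rows

G = rm_generator(1, 3)    # RM(1, 3) is the [8, 4] extended Hamming code
print(len(G), len(G[0]))  # 4 8: dimension sum_{i<=r} C(m, i), length 2^m
```

The code's rate is (sum of C(m, i) for i ≤ r) / 2^m; the paper studies this family as m grows with the rate held below channel capacity.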

Achieving Capacity on Non-Binary Channels with Generalized Reed-Muller Codes

Conference IEEE International Symposium on Information Theory - Proceedings · January 1, 2023 Recently, the authors showed that Reed-Muller (RM) codes achieve capacity on binary memoryless symmetric (BMS) channels with respect to bit error rate. This paper extends that work by showing that RM codes defined on non-binary fields, known as generalized ...

Erratum: Approximating posteriors with high-dimensional nuisance parameters via integrated rotated Gaussian approximation (Biometrika (2021) 108 (269-282) DOI: 10.1093/biomet/asaa068)

Journal Article Biometrika · March 1, 2022 In the main paper under subsection "3.2. Bayesian variable selection", all references to "5.2" should read: "3.1". Under subsection "5.2. Bayesian variable selection", the reference to "5.3 and 6" should read: "S5.3 and S6". These errors have now been corr ...

Fundamental limits for rank-one matrix estimation with groupwise heteroskedasticity

Conference Proceedings of Machine Learning Research · January 1, 2022 Low-rank matrix recovery problems involving high-dimensional and heterogeneous data appear in applications throughout statistics and machine learning. The contribution of this paper is to establish the fundamental limits of recovery for a broad class of th ...

k-Sliced Mutual Information: A Quantitative Study of Scalability with Dimension

Conference Advances in Neural Information Processing Systems · January 1, 2022 Sliced mutual information (SMI) is defined as an average of mutual information (MI) terms between one-dimensional random projections of the random variables. It serves as a surrogate measure of dependence to classic MI that preserves many of its properties ...
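The abstract above defines SMI as an average of MI terms over one-dimensional random projections. A toy Monte Carlo sketch of that averaging, assuming approximately jointly Gaussian data so that the one-dimensional MI has the closed form -0.5·log(1 - ρ²) (a stand-in for the general nonparametric estimators the paper analyzes; the function name is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def sliced_mi_gaussian(X, Y, n_projections=500):
    """Average the Gaussian closed-form MI, -0.5*log(1 - rho^2),
    over random one-dimensional projections of X and Y."""
    d_x, d_y = X.shape[1], Y.shape[1]
    vals = []
    for _ in range(n_projections):
        theta = rng.standard_normal(d_x); theta /= np.linalg.norm(theta)
        phi = rng.standard_normal(d_y); phi /= np.linalg.norm(phi)
        rho = np.corrcoef(X @ theta, Y @ phi)[0, 1]
        vals.append(-0.5 * np.log(1.0 - rho ** 2))
    return float(np.mean(vals))

# Dependent pair: Y is a noisy copy of X, so the sliced MI is positive.
X = rng.standard_normal((2000, 5))
Y = X + 0.5 * rng.standard_normal((2000, 5))
print(sliced_mi_gaussian(X, Y))
```

For independent inputs the estimate is driven to zero only as the sample size grows, which is exactly the scalability-with-dimension question the paper quantifies.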

Gaussian Approximation of Quantization Error for Estimation from Compressed Data

Journal Article IEEE Transactions on Information Theory · August 1, 2021 We consider the distributional connection between the lossy compressed representation of a high-dimensional signal X using a random spherical code and the observation of X under an additive white Gaussian noise (AWGN). We show that the Wasserstein dist ...

Approximating posteriors with high-dimensional nuisance parameters via integrated rotated Gaussian approximation.

Journal Article Biometrika · June 2021 Posterior computation for high-dimensional data with many parameters can be challenging. This article focuses on a new method for approximating posterior distributions of a low- to moderate-dimensional parameter in the presence of a high-dimensional or oth ...

Convergence of Gaussian-smoothed optimal transport distance with sub-gamma distributions and dependent samples

Conference Proceedings of Machine Learning Research · January 1, 2021 The Gaussian-smoothed optimal transport (GOT) framework, recently proposed by Goldfeld et al., scales to high dimensions in estimation and provides an alternative to entropy regularization. This paper provides convergence guarantees for estimating the GOT ...
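The GOT distance compares distributions after convolving both with an isotropic Gaussian. A minimal one-dimensional sketch of the empirical quantity (my own illustration, not the paper's estimator): smooth both samples with N(0, σ²) noise, then use the order-statistics formula for Wasserstein-1 on the line.

```python
import numpy as np

rng = np.random.default_rng(1)

def smoothed_w1(x, y, sigma):
    """Empirical 1-D Gaussian-smoothed Wasserstein-1 distance:
    add N(0, sigma^2) noise to each sample, sort, and average the
    coordinate-wise gaps (valid for equal-size samples on the line)."""
    xs = np.sort(x + sigma * rng.standard_normal(x.shape))
    ys = np.sort(y + sigma * rng.standard_normal(y.shape))
    return float(np.mean(np.abs(xs - ys)))

x = rng.standard_normal(5000)        # N(0, 1) samples
y = rng.standard_normal(5000) + 2.0  # N(2, 1) samples
print(smoothed_w1(x, y, sigma=0.5))  # close to the true mean shift of 2
```

Smoothing leaves the distance between these two Gaussians at exactly the mean shift; the paper's contribution is convergence rates for such estimates under sub-gamma tails and dependent samples.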

The Gaussian equivalence of generative models for learning with shallow neural networks

Conference Proceedings of Machine Learning Research · January 1, 2021 Understanding the impact of data structure on the computational tractability of learning is a key challenge for the theory of neural networks. Many theoretical works do not explicitly model training data, or assume that inputs are drawn component-wise inde ...

A Two-Moment Inequality with Applications to Rényi Entropy and Mutual Information.

Journal Article Entropy (Basel, Switzerland) · November 2020 This paper explores some applications of a two-moment inequality for the integral of the r-th power of a function, where 0 < r < 1 ...

Information-theoretic limits for the matrix tensor product

Journal Article IEEE Journal on Selected Areas in Information Theory · November 1, 2020 This article studies a high-dimensional inference problem involving the matrix tensor product of random matrices. This problem generalizes a number of contemporary data science problems including the spiked matrix models used in sparse principal component ...

Information-theoretic limits of a multiview low-rank symmetric spiked matrix model

Conference IEEE International Symposium on Information Theory - Proceedings · June 1, 2020 We consider a generalization of an important class of high-dimensional inference problems, namely spiked symmetric matrix models, often used as probabilistic models for principal component analysis. Such paradigmatic models have recently attracted a lot of ...

The all-or-nothing phenomenon in sparse linear regression

Journal Article Mathematical Statistics and Learning · January 1, 2020 We study the problem of recovering a hidden binary k-sparse p-dimensional vector β from n noisy linear observations Y = Xβ + W, where the X_ij are i.i.d. N(0, 1) and the W_i are i.i.d. N(0, σ²). A closely related hypothesis testing problem is to distinguish the pai ...
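The observation model in this abstract is concrete enough to simulate. A minimal sketch of one draw from Y = Xβ + W with a hidden binary k-sparse β (function name is my own; the paper's analysis, not this simulation, establishes the all-or-nothing threshold):

```python
import numpy as np

rng = np.random.default_rng(2)

def sparse_linear_model(n, p, k, sigma):
    """One instance of Y = X beta + W: beta is binary k-sparse,
    X_ij ~ i.i.d. N(0, 1), W_i ~ i.i.d. N(0, sigma^2)."""
    beta = np.zeros(p)
    beta[rng.choice(p, size=k, replace=False)] = 1.0
    X = rng.standard_normal((n, p))
    W = sigma * rng.standard_normal(n)
    return X, X @ beta + W, beta

X, Y, beta = sparse_linear_model(n=200, p=1000, k=10, sigma=0.1)
print(Y.shape, int(beta.sum()))  # (200,) 10
```

The all-or-nothing phenomenon concerns how, as n crosses a sharp threshold, the fraction of β recoverable from (X, Y) jumps from nothing to everything.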

Gaussian Mixture Models for Stochastic Block Models with Non-Vanishing Noise

Journal Article 2019 IEEE 8th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2019 - Proceedings · December 1, 2019 Community detection tasks have received a lot of attention across statistics, machine learning, and information theory with work concentrating on providing theoretical guarantees for different methodological approaches to the stochastic block model. Recent ...

All-or-Nothing Phenomena: From Single-Letter to High Dimensions

Conference 2019 IEEE 8th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2019 - Proceedings · December 1, 2019 We consider the problem of estimating a p-dimensional vector β from n observations Y = Xβ + W, where the β_j are i.i.d. π for a real-valued distribution π with zero mean and unit variance, the X_ij are i.i.d. N(0, 1), and the W_i are i.i.d. ...

Mutual Information in Community Detection with Covariate Information and Correlated Networks

Conference 2019 57th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2019 · September 1, 2019 We study the problem of community detection when there is covariate information about the node labels and one observes multiple correlated networks. We provide an asymptotic upper bound on the per-node mutual information as well as a heuristic analysis of ...

The Geometry of Community Detection via the MMSE Matrix

Journal Article IEEE International Symposium on Information Theory - Proceedings · July 1, 2019 The information-theoretic limits of community detection have been studied extensively for network models with high levels of symmetry or homogeneity. The contribution of this paper is to study a broader class of network models that allow for variability in ...

Gaussian Approximation of Quantization Error for Estimation from Compressed Data

Conference IEEE International Symposium on Information Theory - Proceedings · July 1, 2019 We consider the statistical connection between the quantized representation of a high-dimensional signal X using a random spherical code and the observation of X under an additive white Gaussian noise (AWGN). We show that given X, the conditional Wasserste ...

The Replica-Symmetric Prediction for Random Linear Estimation With Gaussian Matrices Is Exact

Journal Article IEEE Transactions on Information Theory · April 1, 2019 This paper considers the fundamental limit of random linear estimation for i.i.d. signal distributions and i.i.d. Gaussian measurement matrices. Its main contribution is a rigorous characterization of the asymptotic mutual information (MI) and minimum mean ...

Adversarially learned representations for information obfuscation and inference

Conference 36th International Conference on Machine Learning, ICML 2019 · January 1, 2019 Data collection and sharing are pervasive aspects of modern society. This process can either be voluntary, as in the case of a person taking a facial image to unlock his/her phone, or incidental, such as traffic cameras collecting videos on pedestrians. An ...


The All-or-Nothing Phenomenon in Sparse Linear Regression

Conference Proceedings of Machine Learning Research · January 1, 2019 We study the problem of recovering a hidden binary k-sparse p-dimensional vector β from n noisy linear observations Y = Xβ + W, where the X_ij are i.i.d. N(0, 1) and the W_i are i.i.d. N(0, σ²). A closely related hypothesis testing problem is to distinguish the pair ...

Mutual Information as a Function of Matrix SNR for Linear Gaussian Channels

Conference IEEE International Symposium on Information Theory - Proceedings · August 15, 2018 This paper focuses on the mutual information and minimum mean-squared error (MMSE) as a function of a matrix-valued signal-to-noise ratio (SNR) for a linear Gaussian channel with arbitrary input distribution. As shown by Lamarca, the mutual information is a c ...

Single Letter Formulas for Quantized Compressed Sensing with Gaussian Codebooks

Conference IEEE International Symposium on Information Theory - Proceedings · August 15, 2018 Theoretical and experimental results have shown that compressed sensing with quantization can perform well if the signal is very sparse, the noise is very low, and the bitrate is sufficiently large. However, a precise characterization of the fundamental tr ...

Compressed sensing under optimal quantization

Conference IEEE International Symposium on Information Theory - Proceedings · August 9, 2017 We consider the problem of recovering a sparse vector from a quantized or a lossy compressed version of its noisy random linear projections. We characterize the minimal distortion in this recovery as a function of the sampling ratio, the sparsity rate, the ...

Two-moment inequalities for Rényi entropy and mutual information

Conference IEEE International Symposium on Information Theory - Proceedings · August 9, 2017 This paper explores some applications of a two-moment inequality for the integral of the r-th power of a function, where 0 < r < 1. The first contribution is an upper bound on the Rényi entropy of a random vector in terms of the two different moments. When ...

Conditional central limit theorems for Gaussian projections

Conference IEEE International Symposium on Information Theory - Proceedings · August 9, 2017 This paper addresses the question of when projections of a high-dimensional random vector are approximately Gaussian. This problem has been studied previously in the context of high-dimensional data analysis, where the focus is on low-dimensional projectio ...
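The phenomenon this entry studies is easy to observe numerically: projecting a high-dimensional vector with markedly non-Gaussian coordinates onto a random direction produces an approximately Gaussian scalar. A quick empirical check (my own illustration, using low skewness and excess kurtosis as a crude proxy for Gaussianity):

```python
import numpy as np

rng = np.random.default_rng(3)

# Non-Gaussian coordinates: i.i.d. Rademacher (+/-1) entries.
n, d = 10000, 200
X = rng.choice([-1.0, 1.0], size=(n, d))

# One random Gaussian direction, normalized to unit length.
theta = rng.standard_normal(d)
theta /= np.linalg.norm(theta)
proj = X @ theta  # projections of the n samples

# Near-unit variance, near-zero skewness and excess kurtosis
# suggest the projection is close to N(0, 1).
skew = float(np.mean(proj ** 3))
ex_kurt = float(np.mean(proj ** 4) - 3.0)
print(round(float(np.var(proj)), 2), round(skew, 2), round(ex_kurt, 2))
```

The paper's contribution is conditional central limit theorems that make this informal observation precise for Gaussian projection directions.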

Optimizing the stimulus presentation paradigm design for the P300-based brain-computer interface using performance prediction.

Journal Article Journal of neural engineering · August 2017 Objective: The role of a brain-computer interface (BCI) is to discern a user's intended message or action by extracting and decoding relevant information from brain signals. Stimulus-driven BCIs, such as the P300 speller, rely on detecting event-rel ...

Additivity of information in multilayer networks via additive Gaussian noise transforms

Conference 55th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2017 · July 1, 2017 Multilayer (or deep) networks are powerful probabilistic models based on multiple stages of a linear transform followed by a non-linear (possibly random) function. In general, the linear transforms are defined by matrices and the nonlinear functions are de ...

A performance-based approach to designing the stimulus presentation paradigm for the P300-based BCI by exploiting coding theory

Conference ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings · June 16, 2017 The P300-based brain-computer interface (BCI) speller relies on eliciting and detecting specific brain responses to target stimulus events, termed event-related potentials (ERPs). In a visual speller, ERPs are elicited when the user's desired character, i. ...

Modeling the P300-based brain-computer interface as a channel with memory

Conference 54th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2016 · February 10, 2017 The P300 speller is a brain-computer interface that enables people with severe neuromuscular disorders to communicate. It is based on eliciting and detecting event-related potentials (ERP) in electroencephalography (EEG) measurements, in response to rare t ...

Classification and Reconstruction of High-Dimensional Signals from Low-Dimensional Features in the Presence of Side Information

Journal Article IEEE Transactions on Information Theory · November 1, 2016 This paper offers a characterization of fundamental limits on the classification and reconstruction of high-dimensional signals from low-dimensional features, in the presence of side information. We consider a scenario where a decoder has access both to li ...

The replica-symmetric prediction for compressed sensing with Gaussian matrices is exact

Conference IEEE International Symposium on Information Theory - Proceedings · August 10, 2016 This paper considers the fundamental limit of compressed sensing for i.i.d. signal distributions and i.i.d. Gaussian measurement matrices. Its main contribution is a rigorous characterization of the asymptotic mutual information (MI) and minimum mean-squar ...

Performance assessment of image translation-engineered point spread functions

Conference Optics InfoBase Conference Papers · July 18, 2016 We demonstrate image translation, a general method for task-dependent point spread function engineering. Here, we compare the optical performance of variations of image translation with several well known imaging methods. © OSA 2016. ...

Classification and reconstruction of compressed GMM signals with side information

Conference IEEE International Symposium on Information Theory - Proceedings · September 28, 2015 This paper offers a characterization of performance limits for classification and reconstruction of high-dimensional signals from noisy compressive measurements, in the presence of side information. We assume the signal of interest and the side information ...

Quantifying uncertainty in variable selection with arbitrary matrices

Conference 2015 IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2015 · January 1, 2015 Probabilistically quantifying uncertainty in parameters, predictions and decisions is a crucial component of broad scientific and engineering applications. This is however difficult if the number of parameters far exceeds the sample size. Although there ar ...

The fundamental limits of stable recovery in compressed sensing

Journal Article IEEE International Symposium on Information Theory - Proceedings · January 1, 2014 Compressed sensing has shown that a wide variety of structured signals can be recovered from a limited number of noisy linear measurements. This paper considers the extent to which such recovery is robust to signal and measurement uncertainty. The main res ...

Beyond sparsity: Universally stable compressed sensing when the number of 'free' values is less than the number of observations

Journal Article 2013 5th IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2013 · December 1, 2013 Recent results in compressed sensing have shown that a wide variety of structured signals can be recovered from undersampled and noisy linear observations. In this paper, we show that many of these signal structures can be modeled using a union of affine ...

Approximate sparsity pattern recovery: Information-theoretic lower bounds

Journal Article IEEE Transactions on Information Theory · May 23, 2013 Recovery of the sparsity pattern (or support) of an unknown sparse vector from a small number of noisy linear measurements is an important problem in compressed sensing. In this paper, the high-dimensional setting is considered. It is shown that if the mea ...

The minimax noise sensitivity in compressed sensing

Journal Article IEEE International Symposium on Information Theory - Proceedings · January 1, 2013 Consider the compressed sensing problem of estimating an unknown k-sparse n-vector from a set of m noisy linear equations. Recent work focused on the noise sensitivity of particular algorithms - the scaling of the reconstruction error with added noise. In ...

Achieving Bayes MMSE performance in the sparse signal + Gaussian white noise model when the noise level is unknown

Journal Article IEEE International Symposium on Information Theory - Proceedings · January 1, 2013 Recent work on Approximate Message Passing algorithms in compressed sensing focuses on 'ideal' algorithms which at each iteration face a subproblem of recovering an unknown sparse signal in Gaussian white noise. The noise level in each subproblem changes f ...

Compressed sensing phase transitions: Rigorous bounds versus replica predictions

Journal Article 2012 46th Annual Conference on Information Sciences and Systems, CISS 2012 · November 12, 2012 In recent work, two different methods have been used to characterize the fundamental limits of compressed sensing. On the one hand are rigorous bounds based on information-theoretic arguments or the analysis of specific algorithms. On the other hand are ex ...

The sensitivity of compressed sensing performance to relaxation of sparsity

Journal Article IEEE International Symposium on Information Theory - Proceedings · October 22, 2012 Many papers studying compressed sensing consider the noisy underdetermined system of linear equations: y = Ax_0 + z, with an n × N measurement matrix A, n < N, and Gaussian white noise z ∼ N(0, σ²I). Both y and A are known, both x_0 and z are unknown, and we ...

The sampling rate-distortion tradeoff for sparsity pattern recovery in compressed sensing

Journal Article IEEE Transactions on Information Theory · May 1, 2012 Recovery of the sparsity pattern (or support) of an unknown sparse vector from a limited number of noisy linear measurements is an important problem in compressed sensing. In the high-dimensional setting, it is known that recovery with a vanishing fraction ...
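Sparsity pattern recovery, the problem in this entry, can be illustrated with the simplest possible decoder: keep the k coordinates most correlated with the observations. This one-step thresholding baseline is my own sketch, not the paper's (information-theoretically optimal) analysis, but it shows the task concretely:

```python
import numpy as np

rng = np.random.default_rng(4)

def estimate_support(X, y, k):
    """Estimate the support of a k-sparse vector by keeping the k
    coordinates with the largest correlation |X^T y| (a simple
    one-step thresholding baseline)."""
    scores = np.abs(X.T @ y)
    return set(np.argsort(scores)[-k:].tolist())

# Generate a noisy compressed-sensing instance y = X beta + w.
n, p, k, sigma = 400, 1000, 5, 0.1
support = set(rng.choice(p, size=k, replace=False).tolist())
beta = np.zeros(p)
beta[list(support)] = 1.0
X = rng.standard_normal((n, p))
y = X @ beta + sigma * rng.standard_normal(n)

est = estimate_support(X, y, k)
print(len(est & support))  # number of true support indices recovered
```

The rate-distortion tradeoff in the paper quantifies how many samples n are needed for a target fraction of the support to be recoverable at a given SNR.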

A compressed sensing wire-tap channel

Journal Article 2011 IEEE Information Theory Workshop, ITW 2011 · December 21, 2011 A multiplicative Gaussian wire-tap channel inspired by compressed sensing is studied. Lower and upper bounds on the secrecy capacity are derived, and shown to be relatively tight in the large system limit for a large class of compressed sensing matrices. S ...

On the role of diversity in sparsity estimation

Journal Article IEEE International Symposium on Information Theory - Proceedings · October 26, 2011 A major challenge in sparsity pattern estimation is that small modes are difficult to detect in the presence of noise. This problem is alleviated if one can observe samples from multiple realizations of the nonzero values for the same sparsity pattern. We ...

"Compressed" compressed sensing

Journal Article IEEE International Symposium on Information Theory - Proceedings · August 23, 2010 The field of compressed sensing has shown that a sparse but otherwise arbitrary vector can be recovered exactly from a small number of randomly constructed linear projections (or samples). The question addressed in this paper is whether an even smaller num ...

A note on optimal support recovery in compressed sensing

Journal Article Conference Record - Asilomar Conference on Signals, Systems and Computers · December 1, 2009 Recovery of the support set (or sparsity pattern) of a sparse vector from a small number of noisy linear projections (or samples) is a "compressed sensing" problem that arises in signal processing and statistics. Although many computationally efficient rec ...

Managing massive time series streams with multi-scale compressed trickles

Journal Article Proceedings of the VLDB Endowment · January 1, 2009 We present Cypress, a novel framework to archive and query massive time series streams such as those generated by sensor networks, data centers, and scientific computing. Cypress applies multi-scale analysis to decompose time series and to obtain sparse re ...

Sampling bounds for sparse support recovery in the presence of noise

Journal Article IEEE International Symposium on Information Theory - Proceedings · September 29, 2008 It is well known that the support of a sparse signal can be recovered from a small number of random projections. However, in the presence of noise all known sufficient conditions require that the per-sample signal-to-noise ratio (SNR) grows without bound w ...

Differences between observation and sampling error in sparse signal reconstruction

Journal Article IEEE Workshop on Statistical Signal Processing Proceedings · December 1, 2007 The field of Compressed Sensing has shown that a relatively small number of random projections provide sufficient information to accurately reconstruct sparse signals. Inspired by applications in sensor networks in which each sensor is likely to observe a ...

Information Theoretic Analysis of the Impact of Refractory Effects on the P300 Speller

Conference The P300 speller is a brain-computer interface that enables people with severe neuromuscular disorders to communicate based on eliciting and detecting event-related potentials (ERP) in electroencephalography (EEG) measurements, in response to rare target s ...