Journal Article · IEEE Transactions on Information Theory · February 1, 2024
This paper considers the performance of Reed-Muller (RM) codes transmitted over binary memoryless symmetric (BMS) channels under bitwise maximum-a-posteriori (bit-MAP) decoding. Its main result is that, for a fixed BMS channel, the family of binary RM code ...
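For concreteness, a minimal sketch (a standard construction assumed here, not code from the paper) of the binary Reed-Muller generator matrix RM(r, m): its rows are the evaluations of all monomials of degree at most r in m binary variables, giving sum_{i<=r} C(m, i) rows and 2^m columns.

    import numpy as np
    from itertools import combinations

    def rm_generator(r, m):
        # evaluation points: column j holds the binary expansion of j
        pts = np.array([[(j >> i) & 1 for j in range(2 ** m)] for i in range(m)])
        rows = []
        for deg in range(r + 1):
            for S in combinations(range(m), deg):
                row = np.ones(2 ** m, dtype=int)
                for i in S:
                    row *= pts[i]            # evaluate the monomial prod_{i in S} x_i
                rows.append(row)
        return np.array(rows)

    G = rm_generator(2, 4)                   # RM(2, 4) is a [16, 11, 4] code
    print(G.shape)                           # (11, 16)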
Conference · IEEE International Symposium on Information Theory - Proceedings · January 1, 2024
This paper introduces a framework for approximate message passing (AMP) in dynamic settings where the data at each iteration is passed through a linear operator. This framework is motivated in part by applications in large-scale, distributed computing wher ...
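For orientation, a minimal sketch of a classical AMP iteration for the static linear model y = Ax + w with a soft-thresholding denoiser; the dynamic-data framework above generalizes this template, and the fixed threshold, iteration count, and scaling below are illustrative assumptions.

    import numpy as np

    def soft(u, t):
        return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

    def amp(y, A, theta=1.0, iters=30):
        n, p = A.shape
        x = np.zeros(p)
        z = y.copy()
        for _ in range(iters):
            r = x + A.T @ z                              # effective observation
            x_new = soft(r, theta)
            onsager = (z / n) * np.count_nonzero(x_new)  # Onsager correction for soft thresholding
            z = y - A @ x_new + onsager
            x = x_new
        return x

    # usage: A with (approximately) unit-norm columns, e.g.
    # A = np.random.randn(200, 400) / np.sqrt(200)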
Conference · IEEE International Symposium on Information Theory - Proceedings · January 1, 2023
Recently, the authors showed that Reed-Muller (RM) codes achieve capacity on binary memoryless symmetric (BMS) channels with respect to bit error rate. This paper extends that work by showing that RM codes defined on non-binary fields, known as generalized ...
Journal Article · Biometrika · March 1, 2022
In the main paper under subsection "3.2. Bayesian variable selection", all references to "5.2" should read: "3.1". Under subsection "5.2. Bayesian variable selection", the reference to "5.3 and 6" should read: "S5.3 and S6". These errors have now been corr ...
Conference · Proceedings of Machine Learning Research · January 1, 2022
Low-rank matrix recovery problems involving high-dimensional and heterogeneous data appear in applications throughout statistics and machine learning. The contribution of this paper is to establish the fundamental limits of recovery for a broad class of th ...
Conference · Advances in Neural Information Processing Systems · January 1, 2022
Sliced mutual information (SMI) is defined as an average of mutual information (MI) terms between one-dimensional random projections of the random variables. It serves as a surrogate measure of dependence to classic MI that preserves many of its properties ...
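A rough Monte Carlo sketch of this quantity: average a one-dimensional mutual information estimate over random projection directions. The k-nearest-neighbor estimator from scikit-learn and the sample sizes are illustrative assumptions, not the estimators analyzed in the paper.

    import numpy as np
    from sklearn.feature_selection import mutual_info_regression

    def sliced_mi(X, Y, n_proj=100, seed=0):
        rng = np.random.default_rng(seed)
        vals = []
        for _ in range(n_proj):
            theta = rng.standard_normal(X.shape[1]); theta /= np.linalg.norm(theta)
            phi = rng.standard_normal(Y.shape[1]); phi /= np.linalg.norm(phi)
            u, v = X @ theta, Y @ phi                # one-dimensional projections
            vals.append(mutual_info_regression(u.reshape(-1, 1), v)[0])
        return float(np.mean(vals))

    rng = np.random.default_rng(1)
    X = rng.standard_normal((2000, 5))
    Y = X + rng.standard_normal((2000, 5))           # dependent pair
    print(sliced_mi(X, Y))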
Journal Article · IEEE Transactions on Information Theory · August 1, 2021
We consider the distributional connection between the lossy compressed representation of a high-dimensional signal $X$ using a random spherical code and the observation of $X$ under an additive white Gaussian noise (AWGN). We show that the Wasserstein dist ...
Journal Article · Biometrika · June 2021
Posterior computation for high-dimensional data with many parameters can be challenging. This article focuses on a new method for approximating posterior distributions of a low- to moderate-dimensional parameter in the presence of a high-dimensional or oth ...
Conference · Proceedings of Machine Learning Research · January 1, 2021
The Gaussian-smoothed optimal transport (GOT) framework, recently proposed by Goldfeld et al., scales to high dimensions in estimation and provides an alternative to entropy regularization. This paper provides convergence guarantees for estimating the GOT ...
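A toy, one-dimensional illustration of the smoothing idea (arbitrary parameters; not the high-dimensional estimator studied in the paper): perturb both samples with independent Gaussian noise and compare the smoothed empirical distributions.

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)
    sigma = 0.5                                      # smoothing level (assumed)
    x = rng.standard_normal(5000)                    # samples from P
    y = rng.standard_normal(5000) + 1.0              # samples from Q
    x_s = x + sigma * rng.standard_normal(5000)      # P convolved with N(0, sigma^2)
    y_s = y + sigma * rng.standard_normal(5000)      # Q convolved with N(0, sigma^2)
    print(wasserstein_distance(x_s, y_s))            # plug-in smoothed W1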
Conference · Proceedings of Machine Learning Research · January 1, 2021
Understanding the impact of data structure on the computational tractability of learning is a key challenge for the theory of neural networks. Many theoretical works do not explicitly model training data, or assume that inputs are drawn component-wise inde ...
Chapter · January 1, 2021
The ability to understand and solve high-dimensional inference problems is essential for modern data science. This chapter examines high-dimensional inference problems through the lens of information theory and focuses on the standard linear model as a can ...
Journal Article · Entropy (Basel, Switzerland) · November 2020
This paper explores some applications of a two-moment inequality for the integral of the rth power of a function, where 0 < r < 1 ...
Journal Article · IEEE Journal on Selected Areas in Information Theory · November 1, 2020
This article studies a high-dimensional inference problem involving the matrix tensor product of random matrices. This problem generalizes a number of contemporary data science problems including the spiked matrix models used in sparse principal component ...
Conference · IEEE International Symposium on Information Theory - Proceedings · June 1, 2020
We consider a generalization of an important class of high-dimensional inference problems, namely spiked symmetric matrix models, often used as probabilistic models for principal component analysis. Such paradigmatic models have recently attracted a lot of ...
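A toy instance of a spiked symmetric (Wigner-type) matrix model of the kind referenced here, with the leading eigenvector as a naive spike estimate; the sizes, prior, and signal-to-noise ratio are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    n, snr = 1000, 2.0
    x = rng.choice([-1.0, 1.0], size=n)              # hidden rank-one spike
    G = rng.standard_normal((n, n))
    W = (G + G.T) / np.sqrt(2 * n)                   # symmetric Gaussian noise
    Y = np.sqrt(snr / n) * np.outer(x, x) + W        # observed matrix
    v = np.linalg.eigh(Y)[1][:, -1]                  # leading eigenvector (PCA estimate)
    print(abs(v @ x) / np.sqrt(n))                   # normalized overlap with the spike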
Journal Article · Mathematical Statistics and Learning · January 1, 2020
We study the problem of recovering a hidden binary k-sparse p-dimensional vector β from n noisy linear observations Y = Xβ + W, where X_ij are i.i.d. N(0, 1) and W_i are i.i.d. N(0, σ²). A closely related hypothesis testing problem is to distinguish the pai ...
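A synthetic instance of the observation model in this abstract, paired with a naive correlation-screening baseline; the baseline and all dimensions are illustrative assumptions rather than the procedures analyzed in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p, k, sigma = 100, 400, 5, 0.1
    support = rng.choice(p, size=k, replace=False)
    beta = np.zeros(p); beta[support] = 1.0          # hidden binary k-sparse vector
    X = rng.standard_normal((n, p))                  # X_ij i.i.d. N(0, 1)
    Y = X @ beta + sigma * rng.standard_normal(n)    # W_i i.i.d. N(0, sigma^2)

    est = np.argsort(np.abs(X.T @ Y))[-k:]           # keep the k largest correlations
    print(sorted(support), sorted(est))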
Journal Article · 2019 IEEE 8th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2019 - Proceedings · December 1, 2019
Community detection tasks have received a lot of attention across statistics, machine learning, and information theory with work concentrating on providing theoretical guarantees for different methodological approaches to the stochastic block model. Recent ...
Conference · 2019 IEEE 8th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2019 - Proceedings · December 1, 2019
We consider the problem of estimating a p-dimensional vector β from n observations Y = Xβ + W, where β_j ∼ i.i.d. π for a real-valued distribution π with zero mean and unit variance, X_ij ∼ i.i.d. N(0, 1), and W_i ∼ ...
Conference · 2019 57th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2019 · September 1, 2019
We study the problem of community detection when there is covariate information about the node labels and one observes multiple correlated networks. We provide an asymptotic upper bound on the per-node mutual information as well as a heuristic analysis of ...
Journal Article · IEEE International Symposium on Information Theory - Proceedings · July 1, 2019
The information-theoretic limits of community detection have been studied extensively for network models with high levels of symmetry or homogeneity. The contribution of this paper is to study a broader class of network models that allow for variability in ...
Conference · IEEE International Symposium on Information Theory - Proceedings · July 1, 2019
We consider the statistical connection between the quantized representation of a high dimensional signal X using a random spherical code and the observation of X under an additive white Gaussian noise (AWGN). We show that given X, the conditional Wasserste ...
Journal Article · IEEE Transactions on Information Theory · April 1, 2019
This paper considers the fundamental limit of random linear estimation for i.i.d. signal distributions and i.i.d. Gaussian measurement matrices. Its main contribution is a rigorous characterization of the asymptotic mutual information (MI) and minimum mean ...
Conference · 36th International Conference on Machine Learning, ICML 2019 · January 1, 2019
Data collection and sharing are pervasive aspects of modern society. This process can either be voluntary, as in the case of a person taking a facial image to unlock his/her phone, or incidental, such as traffic cameras collecting videos on pedestrians. An ...
Conference · Proceedings of Machine Learning Research · January 1, 2019
We study the problem of recovering a hidden binary k-sparse p-dimensional vector β from n noisy linear observations Y = Xβ + W where Xij are i.i.d. N(0, 1) and Wi are i.i.d. N(0, σ2). A closely related hypothesis testing problem is to distinguish the pair ...
Conference · IEEE International Symposium on Information Theory - Proceedings · August 15, 2018
This paper focuses on the mutual information and minimum mean-squared error (MMSE) as a function of a matrix-valued signal-to-noise ratio (SNR) for a linear Gaussian channel with arbitrary input distribution. As shown by Lamarca, the mutual information is a c ...
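A scalar-case numerical check of the I-MMSE relationship that underlies this line of work (the paper itself treats matrix-valued SNR and arbitrary inputs): for Y = sqrt(snr)·X + N with standard Gaussian X and N, the derivative of the mutual information in snr equals half the MMSE.

    import numpy as np

    def mi(snr):                                     # mutual information in nats
        return 0.5 * np.log(1.0 + snr)

    def mmse(snr):                                   # minimum mean-squared error
        return 1.0 / (1.0 + snr)

    snr, h = 3.0, 1e-6
    deriv = (mi(snr + h) - mi(snr - h)) / (2 * h)    # numerical dI/dsnr
    print(deriv, mmse(snr) / 2)                      # both approximately 0.125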
Conference · IEEE International Symposium on Information Theory - Proceedings · August 15, 2018
Theoretical and experimental results have shown that compressed sensing with quantization can perform well if the signal is very sparse, the noise is very low, and the bitrate is sufficiently large. However, a precise characterization of the fundamental tr ...
Conference · IEEE International Symposium on Information Theory - Proceedings · August 9, 2017
We consider the problem of recovering a sparse vector from a quantized or a lossy compressed version of its noisy random linear projections. We characterize the minimal distortion in this recovery as a function of the sampling ratio, the sparsity rate, the ...
Conference · IEEE International Symposium on Information Theory - Proceedings · August 9, 2017
This paper explores some applications of a two-moment inequality for the integral of the r-th power of a function, where 0 < r < 1. The first contribution is an upper bound on the Rényi entropy of a random vector in terms of the two different moments. When ...
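For reference, the textbook definition of the Rényi differential entropy of order r that such a bound targets (standard background, not the paper's inequality): for a random vector X with density f on R^d,

    h_r(X) = \frac{1}{1 - r} \log \int_{\mathbb{R}^d} f(x)^r \, dx , \qquad 0 < r < 1,

so, since 1/(1 - r) > 0 in this range, any upper bound on the integral of f^r in terms of moments of X immediately yields an upper bound on h_r(X).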
Conference · IEEE International Symposium on Information Theory - Proceedings · August 9, 2017
This paper addresses the question of when projections of a high-dimensional random vector are approximately Gaussian. This problem has been studied previously in the context of high-dimensional data analysis, where the focus is on low-dimensional projectio ...
Journal Article · Journal of neural engineering · August 2017
Objective: The role of a brain-computer interface (BCI) is to discern a user's intended message or action by extracting and decoding relevant information from brain signals. Stimulus-driven BCIs, such as the P300 speller, rely on detecting event-rel ...
Conference · 55th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2017 · July 1, 2017
Multilayer (or deep) networks are powerful probabilistic models based on multiple stages of a linear transform followed by a non-linear (possibly random) function. In general, the linear transforms are defined by matrices and the nonlinear functions are de ...
Conference · ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings · June 16, 2017
The P300-based brain-computer interface (BCI) speller relies on eliciting and detecting specific brain responses to target stimulus events, termed event-related potentials (ERPs). In a visual speller, ERPs are elicited when the user's desired character, i. ...
Conference · 54th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2016 · February 10, 2017
The P300 speller is a brain-computer interface that enables people with severe neuromuscular disorders to communicate. It is based on eliciting and detecting event-related potentials (ERP) in electroencephalography (EEG) measurements, in response to rare t ...
Conference · IEEE Transactions on Information Theory · November 1, 2016
This paper offers a characterization of fundamental limits on the classification and reconstruction of high-dimensional signals from low-dimensional features, in the presence of side information. We consider a scenario where a decoder has access both to li ...
Conference · IEEE International Symposium on Information Theory - Proceedings · August 10, 2016
This paper considers the fundamental limit of compressed sensing for i.i.d. signal distributions and i.i.d. Gaussian measurement matrices. Its main contribution is a rigorous characterization of the asymptotic mutual information (MI) and minimum mean-squar ...
Conference · IEEE International Symposium on Information Theory - Proceedings · September 28, 2015
This paper offers a characterization of performance limits for classification and reconstruction of high-dimensional signals from noisy compressive measurements, in the presence of side information. We assume the signal of interest and the side information ...
Conference · 2015 IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2015 · January 1, 2015
Probabilistically quantifying uncertainty in parameters, predictions and decisions is a crucial component of broad scientific and engineering applications. This is however difficult if the number of parameters far exceeds the sample size. Although there ar ...
Journal Article · IEEE International Symposium on Information Theory - Proceedings · January 1, 2014
Compressed sensing has shown that a wide variety of structured signals can be recovered from a limited number of noisy linear measurements. This paper considers the extent to which such recovery is robust to signal and measurement uncertainty. The main res ...
Journal Article · 2013 5th IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2013 · December 1, 2013
Recent results in compressed sensing have shown that a wide variety of structured signals can be recovered from undersampled and noisy linear observations. In this paper, we show that many of these signal structures can be modeled using a union of affine ...
Journal Article · IEEE Transactions on Information Theory · May 23, 2013
Recovery of the sparsity pattern (or support) of an unknown sparse vector from a small number of noisy linear measurements is an important problem in compressed sensing. In this paper, the high-dimensional setting is considered. It is shown that if the mea ...
Journal Article · IEEE International Symposium on Information Theory - Proceedings · January 1, 2013
Consider the compressed sensing problem of estimating an unknown k-sparse n-vector from a set of m noisy linear equations. Recent work focused on the noise sensitivity of particular algorithms - the scaling of the reconstruction error with added noise. In ...
Journal Article · IEEE International Symposium on Information Theory - Proceedings · January 1, 2013
Recent work on Approximate Message Passing algorithms in compressed sensing focuses on 'ideal' algorithms which at each iteration face a subproblem of recovering an unknown sparse signal in Gaussian white noise. The noise level in each subproblem changes f ...
Journal Article · 2012 46th Annual Conference on Information Sciences and Systems, CISS 2012 · November 12, 2012
In recent work, two different methods have been used to characterize the fundamental limits of compressed sensing. On the one hand are rigorous bounds based on information-theoretic arguments or the analysis of specific algorithms. On the other hand are ex ...
Journal Article · IEEE International Symposium on Information Theory - Proceedings · October 22, 2012
Many papers studying compressed sensing consider the noisy underdetermined system of linear equations: y = Ax_0 + z, with an n × N measurement matrix A, n < N, and Gaussian white noise z ∼ N(0, σ²I). Both y and A are known, both x_0 and z are unknown, and we ...
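A toy instance of this noisy underdetermined system, recovered with the LASSO from scikit-learn; the problem sizes and regularization level are arbitrary illustrative choices, not the tuned settings whose limits are analyzed here.

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, N, k, sigma = 100, 300, 10, 0.05
    x0 = np.zeros(N)
    x0[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)
    A = rng.standard_normal((n, N)) / np.sqrt(n)     # n x N measurement matrix
    y = A @ x0 + sigma * rng.standard_normal(n)      # z ~ N(0, sigma^2 I)

    fit = Lasso(alpha=0.01).fit(A, y)
    print(np.mean((fit.coef_ - x0) ** 2))            # per-coordinate reconstruction MSE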
Journal Article · IEEE Transactions on Information Theory · May 1, 2012
Recovery of the sparsity pattern (or support) of an unknown sparse vector from a limited number of noisy linear measurements is an important problem in compressed sensing. In the high-dimensional setting, it is known that recovery with a vanishing fraction ...
Journal Article · 2011 IEEE Information Theory Workshop, ITW 2011 · December 21, 2011
A multiplicative Gaussian wire-tap channel inspired by compressed sensing is studied. Lower and upper bounds on the secrecy capacity are derived, and shown to be relatively tight in the large system limit for a large class of compressed sensing matrices. S ...
Journal Article · IEEE International Symposium on Information Theory - Proceedings · October 26, 2011
A major challenge in sparsity pattern estimation is that small modes are difficult to detect in the presence of noise. This problem is alleviated if one can observe samples from multiple realizations of the nonzero values for the same sparsity pattern. We ...
Journal Article · IEEE International Symposium on Information Theory - Proceedings · August 23, 2010
The field of compressed sensing has shown that a sparse but otherwise arbitrary vector can be recovered exactly from a small number of randomly constructed linear projections (or samples). The question addressed in this paper is whether an even smaller num ...
Journal Article · Conference Record - Asilomar Conference on Signals, Systems and Computers · December 1, 2009
Recovery of the support set (or sparsity pattern) of a sparse vector from a small number of noisy linear projections (or samples) is a "compressed sensing" problem that arises in signal processing and statistics. Although many computationally efficient rec ...
Journal Article · Proceedings of the VLDB Endowment · January 1, 2009
We present Cypress, a novel framework to archive and query massive time series streams such as those generated by sensor networks, data centers, and scientific computing. Cypress applies multi-scale analysis to decompose time series and to obtain sparse re ...
Journal Article · IEEE International Symposium on Information Theory - Proceedings · September 29, 2008
It is well known that the support of a sparse signal can be recovered from a small number of random projections. However, in the presence of noise all known sufficient conditions require that the per-sample signal-to-noise ratio (SNR) grows without bound w ...
Journal Article · IEEE Workshop on Statistical Signal Processing Proceedings · December 1, 2007
The field of Compressed Sensing has shown that a relatively small number of random projections provide sufficient information to accurately reconstruct sparse signals. Inspired by applications in sensor networks in which each sensor is likely to observe a ...
Conference
The P300 speller is a brain-computer interface that enables people with severe neuromuscular disorders to communicate based on eliciting and detecting event-related potentials (ERP) in electroencephalography (EEG) measurements, in response to rare target s ...