Iteratively Re-weighted Least Squares minimization: Proof of faster than linear rate for sparse recovery

Publication, Journal Article
Daubechies, I; DeVore, R; Fornasier, M; Güntürk, S
Published in: CISS 2008, The 42nd Annual Conference on Information Sciences and Systems
September 24, 2008

Given an m × N matrix Φ, with m < N, the system of equations Φx = y is typically underdetermined and has infinitely many solutions. Various forms of optimization can extract a "best" solution. One of the oldest is to select the one with minimal l2 norm. It has been shown that in many applications a better choice is the minimal l1 norm solution. This is the case in Compressive Sensing, when sparse solutions are sought. The minimal l1 norm solution can be found by using linear programming; an alternative method is Iterative Re-weighted Least Squares (IRLS), which in some cases is numerically faster. The main step of IRLS finds, for a given weight w, the solution with smallest l2(w) norm; this weight is updated at every iteration step: if x(n) is the solution at step n, then w(n) is defined by wi(n) := 1/|xi(n)|, i = 1, ..., N. We give a specific recipe for updating weights that avoids technical shortcomings in other approaches, and for which we can prove convergence under certain conditions on the matrix Φ, known as the Restricted Isometry Property. We also show that if there is a sparse solution, then the limit of the proposed algorithm is that sparse solution. It is also shown that whenever the solution at a given iteration is sufficiently close to the limit, the remaining steps of the algorithm converge exponentially fast. In the standard version of the algorithm, designed to emulate l1-minimization, the exponential rate is linear; in adapted versions aimed at lτ-minimization with τ < 1, we prove a faster than linear rate. © 2008 IEEE.
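
To make the weighted least-squares step concrete, the following Python/NumPy sketch runs a basic IRLS iteration for Φx = y. It is an illustration only, not the specific algorithm analyzed in the paper: the function name irls_sparse_recovery and the parameters n_iter and eps are invented for this example, and the fixed floor eps that keeps the weights finite is a simplification standing in for the paper's particular weight-update recipe.

import numpy as np

def irls_sparse_recovery(Phi, y, n_iter=30, eps=1e-6):
    # Minimal IRLS sketch for the underdetermined system Phi @ x = y.
    # Each step solves the weighted least-squares problem
    #     minimize sum_i w_i * x_i**2   subject to   Phi @ x = y,
    # whose closed-form solution is x = D Phi^T (Phi D Phi^T)^{-1} y
    # with D = diag(1/w_i), and then re-weights via w_i = 1/|x_i|.
    # The fixed floor eps (an assumption of this sketch) keeps the
    # weights finite when entries of x approach zero.
    m, N = Phi.shape
    w = np.ones(N)                     # uniform weights: first step is plain least squares
    x = np.zeros(N)
    for _ in range(n_iter):
        D = np.diag(1.0 / w)           # inverse weights on the diagonal
        x = D @ Phi.T @ np.linalg.solve(Phi @ D @ Phi.T, y)
        w = 1.0 / np.maximum(np.abs(x), eps)   # re-weighting step
    return x

# Usage: recover a 5-sparse vector from 30 random Gaussian measurements.
rng = np.random.default_rng(0)
m, N, k = 30, 100, 5
Phi = rng.standard_normal((m, N)) / np.sqrt(m)
x_true = np.zeros(N)
x_true[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)
y = Phi @ x_true
x_hat = irls_sparse_recovery(Phi, y)
print("recovery error:", np.linalg.norm(x_hat - x_true))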

Published In

CISS 2008, The 42nd Annual Conference on Information Sciences and Systems

DOI

10.1109/CISS.2008.4558489

Publication Date

September 24, 2008

Start / End Page

26 / 29

Citation

APA
Daubechies, I., DeVore, R., Fornasier, M., & Güntürk, S. (2008). Iteratively Re-weighted Least Squares minimization: Proof of faster than linear rate for sparse recovery. CISS 2008, The 42nd Annual Conference on Information Sciences and Systems, 26–29. https://doi.org/10.1109/CISS.2008.4558489

Chicago
Daubechies, I., R. DeVore, M. Fornasier, and S. Güntürk. “Iteratively Re-weighted Least Squares minimization: Proof of faster than linear rate for sparse recovery.” CISS 2008, The 42nd Annual Conference on Information Sciences and Systems, September 24, 2008, 26–29. https://doi.org/10.1109/CISS.2008.4558489.

ICMJE
Daubechies I, DeVore R, Fornasier M, Güntürk S. Iteratively Re-weighted Least Squares minimization: Proof of faster than linear rate for sparse recovery. CISS 2008, The 42nd Annual Conference on Information Sciences and Systems. 2008 Sep 24;26–9.

MLA
Daubechies, I., et al. “Iteratively Re-weighted Least Squares minimization: Proof of faster than linear rate for sparse recovery.” CISS 2008, The 42nd Annual Conference on Information Sciences and Systems, Sept. 2008, pp. 26–29. Scopus, doi:10.1109/CISS.2008.4558489.

NLM
Daubechies I, DeVore R, Fornasier M, Güntürk S. Iteratively Re-weighted Least Squares minimization: Proof of faster than linear rate for sparse recovery. CISS 2008, The 42nd Annual Conference on Information Sciences and Systems. 2008 Sep 24;26–29.
