Gaussian Mixtures Based IRLS for Sparse Recovery With Quadratic Convergence / Ravazzi, Chiara; Magli, Enrico. - In: IEEE TRANSACTIONS ON SIGNAL PROCESSING. - ISSN 1053-587X. - Print. - 63:13 (2015), pp. 3474-3489. [DOI: 10.1109/TSP.2015.2428216]
Gaussian Mixtures Based IRLS for Sparse Recovery With Quadratic Convergence
Ravazzi, Chiara; Magli, Enrico
2015
Abstract
In this paper, we propose a new class of iteratively re-weighted least squares (IRLS) algorithms for sparse recovery problems. The proposed methods are inspired by constrained maximum-likelihood estimation under a Gaussian scale mixture (GSM) distribution assumption. In the noise-free setting, we provide sufficient conditions ensuring the convergence of the sequences generated by these algorithms to the set of fixed points of the maps that govern their dynamics, and we derive a posteriori verifiable conditions for convergence to a sparse solution. We further prove that these algorithms converge quadratically in a neighborhood of a sparse solution. We show through numerical experiments that the proposed methods outperform classical IRLS for ℓ_p-minimization with p ∈ (0, 1] in terms of both speed and sparsity-undersampling tradeoff, and that they are robust even in the presence of noise. The simplicity and the theoretical guarantees provided in this paper make this class of algorithms an attractive solution for sparse recovery problems.
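For context, the sketch below is a minimal NumPy implementation of the classical smoothed-IRLS baseline for noise-free ℓ_p-minimization that the abstract compares against; it is not the paper's GSM-based algorithm, whose update rules are not given in this record. The function name `irls_lp`, the ε-decay schedule, and all parameter values are illustrative assumptions.

```python
import numpy as np

def irls_lp(A, y, p=1.0, eps=1.0, max_iter=100, tol=1e-8):
    """Smoothed IRLS for noise-free l_p-minimization, p in (0, 1].
    Classical baseline only (not the paper's GSM-based method);
    all parameter choices here are illustrative."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]    # least-squares initialization
    for _ in range(max_iter):
        # inverse of the smoothed l_p weights: 1/w_i = (x_i^2 + eps)^(1 - p/2)
        w_inv = (x**2 + eps) ** (1.0 - p / 2.0)
        # weighted least-squares step under the constraint A x = y:
        #   x <- W^{-1} A^T (A W^{-1} A^T)^{-1} y,  with W = diag(w)
        AW = A * w_inv                           # equals A @ diag(w_inv)
        x_new = AW.T @ np.linalg.solve(AW @ A.T, y)
        if np.linalg.norm(x_new - x) <= tol * max(np.linalg.norm(x), 1.0):
            x = x_new
            break
        x = x_new
        eps = max(eps / 10.0, 1e-12)             # heuristic smoothing decay
    return x

# Toy usage (hypothetical sizes): recover a 5-sparse vector from 40 measurements.
rng = np.random.default_rng(0)
n, m, k = 100, 40, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x_hat = irls_lp(A, A @ x_true, p=0.8)
print(np.linalg.norm(x_hat - x_true))
```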
| File | Description | Type | License | Access | Size | Format |
|---|---|---|---|---|---|---|
| FullText_RavazziTransSP2015.pdf | Post-print (author version) | 2. Post-print / Author's Accepted Manuscript | Public - All rights reserved | Open access | 753.86 kB | Adobe PDF |
https://hdl.handle.net/11583/2617379