Sparse learning with concave regularization: relaxation of the irrepresentable condition / Cerone, V.; Fosson, S.; Regruto, D.; Salam, A. - ELECTRONIC. - (2020), pp. 396-401. (Paper presented at the 59th IEEE Conference on Decision and Control (CDC), held in Jeju Island (Republic of Korea), December 14-18, 2020) [10.1109/CDC42340.2020.9304508].
Sparse learning with concave regularization: relaxation of the irrepresentable condition
V. Cerone; S. Fosson; D. Regruto; A. Salam
2020
Abstract
Learning sparse models from data is an important task in all frameworks where relevant information must be identified within a large dataset. This can be achieved by formulating and solving suitable sparsity-promoting optimization problems. For linear regression models, the Lasso is the most popular convex approach, based on L1-norm regularization. In contrast, in this paper we analyse a concave-regularized approach and prove that it relaxes the irrepresentable condition, which is sufficient and essentially necessary for the Lasso to select the correct significant parameters. In practice, this reduces the number of measurements required with respect to the Lasso. Since the proposed problem is nonconvex, we also discuss different algorithms to solve it, and we illustrate the obtained enhancement via numerical experiments.
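As a minimal notational sketch of what the abstract contrasts, the block below states the standard Lasso problem, a generic concave-regularized counterpart, and the irrepresentable condition in its usual form from the Lasso support-recovery literature. The symbols (design matrix X, measurements y, parameter vector β, concave penalty g, regularization weight λ, true support S, margin η) are illustrative assumptions; the paper's specific penalty and constants are not reported in this record.

```latex
% Lasso: convex L1-regularized least squares
\[
\hat{\beta}_{\mathrm{Lasso}} \in \arg\min_{\beta \in \mathbb{R}^p}
  \tfrac{1}{2}\,\| y - X\beta \|_2^2 + \lambda \| \beta \|_1
\]

% Concave-regularized counterpart: the L1 norm is replaced by a
% separable concave penalty g (the paper's specific choice of g is
% not given in this record)
\[
\hat{\beta}_{\mathrm{conc}} \in \arg\min_{\beta \in \mathbb{R}^p}
  \tfrac{1}{2}\,\| y - X\beta \|_2^2 + \lambda \sum_{i=1}^{p} g\!\left( |\beta_i| \right)
\]

% Irrepresentable condition for Lasso support recovery, in its usual
% form: S is the true support, S^c its complement, \eta > 0
\[
\left\| X_{S^c}^{\top} X_S \left( X_S^{\top} X_S \right)^{-1}
  \operatorname{sign}(\beta_S) \right\|_{\infty} \le 1 - \eta
\]
```

According to the abstract, the concave penalty relaxes this last condition, which in practice allows correct support selection from fewer measurements than the Lasso requires.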
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| CDC2020_v3.0.pdf | Open access | 2. Post-print / Author's Accepted Manuscript | Public - All rights reserved | 312.03 kB | Adobe PDF |
| 09304508.pdf | Not available (request a copy) | 2a Post-print editorial version / Version of Record | Not public - Private/restricted access | 359.27 kB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/2846133