Distributed optimization in multi-agent systems under sparsity constraints has recently received considerable attention. In this paper, we consider the in-network minimization of a continuously differentiable nonlinear function that is a combination of local agent objective functions, subject to sparsity constraints on the variables. A crucial issue in in-network optimization is the handling of communication, which may be expensive. This calls for efficient algorithms able to reduce the number of required communication links and transmitted messages. To this end, we focus on asynchronous and randomized distributed techniques. Based on consensus techniques and iterative hard thresholding methods, we propose three methods that attempt to minimize the given function while promoting sparsity of the solution: asynchronous hard thresholding (AHT), broadcast hard thresholding (BHT), and gossip hard thresholding (GHT). Although the algorithms are similar in many aspects, a unified analysis is difficult to obtain. Specifically, we theoretically prove the convergence and characterize the limit points of AHT in regular networks under suitable assumptions on the functions to be minimized. For BHT and GHT, instead, we characterize the fixed points of the maps that rule their dynamics in terms of stationary points of the original problem. Finally, we illustrate the implementation of our techniques in compressed sensing and present several numerical results on performance and on the number of transmissions required for convergence.
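As background to the abstract, the building block shared by AHT, BHT, and GHT is iterative hard thresholding (IHT). A minimal centralized IHT sketch for the compressed sensing setting is given below; it is an illustrative assumption for context, not the paper's distributed algorithms, and all names and parameters (`hard_threshold`, `iht`, the step size choice) are ours.

```python
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]
    out[keep] = x[keep]
    return out

def iht(A, y, s, mu=None, iters=200):
    """Centralized IHT for: minimize ||y - A x||_2^2 subject to ||x||_0 <= s."""
    if mu is None:
        mu = 1.0 / np.linalg.norm(A, 2) ** 2  # step size from the spectral norm of A
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        # gradient step on the least-squares cost, then projection onto s-sparse vectors
        x = hard_threshold(x + mu * A.T @ (y - A @ x), s)
    return x

# toy noiseless compressed-sensing recovery
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100)) / np.sqrt(50)
x_true = np.zeros(100)
x_true[[3, 27, 64]] = [1.0, -2.0, 1.5]
x_hat = iht(A, A @ x_true, s=3)
```

In the paper's distributed variants, each agent runs local steps of this kind on its own objective and exchanges estimates with neighbors asynchronously (via broadcast or gossip) to reach consensus on a sparse solution.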
Randomized Algorithms for Distributed Nonlinear Optimization Under Sparsity Constraints / Ravazzi, Chiara; Fosson, Sophie; Magli, Enrico. In: IEEE Transactions on Signal Processing, ISSN 1053-587X, 64(6) (2016), pp. 1420-1434. DOI: 10.1109/TSP.2015.2500887
Randomized Algorithms for Distributed Nonlinear Optimization Under Sparsity Constraints
Ravazzi, Chiara; Fosson, Sophie; Magli, Enrico
2016
File | Size | Format
---|---|---
double_final.pdf (open access; Type: 2. Post-print / Author's Accepted Manuscript; License: PUBLIC - All rights reserved) | 946.98 kB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/2642956
Note: the data displayed have not been validated by the university.