Data-Driven and Privacy-Preserving Cooperation in Decentralized Learning / Malandrino, Francesco; Barroso Fernandez, Carlos; Bernardos Cano, Carlos J.; Chiasserini, Carla Fabiana; De La Oliva, Antonio; Onsori, Mahyar. - ELECTRONIC. - (2024). (Paper presented at IEEE LCN 2024, held in Caen, France, 8-10 October 2024) [10.1109/LCN60385.2024.10639653].

Data-Driven and Privacy-Preserving Cooperation in Decentralized Learning

Carlos Barroso Fernandez; Carla Fabiana Chiasserini; Mahyar Onsori
2024

Abstract

Decentralized learning scenarios offer the opportunity for flexible cooperation between learning nodes: each node may cooperate with an arbitrary subset of its peers. In such scenarios, we tackle the problem of choosing which nodes cooperate towards the training of a machine learning model, thereby shaping the cooperation graph connecting the nodes themselves. We propose and evaluate a data-driven approach to the problem, introducing three metrics to choose the edges to activate in the cooperation graph and an efficient iterative algorithm exploiting them. Through our performance evaluation, which leverages state-of-the-art datasets and neural network architectures, we find that privacy-preserving metrics accounting for the difference between local datasets are very effective at identifying the best edges to activate, improving the efficiency of model training without hurting performance.
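To make the idea of metric-driven edge activation concrete, the following is a minimal illustrative sketch, not the paper's actual algorithm or metrics. It assumes that each node shares only a normalized label histogram of its local dataset (a privacy-preserving summary), that the difference between two nodes is measured with total-variation distance, and that a greedy rule activates a fixed budget of edges between the most dissimilar pairs first; the function names (label_histogram, tv_distance, greedy_edge_activation), the choice of metric, and the selection direction are all assumptions introduced here for illustration.

from itertools import combinations

import numpy as np


def label_histogram(labels, num_classes):
    # Privacy-preserving summary of a local dataset: only class counts
    # (never raw samples) leave the node.
    counts = np.bincount(labels, minlength=num_classes).astype(float)
    return counts / counts.sum()


def tv_distance(p, q):
    # Total-variation distance between two label distributions.
    return 0.5 * np.abs(p - q).sum()


def greedy_edge_activation(histograms, budget):
    # Illustrative greedy rule: activate the `budget` edges whose endpoints
    # have the most dissimilar label distributions. (The paper evaluates
    # several metrics and selection strategies; this rule is an assumption.)
    nodes = list(histograms)
    scored = [
        (tv_distance(histograms[u], histograms[v]), (u, v))
        for u, v in combinations(nodes, 2)
    ]
    scored.sort(reverse=True)  # most dissimilar pairs first
    return [edge for _, edge in scored[:budget]]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    num_classes = 10
    # Each node holds a skewed local dataset; only its histogram is shared.
    hists = {
        n: label_histogram(
            rng.choice(num_classes, size=500,
                       p=np.roll([0.4, 0.3, 0.1] + [0.2 / 7] * 7, n)),
            num_classes)
        for n in range(5)
    }
    print(greedy_edge_activation(hists, budget=4))

In a full decentralized-learning loop, the activated edges would define which peers exchange and average model updates at each round; the sketch only covers the edge-selection step described in the abstract.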
Files in this record:

File: 1571038585 paper (1).pdf
Access: open access
Type: 2. Post-print / Author's Accepted Manuscript
License: Public - All rights reserved
Size: 351.94 kB
Format: Adobe PDF

File: Data-Driven_and_Privacy-Preserving_Cooperation_in_Decentralized_Learning.pdf
Access: restricted access
Type: 2a Post-print, publisher's version / Version of Record
License: Non-public - Private/restricted access
Size: 2.27 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2991927