Draft & Refine: Efficient Resource Management in Federated Learning under Pathological Labels Skew / Malan, Erich; Peluso, Valentino; Calimera, Andrea; Macii, Enrico. - (2024), pp. 1-4. (Paper presented at the International Conference on Electronics Circuits and Systems (ICECS) 2024, held in Nancy, France, 18-20 November 2024) [10.1109/ICECS61496.2024.10849333].

Draft & Refine: Efficient Resource Management in Federated Learning under Pathological Labels Skew

Malan, Erich; Peluso, Valentino; Calimera, Andrea; Macii, Enrico
2024

Abstract

In Federated Learning (FL), network end-nodes use private data for the local training of classification models that are periodically synchronized with a remote server to update a global model. A key limitation in FL is the high network traffic required to achieve acceptable accuracy, particularly when each client has samples of only a few classes, a problem known as pathological labels skew. Under such constraints, local models become biased toward the available classes, and more synchronization rounds are needed to converge, saturating the communication budget. We address this issue with Draft & Refine, a new resource management strategy to optimize accuracy by dynamically controlling clients' participation based on their remaining budget. Our approach consists of two learning phases: a low-effort drafting phase with only a few clients selected for synchronization, followed by a refinement phase with increased client participation. The available data traffic is used as the control variable to set the duration of the two phases. The proposed strategy outperforms state-of-the-art FL schemes with up to 12.20 % higher accuracy within the same data traffic.
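The abstract describes a two-phase schedule in which the data-traffic budget controls when to switch from a low-participation drafting phase to a high-participation refinement phase. A minimal sketch of such budget-controlled client selection is shown below; all names and parameters (`draft_fraction`, `draft_k`, `refine_k`, the per-round cost model) are illustrative assumptions, not the paper's actual implementation.

```python
import random

def select_clients(clients, budget_total, model_size_bytes,
                   draft_fraction=0.5, draft_k=2, refine_k=8):
    """Yield per-round client subsets under a total traffic budget.

    Drafting phase: few clients per round until draft_fraction of the
    budget is spent; refinement phase: more clients per round until the
    remaining budget cannot cover another full round.
    """
    spent = 0
    # assume each selected client downloads and uploads the model once per round
    cost_per_client = 2 * model_size_bytes
    while spent < budget_total:
        in_draft = spent < draft_fraction * budget_total
        k = draft_k if in_draft else refine_k
        round_cost = k * cost_per_client
        if spent + round_cost > budget_total:
            break  # not enough budget left for another full round
        spent += round_cost
        yield random.sample(clients, k)
```

For example, with a unit model size and a budget of 100 units, the schedule above runs 13 drafting rounds of 2 clients followed by 3 refinement rounds of 8 clients, exhausting the budget exactly.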
ISBN: 979-8-3503-7720-0
Files in this record:
File: Draft_amp_Refine_Efficient_Resource_Management_in_Federated_Learning_under_Pathological_Labels_Skew.pdf
Access: restricted
Type: 2a Post-print editorial version / Version of Record
License: Non-public - Private/restricted access
Size: 230.57 kB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2992843