
Adaptive Client Participation in Budget-Constrained Federated Learning with Extreme Label Skew / Malan, Erich; Peluso, Valentino; Calimera, Andrea; Macii, Enrico. - (2025), pp. 15-19. (Paper presented at the 2025 23rd IEEE Interregional NEWCAS Conference (NEWCAS), held in Paris (FRA), 22-25 June 2025) [10.1109/newcas64648.2025.11107023].

Adaptive Client Participation in Budget-Constrained Federated Learning with Extreme Label Skew

Malan, Erich;Peluso, Valentino;Calimera, Andrea;Macii, Enrico
2025

Abstract

Federated Learning (FL) is a distributed machine learning technique in which multiple clients collaboratively train a global classification model while ensuring that private data stays decentralized. However, under skewed label distributions, FL struggles to achieve high accuracy. The problem is further exacerbated when clients have limited energy budgets that prevent their full participation throughout training. To address these limitations, we propose FL with Adaptive Concurrency via Gradient Feedback (FedAGF), a control strategy that dynamically adapts the number of clients selected for synchronization. FedAGF continuously monitors the global model updates and increases concurrency only when needed, that is, when learning stalls, thereby optimizing the allocation of the clients' energy budgets. Experimental results on the CIFAR-10 and CIFAR-100 datasets demonstrate that FedAGF improves accuracy by up to 14.14% compared to existing methods.
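The abstract describes FedAGF only at a high level: monitor the global model's progress and raise the number of participating clients when learning stalls. The sketch below illustrates what such a gradient-feedback concurrency rule might look like; the function name, the plateau-based stall test, the doubling rule, and all thresholds are illustrative assumptions, not the authors' actual method.

```python
def select_concurrency(progress_history, current_k, k_max, patience=3, tol=1e-3):
    """Return the number of clients to select for the next round.

    progress_history: per-round values of a monitored signal
        (e.g. global loss or update magnitude) -- assumed, not from the paper.
    current_k: clients selected in the current round.
    k_max: hard cap on concurrency (e.g. total available clients).
    """
    if len(progress_history) >= patience:
        recent = progress_history[-patience:]
        # "Stall" heuristic: the signal barely moved over the last
        # `patience` rounds, so spend more of the budget on concurrency.
        if max(recent) - min(recent) < tol:
            return min(current_k * 2, k_max)
    # Otherwise keep concurrency low to conserve client energy budgets.
    return current_k
```

A server loop would call this once per round, feeding it the monitored signal and using the returned value to sample clients for the next synchronization step.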
2025
ISBN: 979-8-3315-3256-7
Files in this record:

NEWCAS_2025___Camera_Ready___Deadline_2_Maggio.pdf
- Access: restricted
- Type: 2. Post-print / Author's Accepted Manuscript
- License: Non-public - Private/restricted access
- Size: 247.1 kB
- Format: Adobe PDF

Adaptive_Client_Participation_in_Budget-Constrained_Federated_Learning_with_Extreme_Label_Skew.pdf
- Access: restricted
- Type: 2a Post-print editorial version / Version of Record
- License: Non-public - Private/restricted access
- Size: 342.69 kB
- Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/3002807