Malan, Erich; Peluso, Valentino; Calimera, Andrea; Macii, Enrico. "Communication-Efficient Federated Learning with Gradual Layer Freezing." IEEE Embedded Systems Letters, ISSN 1943-0663, vol. 15, no. 1 (2023), pp. 25-28. DOI: 10.1109/LES.2022.3190682

Communication-Efficient Federated Learning with Gradual Layer Freezing

Erich Malan; Valentino Peluso; Andrea Calimera; Enrico Macii
2023

Abstract

Federated learning (FL) is a collaborative, privacy-preserving method for training deep neural networks at the edge of the Internet of Things (IoT). Despite its many advantages, existing FL implementations suffer from high communication costs that prevent adoption at scale. Specifically, the frequent model updates exchanged between the central server and the many end nodes are a source of channel congestion and high energy consumption. This letter tackles this aspect by introducing federated learning with gradual layer freezing (FedGLF), a novel FL scheme that gradually reduces the portion of the model sent back and forth, relieving the communication burden while preserving the quality of the training service. The results collected on two image classification tasks learned with different data distributions show that FedGLF outperforms conventional FL schemes, with data volume savings ranging from 14% to 59% or up to 2.5% higher accuracy.
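For intuition only, below is a minimal PyTorch-style sketch of the general idea the abstract describes: as communication rounds progress, clients freeze a growing prefix of early layers and upload only the still-trainable layers to the server, shrinking the exchanged payload. The toy model, the freezing schedule, and all function names are illustrative assumptions, not the FedGLF algorithm or the authors' implementation.

# Hypothetical sketch of gradual layer freezing in a FedAvg-style round.
# Model, schedule, and helper names are illustrative, not from the letter.
import torch
import torch.nn as nn

def make_model() -> nn.Sequential:
    # Small stand-in classifier; the paper targets image classification tasks.
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(784, 256), nn.ReLU(),
        nn.Linear(256, 128), nn.ReLU(),
        nn.Linear(128, 10),
    )

def frozen_prefix(round_idx: int, schedule=(10, 20)) -> int:
    # Illustrative schedule: one more leading layer is frozen after each
    # threshold round; the classifier head is never frozen.
    return sum(round_idx >= r for r in schedule)

def local_update(global_state, data_loader, n_frozen):
    model = make_model()
    model.load_state_dict(global_state)
    # Freeze the first `n_frozen` Linear layers (weights and biases).
    linear_names = [n for n, m in model.named_modules() if isinstance(m, nn.Linear)]
    frozen = set(linear_names[:n_frozen])
    for name, param in model.named_parameters():
        param.requires_grad = name.rsplit(".", 1)[0] not in frozen
    opt = torch.optim.SGD([p for p in model.parameters() if p.requires_grad], lr=0.01)
    loss_fn = nn.CrossEntropyLoss()
    for x, y in data_loader:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    # Upload only the layers that were actually trained this round.
    return {name: p.detach().clone()
            for name, p in model.named_parameters() if p.requires_grad}

def aggregate(global_state, client_updates):
    # FedAvg over the layers the clients uploaded; frozen layers keep their
    # current global values and are not communicated at all.
    new_state = {k: v.clone() for k, v in global_state.items()}
    for key in client_updates[0]:
        new_state[key] = torch.stack([u[key] for u in client_updates]).mean(dim=0)
    return new_state

In a full round, the server would likewise broadcast only the unfrozen layers, e.g. n_frozen = frozen_prefix(round_idx); updates = [local_update(global_model.state_dict(), loader, n_frozen) for loader in client_loaders]; global_model.load_state_dict(aggregate(global_model.state_dict(), updates)). How the freezing schedule is chosen and validated is exactly what the letter studies; the schedule above is a placeholder.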
Files in this record:

main.pdf
  Access: open access
  Type: 2. Post-print / Author's Accepted Manuscript
  License: Public - All rights reserved
  Size: 208.08 kB
  Format: Adobe PDF

Communication-Efficient_Federated_Learning_With_Gradual_Layer_Freezing.pdf
  Access: restricted access (request a copy)
  Type: 2a. Post-print, publisher's version / Version of Record
  License: Non-public - Private/restricted access
  Size: 582.43 kB
  Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2972826