Federated Learning (FL) is a privacy-preserving machine learning strategy in which distributed clients share updates of locally trained models with a central server. The server aggregates those updates to refine a global version of the model without accessing the clients' data. Even though the raw data never leave the clients, adversarial attacks on the server side can still extract sensitive information from the transmitted model updates. Homomorphic Encryption (HE) offers a robust solution to this privacy concern: clients send encrypted model updates to the server, and the server performs the aggregation of the received updates without having to decrypt them. Unfortunately, HE incurs substantial computational and communication overhead on both the client and the server side, preventing its adoption in practical, real-life applications. In this work, we introduce a training option that mitigates this problem through Private Tensor Freezing (PTF), a progressive and secure gating scheme by which the number of model tensors involved in the training and synchronization stages gradually shrinks over time, alleviating (i) the pressure of HE encryption/decryption on the client side, (ii) the communication volume to and from the server, and (iii) the computational complexity of the aggregation stage on the server side. Experiments on four image classification benchmarks trained within a state-of-the-art FL framework secured with CKKS encryption reveal the effectiveness of PTF: up to a 37.4% reduction in data volume, 35.5% less compute time on the client side, and 36.4% less compute time on the server side.
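The gating idea described in the abstract can be illustrated with a minimal, self-contained sketch. CKKS encryption is replaced here by identity stand-ins, and the tensor names and freezing schedule are purely illustrative (they are not the paper's actual implementation); the point is only to show how frozen tensors drop out of training, encryption, communication, and server-side aggregation as rounds progress.

```python
# Toy sketch of progressive tensor freezing in a federated round loop.
# encrypt/decrypt are identity stand-ins for CKKS; in the real scheme the
# server averages ciphertexts homomorphically, never seeing plaintexts.
import random

NUM_CLIENTS, NUM_ROUNDS = 4, 6
TENSOR_NAMES = ["conv1.w", "conv2.w", "fc1.w", "fc2.w"]  # illustrative

def encrypt(update):   # stand-in for CKKS encryption of a tensor update
    return update

def decrypt(blob):     # stand-in for CKKS decryption
    return blob

def local_update(global_model, active):
    # Clients train and report only the non-frozen (active) tensors.
    return {name: global_model[name] + random.uniform(-0.1, 0.1)
            for name in active}

global_model = {name: 0.0 for name in TENSOR_NAMES}
active = set(TENSOR_NAMES)  # tensors still trained and synchronized

for rnd in range(NUM_ROUNDS):
    # Each client encrypts only the active tensors: less HE work on the
    # client, smaller payloads on the wire.
    encrypted = [{n: encrypt(u)
                  for n, u in local_update(global_model, active).items()}
                 for _ in range(NUM_CLIENTS)]
    # The server aggregates only the active tensors (averaging here,
    # which CKKS supports directly on encrypted values).
    for name in active:
        agg = sum(c[name] for c in encrypted) / NUM_CLIENTS
        global_model[name] = decrypt(agg)
    # Illustrative schedule: freeze the earliest still-active tensor
    # every other round, progressively shrinking the synchronized set.
    if rnd % 2 == 1 and len(active) > 1:
        active.discard(sorted(active)[0])

print(f"tensors still synchronized after {NUM_ROUNDS} rounds: {sorted(active)}")
```

With this toy schedule, three of the four tensors freeze over six rounds, so later rounds encrypt, transmit, and aggregate a fraction of the original volume, which is the source of the client, communication, and server savings the abstract reports.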
Private Tensor Freezing for an Efficient Federated Learning with Homomorphic Encryption / Peluso, Valentino; Malan, Erich; Calimera, Andrea; Macii, Enrico. - (2024), pp. 308-315. (Paper presented at the International Conference on Computer Design (ICCD) 2024, held in Milan, Italy, 18-20 November 2024) [10.1109/ICCD63220.2024.00054].
Private Tensor Freezing for an Efficient Federated Learning with Homomorphic Encryption
Peluso, Valentino; Malan, Erich; Calimera, Andrea; Macii, Enrico
2024
https://hdl.handle.net/11583/2992844