
CRIME: Input-Dependent Collaborative Inference for Recurrent Neural Networks / Jahier Pagliari, Daniele; Chiaro, Roberta; Macii, Enrico; Poncino, Massimo. - In: IEEE TRANSACTIONS ON COMPUTERS. - ISSN 0018-9340. - Electronic. - 70:10(2021), pp. 1626-1639. [10.1109/TC.2020.3021199]

CRIME: Input-Dependent Collaborative Inference for Recurrent Neural Networks

Jahier Pagliari, Daniele; Chiaro, Roberta; Macii, Enrico; Poncino, Massimo
2021

Abstract

The excellent accuracy of Recurrent Neural Networks (RNNs) for time-series and natural language processing comes at the cost of computational complexity. Therefore, the choice between edge and cloud computing for RNN inference, with the goal of minimizing response time or energy consumption, is not trivial. An edge approach must deal with the aforementioned complexity, while a cloud solution pays large time and energy costs for data transmission. Collaborative inference is a technique that tries to obtain the best of both worlds, by splitting the inference task among a network of collaborating devices. While already investigated for other types of neural networks, collaborative inference for RNNs poses completely new challenges, such as the strong influence of input length on processing time and energy, and remains largely unexplored. In this paper, we introduce a Collaborative RNN Inference Mapping Engine (CRIME), which automatically selects the best inference device for each input. CRIME is flexible with respect to the connection topology among collaborating devices, and adapts to changes in connection statuses and in device loads. With experiments on several RNNs and datasets, we show that CRIME can reduce the execution time (or end-node energy) by more than 25% compared to any single-device approach.
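To illustrate the core idea of input-dependent mapping described in the abstract, the following is a minimal sketch, not the paper's actual algorithm: per-step processing costs and transmission costs are hypothetical numbers, and the linear cost model is an assumption chosen only to show why input length drives the edge-vs-cloud choice for RNNs.

```python
def estimate_cost(device, input_len):
    """Estimate total latency (ms) for an RNN inference of `input_len`
    time-steps on `device`. RNN processing time grows with input length;
    offloading to the cloud adds a transmission cost that also grows
    with the input size. All constants below are illustrative."""
    if device == "edge":
        per_step = 2.0                      # ms per time-step on a slow edge CPU
        transfer = 0.0                      # input data is already local
    else:  # "cloud"
        per_step = 0.1                      # ms per time-step on a fast server
        transfer = 80.0 + 0.5 * input_len   # fixed latency + per-step upload
    return transfer + per_step * input_len

def map_input(input_len):
    """Pick the device with the lowest estimated cost for this input."""
    return min(("edge", "cloud"), key=lambda d: estimate_cost(d, input_len))

# With these numbers, short sequences stay on the edge, while long
# sequences amortize the transmission overhead and go to the cloud.
```

A real mapping engine such as CRIME would additionally track connection status and device load at runtime and refresh its cost estimates accordingly; the sketch only shows the per-input selection step.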
Files in this product:

Edge_cloud_mapping.pdf
Open access
Description: Main article (post-print)
Type: 2. Post-print / Author's Accepted Manuscript
License: PUBLIC - All rights reserved
Size: 1.92 MB
Format: Adobe PDF

Jahier Pagliari et al. - 2020 - CRIME Input-Dependent Collaborative Inference for Recurrent Neural Networks(2).pdf
Not available
Description: Main article (publisher's version)
Type: 2a. Post-print, publisher's version / Version of Record
License: NON-PUBLIC - Private/restricted access
Size: 2.64 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2844792