Ultra-compact binary neural networks for human activity recognition on RISC-V processors / Daghero, F.; Xie, C.; Jahier Pagliari, D.; Burrello, A.; Castellano, M.; Gandolfi, L.; Calimera, A.; Macii, E.; Poncino, M. - ELECTRONIC. - (2021), pp. 3-11. (Paper presented at the 18th ACM International Conference on Computing Frontiers 2021, CF 2021, held as a virtual conference in 2021) [10.1145/3457388.3458656].

Ultra-compact binary neural networks for human activity recognition on RISC-V processors

Daghero F.; Xie C.; Jahier Pagliari D.; Burrello A.; Calimera A.; Macii E.; Poncino M.
2021

Abstract

Human Activity Recognition (HAR) is a relevant inference task in many mobile applications. State-of-the-art HAR at the edge is typically achieved with lightweight machine learning models such as decision trees and Random Forests (RFs), whereas deep learning is less common due to its high computational complexity. In this work, we propose a novel implementation of HAR based on deep neural networks, and specifically on Binary Neural Networks (BNNs), targeting low-power general-purpose processors with a RISC-V instruction set. BNNs yield very small memory footprints and low inference complexity, thanks to the replacement of arithmetic operations with bit-wise ones. However, existing BNN implementations on general-purpose processors impose constraints tailored to complex computer vision tasks, which result in over-parametrized models for simpler problems like HAR. Therefore, we also introduce a new BNN inference library that explicitly targets ultra-compact models. With experiments on a single-core RISC-V processor, we show that BNNs trained on two HAR datasets obtain higher classification accuracy than a state-of-the-art baseline based on RFs. Furthermore, our BNN reaches the same accuracy as an RF with either less memory (up to 91% less) or higher energy efficiency (up to 70% higher), depending on the complexity of the features extracted by the RF.
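The abstract's reference to replacing arithmetic with bit-wise operations corresponds to the standard XNOR/popcount formulation of binarized layers. The C sketch below illustrates that general idea under the usual {-1, +1} bit-packing convention; it is an illustrative assumption, not the inference library presented in the paper, and the function names (bnn_dot32, bnn_neuron) are hypothetical.

```c
/*
 * Minimal sketch (not the authors' library): the core XNOR/popcount idea
 * behind BNN inference on a general-purpose processor. Weights and
 * activations constrained to {-1, +1} are packed as bits (1 -> +1, 0 -> -1),
 * so a dot product reduces to XNOR followed by a population count.
 */
#include <stdint.h>

/* Hypothetical helper: dot product of two 32-bit packed binary vectors.
 * popcount(~(a ^ w)) counts agreeing bit positions; mapping back to the
 * {-1, +1} domain gives 2 * matches - 32. */
static inline int32_t bnn_dot32(uint32_t a, uint32_t w)
{
    uint32_t xnor = ~(a ^ w);                    /* 1 where bits agree      */
    int32_t matches = __builtin_popcount(xnor);  /* number of agreeing bits */
    return 2 * matches - 32;                     /* signed dot product      */
}

/* Accumulate over 'n' packed words, then compare against a threshold to
 * produce the next layer's binary activation (sign function folded with
 * batch normalization, as commonly done in BNNs). */
static inline uint32_t bnn_neuron(const uint32_t *act, const uint32_t *wgt,
                                  int n, int32_t threshold)
{
    int32_t acc = 0;
    for (int i = 0; i < n; ++i)
        acc += bnn_dot32(act[i], wgt[i]);
    return (acc >= threshold) ? 1u : 0u;         /* binarized output bit    */
}
```

On RISC-V, the population count may be provided by a dedicated instruction or emulated in software, depending on the available ISA extensions; the bit-packing also shrinks the weight memory by roughly 32x compared to 32-bit floating-point storage.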
ISBN: 9781450384049
Files in this item:

BNN_CF-post-print.pdf
Open access
Description: Main article (post-print)
Type: 2. Post-print / Author's Accepted Manuscript
License: Public - All rights reserved
Size: 4.57 MB
Format: Adobe PDF

Daghero et al. - 2021 - Ultra-Compact Binary Neural Networks for Human Activity Recognition on RISC-V Processors.pdf
Restricted access
Description: Main article (publisher's version)
Type: 2a. Post-print, publisher's version / Version of Record
License: Non-public - Private/restricted access
Size: 343.16 kB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2909396