Daghero, Francesco; Jahier Pagliari, Daniele; Poncino, Massimo. Two-stage Human Activity Recognition on Microcontrollers with Decision Trees and CNNs. Electronic. (2022), pp. 173-176. Paper presented at the 17th International Conference on PhD Research in Microelectronics and Electronics (PRIME 2022), held in Villasimius, Italy, 2022 [10.1109/PRIME55000.2022.9816745].
Two-stage Human Activity Recognition on Microcontrollers with Decision Trees and CNNs
Daghero, Francesco; Jahier Pagliari, Daniele; Poncino, Massimo
2022
Abstract
Human Activity Recognition (HAR) has become an increasingly popular task for embedded devices such as smartwatches. Most HAR systems for ultra-low-power devices are based on classic Machine Learning (ML) models, whereas Deep Learning (DL), although reaching state-of-the-art accuracy, is less popular due to its high energy consumption, which poses a significant challenge for battery-operated and resource-constrained devices. In this work, we bridge the gap between on-device HAR and DL with a hierarchical architecture composed of a Decision Tree (DT) and a one-dimensional Convolutional Neural Network (1D CNN). The two classifiers operate in a cascaded fashion on two different sub-tasks: the DT classifies only the easiest activities, while the CNN deals with the more complex ones. With experiments on a state-of-the-art dataset, targeting a single-core RISC-V MCU, we show that this approach saves up to 67.7% energy w.r.t. a 'stand-alone' DL architecture at iso-accuracy. Additionally, the two-stage system either introduces a negligible memory overhead (up to 200 B) or even reduces the total memory occupation.
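The abstract describes a cascade in which a cheap first-stage classifier resolves the "easy" windows and a 1D CNN is invoked only for the harder ones. The sketch below illustrates this general idea in Python; the confidence-threshold gate (`CONF_THRESHOLD`), the window features, the class count, and the `TinyCNN` model are illustrative assumptions, not the exact design evaluated in the paper.

```python
# Minimal sketch of a two-stage HAR cascade: a decision tree handles
# confident ("easy") windows, a small 1D CNN handles the rest.
# The gating rule and all hyper-parameters below are assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
import torch
import torch.nn as nn

WINDOW = 128          # samples per window (assumption)
CHANNELS = 3          # e.g. 3-axis accelerometer
N_CLASSES = 6
CONF_THRESHOLD = 0.9  # hypothetical confidence gate for stage 1

def window_features(x):
    # Cheap per-channel statistics used as inputs to the decision tree.
    return np.concatenate([x.mean(axis=-1), x.std(axis=-1)], axis=-1)

class TinyCNN(nn.Module):
    # Small 1D CNN operating on the raw sensor window (stage 2).
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(CHANNELS, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(16, N_CLASSES))
    def forward(self, x):
        return self.net(x)

def cascade_predict(x, tree, cnn):
    """Return a class id; run the CNN only if the DT is not confident."""
    proba = tree.predict_proba(window_features(x)[None, :])[0]
    if proba.max() >= CONF_THRESHOLD:
        return int(proba.argmax())              # easy sample: stop at stage 1
    with torch.no_grad():                       # hard sample: fall back to CNN
        logits = cnn(torch.from_numpy(x[None]).float())
    return int(logits.argmax(dim=-1))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, CHANNELS, WINDOW)).astype(np.float32)
    y = rng.integers(0, N_CLASSES, size=200)
    tree = DecisionTreeClassifier(max_depth=4).fit(
        np.stack([window_features(w) for w in X]), y)
    cnn = TinyCNN().eval()
    print(cascade_predict(X[0], tree, cnn))
```

The energy saving reported in the abstract comes from this early exit: whenever the first stage is confident enough, the (much more expensive) CNN inference is skipped entirely.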
| File | Description | Type | License | Size | Format |
|---|---|---|---|---|---|
| Two-stage_Human_Activity_Recognition_on_Microcontrollers_with_Decision_Trees_and_CNNs.pdf (restricted access) | Main article (publisher's version) | 2a Post-print, publisher's version / Version of Record | Non-public, private/restricted access | 643.35 kB | Adobe PDF |
| Two-stage_Human_Activity_Recognition_on_Microcontrollers_with_Decision_Trees_and_CNNs-POSTPRINT.pdf (open access) | Main article (post-print) | 2. Post-print / Author's Accepted Manuscript | Public, all rights reserved | 367.71 kB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/2971286