Online Classification of Human Gestures Through Event Camera Data Using a 3DCNN / Vico, Livia; Polito, Michele; Duarte, Laura; Pastorelli, Stefano; Gastaldi, Laura; Neto, Pedro. - (2025), pp. 52-57. (Paper presented at the 2025 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), held in Funchal, Portugal, 02-03 April 2025) [10.1109/icarsc65809.2025.10970178].
Online Classification of Human Gestures Through Event Camera Data Using a 3DCNN
Polito, Michele; Pastorelli, Stefano; Gastaldi, Laura
2025
Abstract
In environments where humans and robots interact and collaborate, the robot needs awareness of human actions and intentions to guarantee operational effectiveness and agent safety. In this paper, an online human action recognition system based on event camera data is presented. A 3D Convolutional Neural Network (3DCNN) is built to classify event-based videos of isolated primitive assembly actions (idle, pick, place, screw), and is then applied to classify videos of action sequences. The 3DCNN is integrated into a system capable of acquiring, processing, and classifying real-time event data of assembly tasks. The proposed online system classifies the streamed data every 200 ms without accumulating delay.
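The abstract describes a 3DCNN that classifies short windows of event data into four primitive actions. The following is a minimal sketch, not the authors' implementation: input resolution, frame count per 200 ms window, and layer sizes are assumptions introduced only for illustration.

```python
# Hedged sketch of a small 3DCNN over 200 ms windows of event frames.
# Architecture details (channels, kernel sizes, 10 frames at 128x128) are
# assumptions, not taken from the paper.
import torch
import torch.nn as nn

ACTIONS = ["idle", "pick", "place", "screw"]

class Event3DCNN(nn.Module):
    def __init__(self, num_classes: int = len(ACTIONS)):
        super().__init__()
        # Two 3D convolution blocks over (time, height, width).
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # collapse each clip to one feature vector
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, frames, H, W), e.g. event frames accumulated over 200 ms
        return self.classifier(self.features(x).flatten(1))

# Example: classify one hypothetical 200 ms window of 10 event frames at 128x128.
model = Event3DCNN()
window = torch.zeros(1, 1, 10, 128, 128)
predicted_action = ACTIONS[model(window).argmax(dim=1).item()]
```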
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/2999570