
Upper limbs motion tracking for collaborative robotic applications / Digo, E.; Antonelli, M.; Pastorelli, S.; Gastaldi, L. - 1253:(2021), pp. 391-397. (Paper presented at the 3rd International Conference on Human Interaction and Emerging Technologies: Future Applications, IHIET 2020, held in fra in 2020) [10.1007/978-3-030-55307-4_59].

Upper limbs motion tracking for collaborative robotic applications

Digo E.; Antonelli M.; Pastorelli S.; Gastaldi L.
2021

Abstract

In the context of Industry 4.0, the simultaneous presence of workers and robots in the same workspace requires the development of human motion prediction algorithms for safe and efficient interaction. In this context, the purpose of the present study was to perform sensor fusion by building a collection of spatial and inertial variables describing human upper limb kinematics during typical industrial movements. Spatial and inertial data of ten healthy young subjects performing three pick-and-place gestures at different heights were measured with a stereophotogrammetric system and Inertial Measurement Units, respectively. Elbow and shoulder angles estimated from both instruments according to a multibody approach showed very similar trends. Moreover, two variables of the database were identified as distinctive features able to differentiate among the three pick-and-place gestures.
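As a hedged illustration only, and not the authors' multibody pipeline, the sketch below compares two hypothetical elbow-angle time series, one standing in for the stereophotogrammetric (optical) estimate and one for the IMU-based estimate, using RMSE and Pearson correlation as simple agreement measures. All signal names and values are assumptions made for demonstration.

import numpy as np

# Hypothetical elbow flexion-extension angles (degrees) over one
# pick-and-place gesture, sampled at 100 Hz. In the study these would
# come from the stereophotogrammetric system and from the IMUs via a
# multibody model; here they are synthetic signals for illustration only.
t = np.linspace(0.0, 2.0, 200)                              # 2 s gesture
angle_mocap = 40 + 35 * np.sin(np.pi * t / 2.0)             # reference trend
angle_imu = angle_mocap + np.random.normal(0, 1.5, t.size)  # noisy estimate

# Agreement metrics between the two instruments.
rmse = np.sqrt(np.mean((angle_imu - angle_mocap) ** 2))
pearson_r = np.corrcoef(angle_imu, angle_mocap)[0, 1]

print(f"RMSE: {rmse:.2f} deg, Pearson r: {pearson_r:.3f}")

A high correlation with a small RMSE would correspond to the "very similar trends" reported in the abstract, although the published comparison relies on the multibody angle estimation rather than this simplified check.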
2021
ISBN: 978-3-030-55306-7; 978-3-030-55307-4
Files in this item:

File: Upper limbs motion tracking for collaborative robotic applications.pdf
Availability: not available
Type: 2a Post-print, publisher's version / Version of Record
License: Non-public - Private/restricted access
Size: 2.83 MB
Format: Adobe PDF

File: Upper limbs motion tracking for collaborative robotic applications_accepted.pdf
Availability: Open Access since 07/08/2021
Type: 2. Post-print / Author's Accepted Manuscript
License: Public - All rights reserved
Size: 1.06 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2845983