Terreran, M.; Barcellona, L.; Allegro, D.; Ghidoni, S. (2023). Towards a holistic human perception system for close human-robot collaboration. Vol. 3417, pp. 1-7. Paper presented at the 9th Italian Workshop on Artificial Intelligence and Robotics (AIRO 2022), Udine, Italy, 30 November 2022.

Towards a holistic human perception system for close human-robot collaboration

Terreran, M.; Barcellona, L.; Allegro, D.; Ghidoni, S.
2023

Abstract

In close human-robot collaboration, perception plays a central role in guaranteeing safe and intuitive interaction. In this work, we present an AI-based perception system composed of different modules that understand human activities at multiple levels, namely human pose estimation, body-part segmentation and human action recognition. Pose estimation and body-part segmentation provide important information about the worker's position within the workcell and the volume they occupy, while action and intention recognition indicates what the human is doing and how they are performing a given action. The proposed system is demonstrated in a mockup scenario targeting the collaborative assembly of a wooden-leg table, highlighting the potential of action recognition and body-part segmentation to enable safe and natural close human-robot collaboration.
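As an illustration only, and not part of the published record, the sketch below shows one possible way the three perception modules named in the abstract could be composed into a single pipeline. All class, field and parameter names are hypothetical placeholders, not the authors' implementation; the actual models are assumed to be supplied by the caller.

    from dataclasses import dataclass
    from typing import List, Tuple

    # Hypothetical output types for the three modules described in the
    # abstract; fields are illustrative assumptions, not the paper's API.
    @dataclass
    class PoseEstimate:
        keypoints: List[Tuple[float, float, float]]  # 3D joint positions in the workcell frame

    @dataclass
    class BodyPartMask:
        part_labels: List[str]  # per-region body-part labels (head, arms, torso, ...)

    @dataclass
    class ActionPrediction:
        action: str        # e.g. "pick part", "hold table leg"
        confidence: float  # classifier confidence in [0, 1]

    class HolisticPerceptionPipeline:
        """Illustrative composition of pose estimation, body-part
        segmentation and action recognition into one perception step."""

        def __init__(self, pose_model, segmentation_model, action_model):
            # The three models are injected by the caller; any pretrained
            # pose, segmentation and action networks could fill these roles.
            self.pose_model = pose_model
            self.segmentation_model = segmentation_model
            self.action_model = action_model

        def process(self, rgb_frame) -> dict:
            pose = self.pose_model(rgb_frame)           # where the worker is
            parts = self.segmentation_model(rgb_frame)  # which volume each body part occupies
            action = self.action_model(pose)            # what the worker is doing
            return {"pose": pose, "body_parts": parts, "action": action}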
Files in this record:
File: short1.pdf (open access)
Type: 2a Post-print / Version of Record
License: Creative Commons
Size: 3.84 MB (Adobe PDF)

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2981974