A technical framework for human-like motion generation with autonomous anthropomorphic redundant manipulators / Averta, G.; Caporale, D.; Della Santina, C.; Bicchi, A.; Bianchi, M. - (2020), pp. 3853-3859. (Paper presented at the 2020 IEEE International Conference on Robotics and Automation, ICRA 2020, held in Paris (FRA) in 2020) [10.1109/ICRA40945.2020.9196937].
A technical framework for human-like motion generation with autonomous anthropomorphic redundant manipulators
Averta, G.
2020
Abstract
The need for user safety and technology acceptability has grown considerably with the deployment of cobots that physically interact with humans in industrial settings and in assistive applications. A well-studied approach to meeting these requirements is to ensure human-like robot motions. Classic solutions for anthropomorphic movement generation usually rely on optimization procedures that build upon hypotheses derived from the neuroscientific literature, or capitalize on learning methods. However, these approaches come with limitations, e.g. limited motion variability or the need for high-dimensional datasets. In this work, we present a technique to directly embed the principal motion modes of the human upper limb, computed through functional analysis, in the robot trajectory optimization. We report on the implementation with manipulators with redundant anthropomorphic kinematic architectures, although dissimilar to the human model used for functional mode extraction, via Cartesian impedance control. In our experiments, we show that human trajectories mapped onto a robotic manipulator still exhibit the main characteristics of human-likeness, e.g. low jerk values. We discuss the results with respect to the state of the art and their implications for advanced human-robot interaction in industrial cobotics and human assistance.
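As a rough illustration of the pipeline outlined in the abstract, the sketch below extracts principal motion modes from a set of resampled joint-space trajectories, synthesizes a new trajectory as a weighted combination of modes, and scores its smoothness with a dimensionless jerk measure. This is a minimal sketch under stated assumptions, not the authors' implementation: the function names (`motion_modes`, `synthesize`, `normalized_jerk`), array shapes, number of modes, and toy data are all illustrative, and a discrete PCA on resampled trajectories stands in for the functional analysis used in the paper.

```python
import numpy as np

def motion_modes(trajectories, n_modes=3):
    """Extract principal motion modes from demonstration trajectories.

    trajectories: array of shape (n_demos, n_samples, n_joints),
    with every demo resampled to a common time base.
    Returns the mean trajectory (n_samples, n_joints) and the first
    n_modes modes as an array of shape (n_modes, n_samples, n_joints).
    """
    n_demos, n_samples, n_joints = trajectories.shape
    flat = trajectories.reshape(n_demos, -1)   # one row per demonstration
    mean = flat.mean(axis=0)
    # SVD of the centered data gives the principal directions: a discrete
    # stand-in for a functional PCA on the resampled trajectories.
    _, _, vt = np.linalg.svd(flat - mean, full_matrices=False)
    modes = vt[:n_modes].reshape(n_modes, n_samples, n_joints)
    return mean.reshape(n_samples, n_joints), modes

def synthesize(mean, modes, weights):
    """Compose a trajectory as the mean plus a weighted sum of modes."""
    return mean + np.tensordot(weights, modes, axes=1)

def normalized_jerk(q, dt):
    """Dimensionless jerk score: lower values indicate smoother motion."""
    jerk = np.diff(q, n=3, axis=0) / dt**3          # third finite difference
    duration = dt * (q.shape[0] - 1)
    amplitude = np.linalg.norm(q[-1] - q[0]) + 1e-12
    return np.sqrt(0.5 * duration**5 / amplitude**2 * np.sum(jerk**2) * dt)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 200)
    # Toy "demonstrations": minimum-jerk-like profiles with small variations.
    s = 10 * t**3 - 15 * t**4 + 6 * t**5
    demos = np.stack([np.outer(s + 0.02 * rng.standard_normal(t.size),
                               np.ones(7)) for _ in range(20)])
    mean, modes = motion_modes(demos, n_modes=2)
    q = synthesize(mean, modes, weights=np.array([0.5, -0.2]))
    print("normalized jerk:", normalized_jerk(q, dt=t[1] - t[0]))
```

In this reading, a trajectory optimizer would search over the mode weights (here fixed to arbitrary values) rather than over full joint-space trajectories, and the jerk score is one possible proxy for the human-likeness metric reported in the paper.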
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| A_technical_framework_for_human-like_motion_generation_with_autonomous_anthropomorphic_redundant_manipulators.pdf | Not available (copy on request) | 2a Post-print editorial version / Version of Record | Non-public - Private/restricted access | 3.9 MB | Adobe PDF |
| paper_planning_HL_robots_IC_ICRA20201_1.pdf | Open access | 2. Post-print / Author's Accepted Manuscript | Public - All rights reserved | 4.99 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/2970287