Learning from humans how to grasp: A data-driven architecture for autonomous grasping with anthropomorphic soft hands / Santina, C. D.; Arapi, V.; Averta, G.; Damiani, F.; Fiore, G.; Settimi, A.; Catalano, M. G.; Bacciu, D.; Bicchi, A.; Bianchi, M. - In: IEEE ROBOTICS AND AUTOMATION LETTERS. - ISSN 2377-3766. - 4:2(2019), pp. 1533-1540. [10.1109/LRA.2019.2896485]
Learning from humans how to grasp: A data-driven architecture for autonomous grasping with anthropomorphic soft hands
Averta G.;
2019
Abstract
Soft hands are robotic systems that embed compliant elements in their mechanical design. This enables effective adaptation to objects and to the environment and, ultimately, an increase in grasping performance. When operated by a human, these hands offer clear advantages in ease of use and robustness over classic rigid hands. However, their potential for autonomous grasping remains largely unexplored, due to the lack of suitable control strategies. To address this issue, in this letter we propose an approach that enables soft hands to grasp objects autonomously, starting from observations of human strategies. A classifier realized as a deep neural network takes as input visual information about the object to be grasped and predicts which action a human would perform to achieve the goal. This information is then used to select one of a set of human-inspired primitives, which define the evolution of the soft hand posture as a combination of an anticipatory action and a touch-based reactive grasp. The architecture is completed by the hardware component, which consists of an RGB camera observing the scene, a 7-DoF manipulator, and a soft hand. The latter is equipped with inertial measurement units at the fingernails for detecting contact with the object. We extensively tested the proposed architecture with 20 objects, achieving a success rate of 81.1% over 111 grasps.
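For readers who want a concrete picture of the pipeline the abstract describes, the following Python sketch mirrors its structure: a vision-based classifier selects a human-inspired primitive, which then unfolds as an anticipatory posture followed by a touch-triggered reactive closure driven by fingertip IMU signals. This is a minimal illustration only; all class, function, and parameter names (StrategyClassifier, Primitive, contact_from_imus, the IMU threshold, the hand/arm interfaces) are hypothetical assumptions and do not correspond to the authors' actual implementation.

```python
# Minimal sketch of the grasp-selection pipeline described in the abstract.
# All names below are hypothetical placeholders, not the authors' code.

import numpy as np


class StrategyClassifier:
    """Stand-in for the deep network mapping an RGB view of the object
    to the human grasp strategy most likely used for it."""

    def __init__(self, num_strategies: int):
        self.num_strategies = num_strategies

    def predict(self, rgb_image: np.ndarray) -> int:
        # A real implementation would run a trained CNN here;
        # this placeholder simply returns a fixed class index.
        return 0


class Primitive:
    """Human-inspired primitive: an anticipatory posture followed by a
    touch-triggered reactive closure that exploits the hand's compliance."""

    def __init__(self, name: str, pregrasp_posture: np.ndarray):
        self.name = name
        self.pregrasp_posture = pregrasp_posture

    def execute(self, hand, arm, contact_detected) -> bool:
        arm.move_to(self.pregrasp_posture)   # anticipatory phase
        hand.close_slowly()                  # start reactive phase
        while not contact_detected():        # wait for a fingertip IMU event
            pass
        hand.squeeze()                       # let compliance shape the grasp
        return arm.lift_and_check()          # report grasp success


def contact_from_imus(read_fingertip_imus, threshold: float = 2.0) -> bool:
    """Declare contact when any fingertip IMU reports an acceleration
    spike above a (hypothetical) threshold, in m/s^2."""
    accelerations = read_fingertip_imus()    # e.g. one 3-vector per finger
    return any(np.linalg.norm(a) > threshold for a in accelerations)


def grasp(rgb_image, primitives, classifier, hand, arm, read_fingertip_imus):
    """Top-level pipeline: vision -> predicted strategy -> primitive execution."""
    strategy = classifier.predict(rgb_image)
    primitive = primitives[strategy]
    return primitive.execute(
        hand, arm, lambda: contact_from_imus(read_fingertip_imus)
    )
```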
Files

File: Learning_From_Humans_How_to_Grasp_A_Data-Driven_Architecture_for_Autonomous_Grasping_With_Anthropomorphic_Soft_Hands.pdf
Access: Restricted access
Type: 2a Post-print editorial version / Version of Record
License: Non-public - Private/restricted access
Size: 4.12 MB
Format: Adobe PDF

File: Della_Santina_et_al_20.pdf
Access: Open access
Type: 2. Post-print / Author's Accepted Manuscript
License: Public - All rights reserved
Size: 8.88 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/2970293