Maiettini, E.; Maracani, A.; Camoriano, R.; Pasquale, G.; Tikhanoff, V.; Rosasco, L.; Natale, L. (2022). From Handheld to Unconstrained Object Detection: a Weakly-supervised On-line Learning Approach. In: 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, August 29 - September 2, 2022, pp. 942-949. DOI: 10.1109/RO-MAN53752.2022.9900780.
From Handheld to Unconstrained Object Detection: a Weakly-supervised On-line Learning Approach
Camoriano R;
2022
Abstract
Deep Learning (DL)-based methods for object detection achieve remarkable performance at the cost of computationally expensive training and extensive data labeling. A robot's embodiment can be exploited to mitigate this burden by acquiring automatically annotated training data through a natural interaction with a human who shows the object of interest hand-held. However, learning solely from this data may introduce biases (the so-called domain shift) and prevent adaptation to novel tasks. While Weakly-supervised Learning (WSL) offers a well-established set of techniques to cope with these problems in general-purpose Computer Vision, its adoption in challenging robotic domains is still at a preliminary stage. In this work, we target the scenario of a robot trained in a teacher-learner setting to detect handheld objects. The aim is to improve detection performance in different settings by letting the robot explore the environment with a limited human labeling budget. We compare several WSL techniques for detection pipelines that reduce model re-training costs without compromising accuracy, and propose solutions tailored to the considered robotic scenario. We show that the robot can improve adaptation to novel domains, either by interacting with a human teacher (Active Learning) or through autonomous supervision (Semi-supervised Learning). We integrate our strategies into an on-line detection method, enabling efficient model updates with few labels. We experimentally benchmark our method on challenging robotic object detection tasks under domain shift.
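To illustrate the general idea summarized in the abstract, the following minimal Python sketch shows how Active and Semi-supervised Learning could be combined to refine a detector under a limited human labeling budget: confident predictions are kept as pseudo-labels, ambiguous ones are queried to the human teacher, and the rest are discarded. This is not the authors' implementation; the `Detection` structure, the confidence thresholds, and the budget handling are illustrative assumptions.

```python
# Hypothetical sketch: routing detector predictions to Semi-supervised
# pseudo-labeling or Active Learning queries, under a labeling budget.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Detection:
    image_id: str
    box: Tuple[float, float, float, float]  # (x1, y1, x2, y2)
    label: str
    score: float  # detector confidence in [0, 1]


def split_for_weak_supervision(
    detections: List[Detection],
    human_budget: int,
    high_conf: float = 0.8,
    low_conf: float = 0.3,
):
    """Route each detection to one of three pools:
    - pseudo_labels: confident predictions reused as-is (Semi-supervised Learning),
    - to_annotate: uncertain predictions sent to the human teacher (Active Learning),
      capped by the available labeling budget,
    - discarded: low-confidence predictions treated as likely background.
    """
    pseudo_labels, candidates, discarded = [], [], []
    for det in detections:
        if det.score >= high_conf:
            pseudo_labels.append(det)
        elif det.score >= low_conf:
            candidates.append(det)
        else:
            discarded.append(det)

    # Query the most uncertain candidates first (scores closest to 0.5),
    # up to the human labeling budget.
    candidates.sort(key=lambda d: abs(d.score - 0.5))
    to_annotate = candidates[:human_budget]
    return pseudo_labels, to_annotate, discarded
```

In such a scheme, the pseudo-labels and the human-provided annotations would then be merged into the training set used to refresh the on-line detector after each exploration round.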
| File | Description | Type | License | Size | Format |
|---|---|---|---|---|---|
| From_Handheld_to_Unconstrained_Object_Detection_a_Weakly-supervised_On-line_Learning_Approach.pdf | Postprint | Post-print publisher's version / Version of Record | Not public - Private/restricted access | 1.7 MB | Adobe PDF |
https://hdl.handle.net/11583/2982134