
N-ROD: a Neuromorphic Dataset for Synthetic-to-Real Domain Adaptation / Cannici, Marco; Plizzari, Chiara; Planamente, Mirco; Ciccone, Marco; Bottino, Andrea; Caputo, Barbara; Matteucci, Matteo. - ELECTRONIC. - (2021), pp. 1342-1347. (Paper presented at the CVPR 2021 Workshop on Event-based Vision, held as a virtual conference on 19-25 June 2021) [10.1109/CVPRW53098.2021.00148].

N-ROD: a Neuromorphic Dataset for Synthetic-to-Real Domain Adaptation

Chiara Plizzari; Mirco Planamente; Marco Ciccone; Andrea Bottino; Barbara Caputo
2021

Abstract

Event cameras are novel neuromorphic sensors that asynchronously capture pixel-level intensity changes in the form of "events". Event simulation from existing RGB datasets is commonly used to overcome the need for large amounts of annotated data, which are scarce due to the novelty of event sensors. In this context, the possibility of using event simulation in synthetic scenarios, where data generation is not limited to pre-existing datasets, is to date still unexplored. In this work, we analyze the synth-to-real domain shift in event data, i.e., the gap arising between simulated events obtained from synthetic renderings and those captured with a real camera on real images. For this purpose, we extend the popular RGB-D Object Dataset (ROD), which already comes with a synthetic version (SynROD), to the event modality. The resulting extended dataset is the first to enable a synth-to-real analysis on event data. On the proposed Neuromorphic ROD dataset (N-ROD), we show how Domain Adaptation techniques can be used to reduce the synth-to-real gap. Moreover, through extensive experiments on multi-modal RGB-E data, we show that events can be effectively combined with conventional visual information, encouraging further research in this area.
Files in this record:

File: N-ROD_a_Neuromorphic_Dataset_for_Synthetic-to-Real_Domain_Adaptation.pdf (not available; a copy can be requested)
Type: 2a Post-print editorial version / Version of Record
License: Non-public - private/restricted access
Size: 7.57 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2894079