
E2(GO)MOTION: Motion Augmented Event Stream for Egocentric Action Recognition / Plizzari, Chiara; Planamente, Mirco; Goletto, Gabriele; Cannici, Marco; Gusso, Emanuele; Matteucci, Matteo; Caputo, Barbara. - (2022), pp. 19903-19915. (Paper presented at the IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2022, held in New Orleans, LA (USA), 18-24 June 2022) [10.1109/CVPR52688.2022.01931].

E2(GO)MOTION: Motion Augmented Event Stream for Egocentric Action Recognition

Plizzari, Chiara; Planamente, Mirco; Goletto, Gabriele; Cannici, Marco; Gusso, Emanuele; Matteucci, Matteo; Caputo, Barbara
2022

Abstract

Event cameras are novel bio-inspired sensors that asynchronously capture pixel-level intensity changes in the form of "events". Due to their sensing mechanism, event cameras exhibit little to no motion blur, have a very high temporal resolution, and require significantly less power and memory than traditional frame-based cameras. These characteristics make them a perfect fit for several real-world applications, such as egocentric action recognition on wearable devices, where fast camera motion and limited power challenge traditional vision sensors. However, the ever-growing field of event-based vision has, to date, overlooked the potential of event cameras in such applications. In this paper, we show that event data is a very valuable modality for egocentric action recognition. To do so, we introduce N-EPIC-Kitchens, the first event-camera extension of the large-scale EPIC-Kitchens dataset. In this context, we propose two strategies: (i) directly processing event-camera data with traditional video-processing architectures (E2(GO)) and (ii) using event data to distill optical flow information (E2(GO)MO). On our proposed benchmark, we show that event data provides performance comparable to RGB and optical flow, yet without any additional flow computation at deployment time, and an improvement of up to 4% over RGB-only information. The N-EPIC-Kitchens dataset is available at https://github.com/EgocentricVision/N-EPIC-Kitchens.
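Strategy (i) requires packing the asynchronous event stream into a dense, frame-like tensor that a standard video architecture can ingest. The abstract does not specify the exact representation used; below is a minimal sketch of one common choice, a voxel grid that bins polarity-signed events (x, y, t, p) into temporal slices. The function name and layout are illustrative assumptions, not the paper's implementation (which is linked from the dataset repository).

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate an event stream into a voxel grid of temporal bins.

    events: (N, 4) array of rows (x, y, t, p), with polarity p in {-1, +1}.
    Returns a (num_bins, height, width) float tensor, analogous to a short
    clip of frames that a video-processing network can consume.
    """
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    t = events[:, 2]
    p = events[:, 3].astype(np.float32)
    # Normalize timestamps to [0, num_bins - 1] and assign each event a bin.
    span = max(t.max() - t.min(), 1e-9)
    b = ((t - t.min()) / span * (num_bins - 1)).astype(int)
    # Signed accumulation: ON/OFF events add/subtract at their pixel and bin.
    np.add.at(voxel, (b, y, x), p)
    return voxel
```

Each temporal bin then plays the role of a channel (or frame) in the downstream network's input, so pretrained RGB video backbones can be reused with minimal changes.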
Files in this record:

CVPR22_EVEGO_camera_ready.pdf

Open access

Type: 2. Post-print / Author's Accepted Manuscript
License: Public - All rights reserved
Size: 5.18 MB
Format: Adobe PDF
E2GOMOTION_Motion_Augmented_Event_Stream_for_Egocentric_Action_Recognition.pdf

Restricted access

Type: 2a. Post-print editorial version / Version of Record
License: Non-public - Private/restricted access
Size: 1.22 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2970228