Visual Navigation Using Sparse Optical Flow and Time-to-Transit / Boretti, Chiara; Bich, Philippe; Zhang, Yanyu; Baillieul, John. - ELECTRONIC. - (2022), pp. 9397-9403. (Paper presented at the 2022 International Conference on Robotics and Automation (ICRA), held in Philadelphia, PA, USA, 23-27 May 2022) [10.1109/ICRA46639.2022.9812032].

Visual Navigation Using Sparse Optical Flow and Time-to-Transit

Boretti, Chiara; Bich, Philippe;
2022

Abstract

Drawing inspiration from biology, we describe the way in which visual sensing with a monocular camera can provide a reliable signal for navigation of mobile robots. The work takes inspiration from the classic paper that described a behavioral strategy pursued by diving seabirds based on a visual cue called time-to-contact. A closely related concept, time-to-transit (tau), is defined, and it is shown that steering laws based on monocular camera perceptions of tau can reliably steer a mobile vehicle. The contribution of the paper is two-fold: it provides a simple theory of robust vision-based steering control, and it shows how the theory guides the implementation of robust visual navigation using ROS-Gazebo simulations as well as deployment and experiments with a camera-equipped Jackal robot. As will be noted, there is an extensive literature on how animals use optical flow to guide their movements. The novelty of the work below is the introduction of the concepts of Eulerian optical flow and time-to-transit (tau), and the demonstration that control laws based on the tau values associated with an aggregated set of features in the field of view can be used to reliably steer a laboratory robot.
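To make the idea in the abstract concrete, the following minimal Python sketch shows one way time-to-transit could be computed from sparse optical-flow measurements and combined into a simple left/right balancing steering command. This is a hypothetical illustration, not the authors' implementation: the function names, the gain, the sign convention, and the use of plain averages over the two image halves are illustrative assumptions.

```python
import numpy as np

def time_to_transit(x, x_dot, eps=1e-6):
    """Time-to-transit tau for a tracked image feature.

    x     : horizontal image coordinate relative to the optical axis
            (pixels; negative = left half of the image, positive = right).
    x_dot : horizontal image velocity of the feature (pixels/s), e.g.
            estimated by sparse optical flow between consecutive frames.
    Returns tau = x / x_dot, an estimate of the time until the feature
    transits the plane through the camera center perpendicular to the
    direction of travel.
    """
    x_dot = np.where(np.abs(x_dot) < eps, np.copysign(eps, x_dot), x_dot)
    return x / x_dot

def balance_steering(xs, x_dots, gain=0.5):
    """Hypothetical tau-balancing steering rule (illustrative only).

    Averages tau over the features in the left and right image halves
    and turns toward the side with the larger average tau, i.e. away
    from the side whose features will transit sooner (and are therefore
    likely closer). Returns an angular-velocity command with the ROS
    convention that positive values turn the robot to the left.
    """
    xs = np.asarray(xs, dtype=float)
    taus = time_to_transit(xs, np.asarray(x_dots, dtype=float))
    left, right = taus[xs < 0], taus[xs >= 0]
    if left.size == 0 or right.size == 0:
        return 0.0  # not enough features on one side to balance
    return gain * (left.mean() - right.mean())

# Toy example: the left-side features transit sooner (smaller tau),
# so the command is negative, steering the robot to the right.
xs = [-120.0, -80.0, 90.0, 140.0]      # pixels from the image center
x_dots = [-60.0, -40.0, 15.0, 20.0]    # pixels per second
print(balance_steering(xs, x_dots))    # -> -2.25
```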
ISBN: 978-1-7281-9681-7
Files in this record:
Visual_Navigation_Using_Sparse_Optical_Flow_and_Time-to-Transit.pdf

Not available

Description: Published version
Type: 2a Post-print editorial version / Version of Record
License: Non-public - Private/restricted access
Size: 3.93 MB
Format: Adobe PDF
Visual_Navigation_Using_Sparse_Optical_Flow_and_Time-to-Transit_preprint.pdf

Open access

Description: Post-print
Type: 2. Post-print / Author's Accepted Manuscript
License: PUBLIC - All rights reserved
Size: 3.89 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2970216