Snap2cad: 3D indoor environment reconstruction for AR/VR applications using a smartphone device / Manni, Alessandro; Oriti, Damiano; Sanna, Andrea; De Pace, Francesco; Manuri, Federico. - In: COMPUTERS & GRAPHICS. - ISSN 0097-8493. - STAMPA. - 100:(2021), pp. 116-124. [10.1016/j.cag.2021.07.014]

Snap2cad: 3D indoor environment reconstruction for AR/VR applications using a smartphone device

Manni, Alessandro; Oriti, Damiano; Sanna, Andrea; De Pace, Francesco; Manuri, Federico
2021

Abstract

Indoor environment reconstruction is a challenging task in Computer Vision and Computer Graphics, especially when Extended Reality (XR) technologies are considered. Current solutions that employ dedicated depth sensors require scanning of the environment and tend to suffer from low resolution and noise, whereas solutions that rely on a single photo of a scene cannot predict the actual position and scale of objects due to scale ambiguity. The proposed system addresses these limitations by allowing the user to capture single views of objects using an Android smartphone equipped with a single RGB camera and supported by Google ARCore. The system includes 1) an Android app that tracks the smartphone's position relative to the world, captures a single RGB image for each object, and estimates depth information of the scene, and 2) a program running on a server that classifies the framed objects, retrieves the corresponding 3D models from a database, and estimates their position, vertical rotation, and scale factor without deforming the shape. The system has been assessed by measuring the translational, rotational, and scaling errors of the considered objects with respect to the physical ones acting as a ground truth. The main outcomes show that the proposed solution obtains a maximum error of 18% for the scaling factor, less than nine centimeters for the position, and less than 18° for the rotation. These results suggest that the proposed system can be employed for XR applications, thus bridging the gap between the real and virtual worlds.
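The evaluation described above compares each placed 3D model against its physical counterpart via translational, rotational (about the vertical axis), and scaling errors. A minimal sketch of such a comparison is given below; the function name, argument conventions, and exact formulas are illustrative assumptions, not the paper's actual implementation:

```python
import math

def pose_errors(est_pos, gt_pos, est_yaw_deg, gt_yaw_deg, est_scale, gt_scale):
    """Compare an estimated object placement against the physical ground truth.

    Positions are (x, y, z) tuples in meters, yaw is the rotation about the
    vertical axis in degrees, and scale is a uniform scale factor.
    """
    # Translational error: Euclidean distance between the two positions.
    t_err = math.dist(est_pos, gt_pos)
    # Rotational error: smallest absolute angle between the two yaws,
    # wrapped into (-180, 180] so that 350° vs 10° counts as 20°, not 340°.
    r_err = abs((est_yaw_deg - gt_yaw_deg + 180.0) % 360.0 - 180.0)
    # Scaling error: relative deviation from the ground-truth scale factor.
    s_err = abs(est_scale - gt_scale) / gt_scale
    return t_err, r_err, s_err

# Example: an estimate within the paper's reported error bounds
# (< 9 cm position, < 18° rotation, at most 18% scale).
t, r, s = pose_errors((1.05, 0.0, 2.0), (1.0, 0.0, 2.0), 170.0, 175.0, 1.1, 1.0)
```

Wrapping the yaw difference before taking the absolute value matters because vertical rotation is periodic; without it, near-identical orientations on either side of 0°/360° would register as large errors.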
Files in this item:

Snap2cad.pdf (not available)
Description: Main article
Type: 2a Post-print editorial version / Version of Record
License: Not public - private/restricted access
Size: 2.28 MB
Format: Adobe PDF
author_post_print_.pdf (Open Access since 25/07/2023)
Description: Main article (author's post-print)
Type: 2. Post-print / Author's Accepted Manuscript
License: Creative Commons
Size: 25.63 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11583/2915084